Online platforms, such as Instagram and X, influence the content displayed and shared by users through their services in various ways, including by making content invisible or removing it. This practice, known as "content moderation" and defined in Article 3(t) DSA, is a notable and often controversial part of platform management. Content moderation has profound implications, especially for freedom of expression, and represents a battleground where platform services, governments and users all seek control over digital infrastructure.
The Digital Services Act ("DSA") plays a crucial role in this regard. Part of the broader Digital Services Act Package, the DSA is designed to strengthen the online marketplace and set clear rules for intermediary services such as video platforms, social media and online marketplaces. In part I of this blog series, we discussed the basic rules of the DSA and how they contribute to a more transparent and secure digital ecosystem. Then, in part II explored transparency obligations and the role of algorithms within online platforms. This third and final part of our blog series on the DSA covers content moderation, transparency and accountability obligations.
Measures against harmful content and their enforcement
Online platforms (exclusively) display content shared by their users, also known as user-generated content.
On platforms such as Instagram and TikTok, unwanted or illegal content is regularly posted. Illegal content includes, for example, hate speech, terrorist propaganda and infringements of intellectual property rights. Unwanted content can include spam, clickbait, hate speech, disinformation and shocking or violent images.
Content moderation is the enforcement of applicable rules by intermediary services, especially online platforms, regarding the information shared by their users. 'Enforcement' includes both the detection or classification of harmful or illegal content, as well as the application of measures. Possible measures include:
its removal,
a shadow ban, making content less visible,
down ranking, causing content to be ranked lower,
demotion,
demonetization and
account blocking, as referred to in Article 3(t) DSA.
Other forms of content moderation include (i) password-protecting information, (ii) blocking a website or IP address, (iii) disabling the ability to share or respond to content, and (iv) providing (name and address) data.
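Purely by way of illustration, this range of measures could be modelled in code as a simple enumeration. The labels below are our own shorthand, not terms taken from the DSA:

```python
from enum import Enum, auto

class ModerationMeasure(Enum):
    """Illustrative set of moderation measures; labels are ours, not the DSA's."""
    REMOVAL = auto()             # taking the content offline entirely
    SHADOW_BAN = auto()          # making the content less visible
    DOWNRANKING = auto()         # ranking the content lower in feeds or search results
    DEMOTION = auto()            # limiting recommendation or further distribution
    DEMONETIZATION = auto()      # cutting off advertising or other revenue
    ACCOUNT_BLOCKING = auto()    # blocking or suspending the posting account
    ACCESS_RESTRICTION = auto()  # password-protecting content or blocking a website/IP address
    INTERACTION_LIMIT = auto()   # disabling sharing of or responding to the content
    DATA_DISCLOSURE = auto()     # providing (name and address) data to a requesting party
```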
The DSA sets out various measures for acting against harmful or illegal content. Enforcement is both proactive and reactive: on the one hand, unwanted content can be reported by users, and on the other hand, intermediary services can act against certain content of their own accord. Hosting services are required to handle user reports of illegal content carefully through so-called notice-and-action mechanisms, as prescribed in Article 16 DSA. Once a hosting service proceeds to moderate reported content, that service must provide reasons for its decision to do so, in accordance with Article 17 DSA. Hosting services are also subject to an obligation to notify the relevant authorities in the case of (suspected) serious crimes, as described in Article 18 DSA.
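To illustrate what a notice under the notice-and-action mechanism involves, the sketch below models the elements that Article 16(2) DSA requires a notice to contain. The class and field names are our own; the DSA does not prescribe any particular technical format:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Notice:
    """Illustrative model of a notice under Article 16(2) DSA; field names are ours."""
    explanation: str              # substantiated explanation of why the content is considered illegal
    content_urls: list[str]       # exact electronic location(s) of the content, e.g. URLs
    notifier_name: Optional[str]  # name of the notifier (may be omitted for certain offences)
    notifier_email: Optional[str] # email address of the notifier
    good_faith_statement: bool    # confirmation that the notice is submitted in good faith
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def acknowledge(notice: Notice) -> str:
    """Confirm receipt to the notifier, as Article 16(4) DSA requires when contact details are given."""
    if notice.notifier_email:
        return f"Receipt of your notice concerning {len(notice.content_urls)} item(s) is confirmed."
    return "Notice registered without notifier contact details."
```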
Additional obligations apply to platforms, as platforms not only store content but also distribute it to the public. Platforms must provide an internal complaint-handling system through which users can complain about content moderation decisions, as stipulated in Article 20 DSA, and they must cooperate in out-of-court dispute resolution, as required by Article 21 DSA. Platforms must also treat reports from designated trusted flaggers with priority, as required by Article 22 DSA. In serious cases, such as frequent misuse, a user may be temporarily suspended, subject to due process guarantees, as outlined in Article 23 DSA.
The largest platforms and search engines ("VLOPs" and "VLOSEs") are subject to a heavier duty of care: they must conduct risk assessments in the context of systemic risks, as stipulated in Article 34 DSA, and take corresponding mitigation measures, as required by Article 35 DSA. Systemic risks are risks that have the potential to harm society or the economy as a whole, such as the dissemination of illegal content and negative effects on fundamental rights, democratic processes, public health and minors.
Transparency obligations related to content moderation
Under the DSA, platforms are required to make their content moderation processes more transparent. This means that platforms such as Meta and YouTube must not only state what content they moderate, but also justify their decisions. Because banning content posted online for inaccuracy or deception is difficult to reconcile with freedom of expression, the DSA imposes strict transparency requirements on how platforms moderate and steer content.
As discussed above, under the DSA, content can be addressed or removed following a notification or order from a competent authority, a complaint from a user (notice-and-action) or through proactive action by the intermediary service itself. Upon an order from a competent authority to act against illegal content posted on a platform, intermediary services are obliged to promptly inform the authority of the action taken, as stipulated in Article 9 DSA. This duty to notify also applies to orders to provide information about specific users, as described in Article 10 DSA. To streamline communication with authorities, intermediary services must designate a central point of contact, as required by Article 11 DSA. Intermediary services established outside the European Union ("EU") are obliged to appoint a legal representative within the EU, as specified in Article 13 DSA.
The DSA obliges platforms not only to respond to incidents involving illegal or abusive content, but also to take preventive measures. For example, intermediary services must provide clear information in their terms and conditions about their content moderation policies, procedures and tools, including whether algorithmic decision-making or human review is used, as required by Article 14 DSA.
Once intermediary services take action against illegal or harmful content, users have the right to transparent communication about it and the opportunity to challenge those actions. Moreover, intermediary services must report annually on (i) how many orders, notices and complaints have been received, (ii) how many accounts have been suspended or deleted, and (iii) any use of automated means, as stipulated in Article 15 DSA. Online platforms must also submit their statements of reasons, explaining the grounds for specific content moderation decisions, to the DSA Transparency Database.
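Article 17(3) DSA lists what such a statement of reasons must contain. Purely as an illustration of what a corresponding record might look like in practice, the sketch below models one; the field names and the serialization format are our own assumptions and do not reflect the official Transparency Database schema:

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json

@dataclass
class StatementOfReasons:
    """Illustrative record loosely mirroring Article 17(3) DSA; not the official schema."""
    restriction: str                   # e.g. "removal", "visibility restriction", "account suspension"
    facts_and_circumstances: str       # the facts relied on in taking the decision
    automated_detection: bool          # whether the content was detected by automated means
    automated_decision: bool           # whether the decision itself was taken by automated means
    legal_ground: Optional[str]        # the law allegedly infringed, if the content is treated as illegal
    contractual_ground: Optional[str]  # the terms-and-conditions clause relied on, if applicable
    redress_options: list[str]         # e.g. internal complaint handling, out-of-court dispute settlement, court

def to_submission(sor: StatementOfReasons) -> str:
    """Serialize the record for submission; the JSON format here is hypothetical."""
    return json.dumps(asdict(sor))

# Example: a removal decided by a human reviewer after automated detection
example = StatementOfReasons(
    restriction="removal",
    facts_and_circumstances="Video flagged as containing copyright-infringing material.",
    automated_detection=True,
    automated_decision=False,
    legal_ground="Copyright infringement",
    contractual_ground=None,
    redress_options=["internal complaint handling", "out-of-court dispute settlement", "court"],
)
print(to_submission(example))
```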
Finally, in the context of transparency, intermediary services should clearly communicate how, and which, AI systems are used to prevent or address harmful or illegal content.
Liability
The premise of content moderation is policy freedom for platforms. That freedom is constrained by fundamental rights, which play either a mandatory role (think of liability) or a restrictive role (think of duties of care). In the context of liability, the regime originally laid down in Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market continues to apply, now carried over into the DSA: providers of mere conduit, caching and hosting services are not held liable for the information they transmit or store on behalf of their users, provided certain conditions are met. Those conditions depend on the type of service. For example, the conditions for mere conduit services are limited, while hosting service providers must take immediate action against illegal content as soon as they become aware of its presence.
Limiting the liability risks of intermediary services prevents the precautionary removal of content in cases of doubt. At the same time, intermediary services are not unfairly burdened, while victims of harmful or illegal content retain their ability to recover damages.
As discussed above, the DSA leaves room for intermediary services to moderate content proactively. They need not wait for notifications, but can deploy algorithmic systems to automatically detect and remove, for example, deepfakes before they spread further. The "good samaritan" clause protects intermediary services that voluntarily and on their own initiative moderate harmful or illegal content from losing their liability exemptions, as long as they do so carefully and in good faith, as described in Article 7 DSA. This prevents intermediary services from being discouraged from acting against such content.
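As a rough sketch of how such voluntary, own-initiative moderation might be organized carefully, the example below routes automated findings by confidence: only high-confidence hits lead to a measure (always accompanied by a statement of reasons), while borderline cases go to human review. The classifier, thresholds and labels are hypothetical and are not prescribed by the DSA:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    content_id: str
    label: str    # e.g. "deepfake", "hate_speech"
    score: float  # classifier confidence between 0 and 1

def classify(content_id: str, payload: bytes) -> Finding:
    """Placeholder for an automated classifier; a real system would call a trained model here."""
    return Finding(content_id=content_id, label="deepfake", score=0.97)

def route(finding: Finding, remove_threshold: float = 0.95, review_threshold: float = 0.7) -> str:
    """Route automated findings carefully: act only on high-confidence hits (and still issue a
    statement of reasons); send borderline cases to human review; otherwise take no action."""
    if finding.score >= remove_threshold:
        return "remove_and_issue_statement_of_reasons"
    if finding.score >= review_threshold:
        return "queue_for_human_review"
    return "no_action"

if __name__ == "__main__":
    finding = classify("post-123", b"...")
    print(route(finding))  # -> remove_and_issue_statement_of_reasons
```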
The DSA introduces a new regime for a specific subcategory of hosting service providers, namely online platforms that allow merchants to enter into distance contracts with consumers. These are B2C (business-to-consumer) online marketplaces, such as Amazon or Bol.com. If such providers give consumers the impression that the information, product or service offered comes from the platform itself, and thus not from the merchant in question, they cannot rely on the hosting liability exemption with respect to liability under consumer protection law. Whether such an impression exists is assessed against the yardstick of the "average consumer".
Case law makes clear that obligations requiring providers to scan all data traffic, indefinitely and at their own expense, in order to find and block illegal content that has not been specifically identified or does not yet exist, cannot be condoned. This was underlined in CJEU 24 November 2011, C-70/10, ECLI:EU:C:2011:771 (Scarlet Extended/SABAM), in which the Court ruled that such general and untargeted obligations are contrary to European law; the same prohibition was confirmed for hosting services in CJEU 16 February 2012, C-360/10, ECLI:EU:C:2012:85 (SABAM/Netlog). On the other hand, an obligation to detect and remove specific information that a court has classified as illegal content (and content equivalent to it) is permissible, as confirmed in CJEU 3 October 2019, C-18/18, ECLI:EU:C:2019:821 (Glawischnig-Piesczek/Facebook Ireland). In between, however, a considerable grey area remains, in which it is sometimes difficult to draw a clear distinction between prohibited "general" and permissible "specific" monitoring obligations.
The DSA imposes obligations on intermediary services, particularly platforms, regarding how illegal content is handled. 'Illegal content' includes infringements of intellectual property rights, for example copyright infringements (the unauthorized use of copyrighted material) and trademark infringements. Content moderation on platforms often takes place in the context of copyright, for instance where music or films are uploaded without permission. Counterfeit products are also frequently taken offline. The starting point is that, in these cases, intermediary services are usually not liable for the "illegal content" posted or offered by their users, as described in Recital 12 DSA. However, from the moment they have deliberate control over, or knowledge of, infringing content, liability may be just around the corner, as confirmed in CJEU 22 June 2021, C-682/18 and C-683/18, ECLI:EU:C:2021:503 (YouTube/Cyando).
Under the DSA, it will be easier for rights holders to enforce their intellectual property rights. Indeed, hosting services and online platforms must provide an easy notification mechanism by which individuals or entities can notify them of illegal content on their services. Upon receipt of such a notification, intermediary services are deemed to have actual knowledge of the existence of the specific content reported. If providers are notified of an infringement, they must carefully and objectively investigate it and take appropriate action.
In practice, providers of illegal content often do not operate under their real names and provide no, or false, address details. Moreover, they are sometimes located in countries where intellectual property rights are difficult to enforce. Through notification mechanisms, rights holders can more easily have infringing content taken offline. In addition, the DSA requires online marketplaces to make efforts to verify the reliability and completeness of the data of the traders using their services. As a result, in the event of an infringement, the rights holder will generally also be better able to address the infringer directly.
The DSA provides a comprehensive framework for regulating intermediary services, with a focus on content moderation, accountability and transparency. It imposes obligations on intermediary services to act (pre-emptively) against harmful or illegal content. Through stricter regulation and transparency requirements, the DSA aims to reduce the negative consequences of such content posted online and to better protect users.