PONT Data&Privacy

The Digital Services Act: ground rules for a more transparent and secure digital ecosystem

The digital world has changed dramatically in recent decades, while legislation regulating the Internet lagged behind for a long time. This changed on Oct. 4, 2022, when the Council of the European Union adopted the final version of the Digital Services Act ("DSA"). Together with the Digital Markets Act ("DMA"), the DSA regulates, among other things, social media platforms, search engines, online marketplaces, Internet service providers and web hosting services operating within the European Union ("EU"). The DSA aims to tackle the distribution of illegal online content (or "information") through such "intermediary services" more effectively. The DMA, for its part, focuses on ensuring fair competition in the market for online services.

January 20, 2025

 
The DSA is part of the European Digital Strategy, Shaping Europe's Digital Future. This strategy fits into a broader landscape of laws and regulations aimed at a safer online world, which also includes the AI Act, the NIS2 Directive, the Cyber Resilience Act, the DMA, the Data Act, the Data Governance Act and the General Data Protection Regulation.
This blog is part one of our three-part blog series on the DSA and provides an initial introduction.
 

Scope and purpose

The DSA has applied since Feb. 17, 2024, to digital service providers that transmit, store or disclose user-generated content. Such service providers, known as intermediary services, include (among others) (video) platforms, marketplaces, social networks, search engines, cloud providers, Internet service providers and hosting services.
 
The DSA aims to create a more secure and transparent online ecosystem, where users are better protected and digital service providers are given more and clearer responsibilities. To this end, the DSA includes rules on:
 
  • the liability of intermediary services;
  • notice and action procedures;
  • content moderation practices;
  • online advertising, profiling and targeting;
  • the use of algorithms and recommendation systems;
  • the traceability of traders;
  • the systemic risks of Very Large Online Platforms ("VLOPs") and Very Large Online Search Engines ("VLOSEs"), in addition to the introduction of a new oversight mechanism.
 
The 19 largest platforms and search engines - including Apple, Google, Meta, X, AliExpress, Snapchat and Booking.com - have been subject to stricter obligations since Aug. 25, 2023. These platforms and search engines, each with more than 45 million monthly active users within the EU, fall under the definitions of VLOPs and VLOSEs and are subject to the DSA's toughest obligations.
 

Who does the DSA apply to and what does it mean in practice?

All intermediary services offered in the EU must comply with the DSA. This is the case when an intermediary service has an establishment in the EU, has a significant number of users in a member state relative to its population, or directs its activities specifically at the EU. The mere fact that a website is accessible from the EU is not sufficient. The responsibilities and obligations of intermediary services under the DSA vary depending on their role, size and impact within the online ecosystem. The DSA takes a structured approach and applies to three categories of intermediary services: mere conduit, caching and hosting services, as defined in the Electronic Commerce Directive.
  • Mere conduit services concern the transmission of content;
  • Caching services concern the temporary storage of content for the purpose of more efficient onward transmission;
  • Hosting services involve storing content provided by users and, in the case of online platforms, also disseminating that content to the public.

Platform obligations and responsibilities

The obligations under the DSA include transparency and accountability obligations concerning how platforms moderate (illegal) content and how they handle advertising and algorithmic processes. The DSA has a tiered structure of obligations, similar to a pyramid:
 
  • the broad bottom layer of the "pyramid" contains the general obligations that apply to all brokering services;
  • the second layer contains additional obligations for hosting services;
  • the third layer focuses specifically on online platforms;
  • the top of the pyramid includes the most stringent obligations, which apply only to VLOPs and VLOSEs.
 
These obligations are cumulative: a service in a higher layer must also comply with the obligations of all layers below it.
 
Liability of service providers
 
The DSA should be seen as complementing the (more general) Electronic Commerce Directive. Under the DSA, providers of mere conduit, caching and hosting services can (still) rely on the liability exemptions previously set out in Articles 12 to 14 of the E-Commerce Directive. This so-called "safe harbor" regime provides that intermediaries are not liable for illegal content distributed through their services, provided they meet a number of conditions. The regime is now incorporated in Articles 4 to 6 of the DSA.
 
Within this regime:
 
  • Mere conduit services are not liable for the information transmitted, as long as they do not initiate the transmission, select the receiver, or select or modify the information.
  • Caching services are exempt from liability for the temporary storage of information, as long as that storage serves only to make onward transmission more efficient and the statutory conditions are met.
  • Hosting services are not liable for content uploaded by users, as long as they are unaware of the illegal content and act expeditiously on reports as soon as they learn of it.
 
Transparency in procedures
 
The DSA further contains a number of due diligence obligations, aimed in particular at protecting consumers. For example, intermediary services must clarify in advance which measures they may use to monitor and enforce compliance with their terms. Online platforms are prohibited from designing and organizing their websites and/or apps in such a way that these online interfaces contain dark patterns that mislead and manipulate users. The DSA also provides rules for VLOPs that use automated, AI-based (recommendation) systems to bring certain content to users' attention. A similar transparency obligation applies to advertising: online platforms must indicate on whose behalf and why an advertisement is shown.
 
Part 2 of this blog series, "Liability and Algorithms in the DSA," takes a closer look at the transparency obligations regarding AI for intermediary services.
 
Content moderation
 
The DSA aims to give digital service providers greater responsibility for monitoring content, particularly where it concerns disinformation that may influence public opinion, or hate speech. Content moderation is defined in Article 3(t) of the DSA and covers the activities of intermediary services aimed at detecting, identifying and addressing content shared by users. Possible measures include:
 
  • removal of content;
  • shadow ban;
  • down ranking;
  • demotion;
  • demonetization; and
  • account blocking.
 
Intermediary services must clearly communicate their content moderation policies and (algorithmic) procedures, including in their terms and conditions and in annual transparency reports. Hosting services specifically must provide their users with notice and action mechanisms that allow them to report the presence of alleged illegal content to the hosting service. When a hosting service decides to remove content, the affected user must receive a statement of reasons. Online platforms must additionally prioritize notices from "trusted flaggers" - entities with specific expertise and objectivity.
 
The above obligations require intermediary services to act once a sufficiently precise notice of illegal content has been submitted. In addition, the DSA contains obligations requiring proactive action by intermediary services in relation to illegal content. For example, online platforms must suspend users who misuse their services by repeatedly providing manifestly illegal content, and hosting services must report suspicions of certain criminal offenses to the competent authorities.
 
Part 3 of this blog series, "Unlawful Content Under the DSA," will delve deeper into the (transparency) obligations and responsibilities surrounding content moderation.
 

Monitoring and compliance

DSA oversight is organized at both the national and European levels. The European Commission is exclusively responsible for monitoring and enforcing compliance by VLOPs and VLOSEs. Member States are responsible for monitoring compliance by other intermediary services. In the Netherlands, the Authority for Consumers and Markets (ACM) has been designated as digital services coordinator for the application of the DSA and as supervisor of most of the regulation. The Dutch Data Protection Authority (Autoriteit Persoonsgegevens) supervises the provisions dealing with the processing of personal data.
 

Conclusion

The DSA strengthens the regulation of digital service providers in the EU, with the aim of creating a safer and more transparent online environment. Through specific obligations for intermediary services and strict requirements for VLOPs and VLOSEs, the DSA provides Internet users with greater protection and establishes clear responsibilities. Specific topics such as liability and content moderation will be covered in detail in the next parts of this blog series. Keep an eye on our blog: part two, in which we discuss liability, will appear soon.
AKD
