PONT Data&Privacy


How the DSA forces large platforms to be transparent about their algorithms

The advertisements and content that users see on online platforms often match their interests and previous search behavior seamlessly. This is the result of algorithmic profiling: the automated analysis of user behavior to make personalized recommendations. The Digital Services Act (DSA) requires online platforms to make the use of algorithms clear and transparent, among other things.

12 December 2025

Recently, the Amsterdam District Court ruled that Meta does not comply with these transparency obligations. It found that Instagram and Facebook users do not have sufficient freedom of choice to use a timeline that is not based on algorithmic profiling. Furthermore, the automatic switch back to a profiling recommendation system was classified as a misleading design choice (also known as a 'dark pattern'), which is prohibited under the DSA. X also recently violated its transparency obligations, among other things by providing insufficient insight into advertisements and by obstructing access to data, which prompted the European Commission ("EC") to impose a fine of €120 million.

These examples emphasize that large online platforms, which are used by millions of people worldwide every day, can be held accountable for complying with their (transparency) obligations. In this blog, we discuss a number of obligations under the DSA for online platforms and search engines (so-called "intermediary services"), with a particular focus on (algorithmic) transparency. In addition to the DSA, the Digital Markets Act ("DMA") is also relevant, with a focus on competition in digital markets. For a broader introduction to the DSA and DMA, please refer to our previous blog series (see parts 1, 2, and 3).

Algorithmic transparency

Online platforms such as Instagram, TikTok, and X base their services largely on user data: users provide content and interactions, after which (the underlying algorithmic systems of) online platforms determine how this content is displayed. Initially, content is shown from profiles that the user has subscribed to or follows. User profiles are then systematically built up based on various parameters, including clicks, likes, viewing time, and search queries. The age of the account and location preferences are also personalization parameters. Based on this data analyzed by algorithms, the systems generate recommendations for each individual user and determine which video, post, or advertisement that user will see on the online platform. The longer a user is active, the better their user profile can be personalized.
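To make the mechanism above concrete, here is a minimal sketch of how engagement signals can be combined into a per-item ranking score. The signal names, weights, and boost factor are assumptions invented for illustration; no real platform's algorithm is reproduced here.

```python
from dataclasses import dataclass

# Hypothetical illustration of profiling-based recommendation ranking.
# All parameter names and weights below are assumptions for the sketch.

@dataclass
class Interaction:
    clicks: int
    likes: int
    watch_seconds: float
    matches_search_history: bool

def relevance_score(i: Interaction) -> float:
    """Combine engagement signals into a single predicted-interest score."""
    score = 1.0 * i.clicks + 2.0 * i.likes + 0.01 * i.watch_seconds
    if i.matches_search_history:
        score *= 1.5  # boost content resembling past search behavior
    return score

def rank_feed(candidates: dict[str, Interaction]) -> list[str]:
    """Order candidate items from highest to lowest predicted interest."""
    return sorted(candidates, key=lambda c: relevance_score(candidates[c]),
                  reverse=True)

feed = rank_feed({
    "cat_video": Interaction(3, 1, 40.0, True),
    "news_post": Interaction(1, 0, 10.0, False),
})
print(feed)  # items the profile predicts as most engaging come first
```

The point of the sketch is the feedback loop the article describes: every click or like feeds back into the score, so the longer a user is active, the more the ranking converges on their inferred interests.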

In short, the DSA requires online platforms to be transparent about how their services work, including the use of underlying (algorithmic) systems. Among other things, platforms must provide users with insight into how advertisements are displayed, how profiling takes place, and how recommendation systems function. In addition, the DSA sets requirements for how this information is presented in the terms and conditions. The largest online platforms and search engines (Very Large Online Platforms ("VLOPs") and Very Large Online Search Engines ("VLOSEs")) are subject to additional obligations under the DSA. For example, when using algorithmic systems on their online platforms, they are required to assess the resulting systemic risks, particularly when these could have a significant societal impact, such as influencing elections or negatively affecting the mental health of young people. Identified risks must then be mitigated through appropriate measures, including adjustments to systems or terms and conditions.

The EC provides an overview of the (already) designated VLOPs and VLOSEs on its website. VLOPs include Amazon, Apple, Meta, TikTok, X, and Zalando. Google and Microsoft are VLOSEs.

General conditions

Article 27(1) and (2) of the DSA requires online platforms to clearly explain in their terms and conditions which core parameters their algorithmic recommendation systems use, how users can influence these settings, and why certain parameters are relevant to specific users. In addition, Article 14(1) to (3) of the DSA stipulates that the terms and conditions of intermediary services must explain in a clear, understandable, and accessible manner (i) what rules and restrictions apply to user-provided content, (ii) how and on the basis of which mechanisms content is moderated, and (iii) what options users have to submit complaints about the functioning of the platform or compliance with the DSA. VLOPs and VLOSEs have the additional obligation to provide a concise, machine-readable summary of their terms and conditions (Article 14(5) DSA) and to make this available in all official languages of the EU Member States in which they operate (Article 14(6) DSA).
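As an illustration of the machine-readable summary required by Article 14(5) DSA, the sketch below serializes the key disclosures as JSON. The DSA does not prescribe a specific schema, so every field name here is an assumption chosen for readability, not an official format.

```python
import json

# Hypothetical sketch of a machine-readable summary of terms and
# conditions under Article 14(5) DSA. The schema is an assumption;
# the DSA does not mandate a particular format.
summary = {
    "service": "ExamplePlatform",  # fictional service name
    "languages": ["nl", "en", "de"],  # Art. 14(6): languages of the Member States of operation
    "content_rules": "No illegal content; see full terms, section 3.",
    "moderation": {
        "mechanisms": ["automated filtering", "human review"],
        "complaints": "Internal complaint-handling system available to users.",
    },
    "recommender_parameters": ["follows", "clicks", "likes", "watch time"],
}
print(json.dumps(summary, indent=2))
```

A structured summary like this would let regulators, researchers, and comparison tools parse the disclosures programmatically instead of reading the full legal text.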

Although the terms and conditions of online platforms are intended to provide clarity for (business) users about the content on those platforms, in practice that transparency is still hard to find. A recent NRC article by Merve Özdemir on the terms and conditions of major platforms shows that users who want to know what online platforms and apps do with their data have to spend around eight working days reading terms and conditions that are often difficult to understand. For five of the most popular apps in the Netherlands (NS, Albert Heijn, Facebook, Messenger, and Instagram), it takes about 2.5 hours per app to read all the terms and conditions in full, because these apps aim to inform their users exhaustively. However, such an overload of information sits uneasily with the General Data Protection Regulation, which requires that information be provided in a concise, transparent, and easily accessible form.

Gatekeepers under the DMA

Many designated VLOPs and VLOSEs, including Amazon, Apple, Meta, and Microsoft, have also been designated as "gatekeepers" under the DMA. Gatekeepers are digital platforms that, due to their substantial economic size and large user base, hold a sustainable and dominant position in the provision of core platform services within the EU. The DMA aims to limit the dominant position of gatekeepers and ensure fair competition. For example, the DMA contains obligations that directly affect the use of data and algorithms, such as restrictions on the combination and reuse of personal data for advertising. Furthermore, gatekeepers may only offer personalized services when users have a genuine, effective choice between personalized and less personalized alternatives. This choice must be presented in a neutral and non-manipulative manner, which the DMA explicitly links to the broader DSA standard against misleading design choices (dark patterns).

Supervision

In the Netherlands, the Netherlands Authority for Consumers and Markets (ACM) and the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP), together with the EC, supervise compliance with the DSA and the DMA.

Even before the DSA came into force, the ACM called on large online platforms, including Airbnb, AliExpress, Apple, Bol.com, Booking.com, Expedia, Facebook, and Google, to improve their information provision and general terms and conditions. However, the primary supervisory task for compliance with the DSA by VLOPs and VLOSEs lies with the EC. The EC actively enforces the DSA: since 2023, it has been initiating proceedings against VLOPs and VLOSEs on an almost monthly basis regarding compliance with the DSA. On October 31, 2024, the EC initiated proceedings against Temu regarding the operation of recommendation systems, the first (and, as far as is known, so far only) proceedings of that kind.

As demonstrated by the multi-million euro fine imposed on X, the EC's sanctioning powers are considerable. In the event of a breach of DSA obligations, it can impose fines of up to 6% of global annual turnover and periodic penalties of up to 5% of average daily global turnover for each day of delay in complying with measures or commitments. If a serious infringement persists and poses risks to life or safety, the EC may, after following a formal procedure, request the temporary suspension of the service. The EC's website provides an overview of enforcement actions against online platforms, including those based on the DSA.
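The scale of these ceilings is easy to underestimate. The back-of-the-envelope calculation below applies the 6% and 5% percentages from the paragraph above to an assumed turnover figure; the €10 billion annual turnover is a made-up example, not any platform's actual revenue.

```python
# Illustrative arithmetic for the DSA sanction ceilings described above:
# fines up to 6% of global annual turnover, and periodic penalties up to
# 5% of average daily global turnover per day of non-compliance.
# The turnover below is an assumed example figure.

annual_turnover_eur = 10_000_000_000  # assumption: €10 billion global annual turnover

max_fine = 0.06 * annual_turnover_eur                    # 6% ceiling on a fine
daily_penalty_cap = 0.05 * (annual_turnover_eur / 365)   # 5% of average daily turnover

print(f"Maximum fine: €{max_fine:,.0f}")                 # €600,000,000
print(f"Maximum periodic penalty per day: €{daily_penalty_cap:,.0f}")
```

Even for this modest assumed turnover, the fine ceiling reaches €600 million and the daily penalty cap roughly €1.4 million, which puts the €120 million fine against X into perspective.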

In conclusion

Thanks to their enormous user data, online platforms have a dominant position in digital markets, enabling them to control the experience of individual users through algorithmic systems. The DSA ensures that this power does not remain unchecked: both regulators and users can hold platforms accountable for their transparency obligations and enforce compliance, for example through civil court proceedings or sanctions imposed by the EC.

Algorithmic transparency is not a voluntary effort, but an enforceable standard. Online platforms have a clear responsibility to explain how their algorithmic systems work, offer users genuine choices, and prevent these choices from being undermined by misleading design choices or complex conditions. For VLOPs and VLOSEs, this means that they must not only develop robust algorithms, but also invest in understandable information, accessible settings, and a design that respects user autonomy. Anyone who considers transparency to be merely a checkbox in the terms and conditions runs a real enforcement risk under the DSA.

AKD
