PONT Data&Privacy
New ground rules for platform power and media protection with Article 18 EMFA

Online platforms such as X, Facebook and YouTube increasingly determine how we consume and distribute news. They give media companies access to a large audience, but also determine which content remains visible or disappears. In doing so, they gain a grip on public discourse. When platforms suppress or remove journalistic content, media freedom comes under pressure.

Lex Digitalis July 21, 2025


To protect media from this influence, the European legislator has set rules: the European Media Freedom Act (EMFA). An important part of this is the so-called "media privilege," designed to give recognized media, so-called media service providers (MSPs), additional protection against unwarranted content removal by large platforms. Examples include a requirement for platforms to issue a warning first and the ability for media to file a defense. This article opens a triptych on the implications of Article 18 EMFA for media freedom. The first article outlines the context in which the media privilege arose: the increasing power of online platforms over public debate and the need for European protection of journalistic content. The second article then analyzes how the limited access to the privilege, which applies only to recognized MSPs, relates to broader human rights standards and which actors are excluded as a result. Finally, the third article examines whether the rights for MSPs and the obligations for platforms actually help protect media freedom in practice.

Platforms determine who gets heard

Online platforms have become central players in the media landscape, influencing how information is disseminated and how audiences interact with it.[1] Although platforms expand the reach of media content[2], moderation practices of platforms give them control over the visibility and dissemination of information.[3] The dual role of platforms as both gateway and gatekeeper[4] within the information system has implications for media freedom, which increasingly depends on a forum for public debate.[5]

European protection of media freedom

Freedom of expression, and more specifically media freedom - the ability of media outlets to report and disseminate information without censorship or undue interference - is protected under Article 11 of the EU Charter of Fundamental Rights (the Charter).[6] This provision corresponds to, and has the same meaning as[7], Article 10 of the European Convention on Human Rights (hereinafter ECHR)[8], which encompasses freedom of expression and the media freedom that flows from it.[9] This protection also extends to individuals who disseminate information that may hurt, shock or disturb.[10] Such information can conflict with platform policies, leading to reduced visibility of content and thus limiting media debate. This hampers the ability of media outlets to exercise their right to media freedom, which is of great importance given the time-sensitive value of news.[11]

Twitter (but actually Musk) vs. journalists

The tension between content moderation policies and media freedom was highlighted when Twitter suspended journalists' accounts for alleged violation of its terms of use[12], underscoring the importance of regulation to protect media freedom. In December 2022, Twitter, under the leadership of Elon Musk, suspended without warning the accounts of several journalists from, among others, The New York Times, CNN and The Washington Post. The reason: they allegedly linked to @ElonJet, an account that tracked the location of Musk's private jet using public data. Although many did not share exact location information, Musk accused them of doxxing. The action sparked widespread outrage, including from press freedom organizations and EU politicians, who spoke of a dangerous curtailment of journalistic freedom.

'Tweaking' recommendation algorithms

The tension between content moderation policies and media freedom is also reflected in the operation of recommendation systems and internal moderation guidelines on social media platforms. These systems and rules are not neutral tools, but reflect design choices, interests and norms of the platform. For example, in 2016, Facebook decided to scale back the influence of the "Angry" response on its recommendation algorithm because this response more often led to the spread of sensational and misleading content. Instead of removing content immediately, Facebook adjusted the definition of "relevance," making certain (news) posts less visible.

'Someone should shoot Trump'

Moreover, internal moderation guidelines from platforms have been leaked on several occasions, showing how complex considerations are reduced to simple rules of thumb. For example, Facebook moderators learned from leaked manuals that the statement "Someone should shoot Trump" should be treated as a violent threat, while "Let's beat up fat kids" was not. Other leaked documents showed that Facebook considered people to be public figures if they had been mentioned five times in the media, and that it had to escalate content such as maps of Kurdistan or criticism of Atatürk under pressure from Turkey. These examples illustrate how normative choices and external political pressures can permeate content moderation, with direct consequences for the visibility of journalistic or activist expressions.[13]

Abortion pill out of the picture

Finally, a recent example concerns the moderation of accounts of abortion pill providers on Facebook and Instagram.[14] In January 2025, organizations such as Aid Access, Women Help Women and Just the Pill reported that their posts had been blurred or deleted, and their accounts temporarily suspended or made invisible in search results. Meta claimed that the actions were related to rules surrounding the sale of medicines without certification, but also acknowledged "over-enforcement." The incidents show how content about socially relevant medical information, here reproductive care, comes under pressure from platform decisions. Although these providers are not licensed media service providers, the example illustrates the risk of excluding important information from public discourse. It reinforces calls for structural safeguards against arbitrary content moderation, such as the media privilege for MSPs in Article 18 EMFA.

What does Article 18 EMFA regulate?

The European Media Freedom Act (EMFA)[15] introduces under Article 18 a so-called "media privilege" to protect media freedom. It aims to give media service providers (MSPs)[16] protection against unwarranted content removal by Very Large Online Platforms (VLOPs)[17], for example through advance notice, a 24-hour response time and dispute resolution mechanisms.

Conclusion

Article 18 EMFA recognizes the power of platforms and offers recognized media service providers protection against unjustified content moderation. At the same time, the media privilege is limited to a select group, leaving other media actors out of the picture. This raises questions about the equal protection of media freedom in the digital domain. The following two articles explore the implications of this delineation and the extent to which Article 18 EMFA contributes to protecting media freedom.

For your information: research approach and delineation

This article examines what effect the media privilege in Article 18 EMFA has on media freedom. It focuses on the rights this privilege grants to media service providers (MSPs), the obligations it imposes on very large online platforms (VLOPs), and the consequences of applying the privilege exclusively to MSPs. In this first article, I analyze how this limited access to the privilege affects the media freedom of other actors. The second article shows the extent to which rights for MSPs and obligations for VLOPs actually protect media freedom. This research is based on an analysis of the EMFA, the Charter, the ECHR, relevant case law, Committee recommendations and literature. It also examined key judgments with relevance level 1 in the ECtHR's HUDOC database.

This article focuses on the core of Article 18 EMFA: the media privilege and the consequences of limiting it to MSPs. It excludes broader procedural aspects and transparency obligations, such as additional self-declarations by MSPs and annual disclosures by VLOPs, as these do not directly affect media freedom. Finally, it is assumed that MSPs meet the obligations and responsibilities required to claim a high degree of freedom-of-expression protection.

[1] Delfi AS v. Estonia (2015), ECHR 64569/09, §110.
[2] Cengiz and Others v. Turkey (2015), ECHR 48226/10 and 14027/11, §52.
[3] Leerssen, P. (2015). Cut Out By The Middle Man: The Free Speech Implications Of Social Network Blocking and Banning In The EU. Journal of Intellectual Property, Information Technology and Electronic Commerce Law, 6(2), 99-119, p. 2. <https://www.jipitec.eu/issues/jipitec-6-2-2015/427>
[4] Leerssen, P. J. (2020). The Soap Box as a Black Box: Regulating Transparency in Social Media Recommender Systems. European Journal of Law and Technology, 11(2), p. 2. <https://ejlt.org/index.php/ejlt/article/view/786>
[5] OOO Regnum v. Russia (2020), ECHR 22649/08, §67; McGonagle, T. (2013). How to address current threats to journalism? The role of the Council of Europe in protecting journalists and other media actors. (MCM; No. (2013)009). Republic of Serbia, Ministry of Culture and Information, p. 24. <http://www.coe.int/t/dghl/standardsetting/media/Belgrade2013/How%20to%20address%20current%20threats%20to%20journslism%20-%20MCM(2013)009_en_Report_McGonagle.pdf>
[6] European Union, Charter of Fundamental Rights of the European Union [2000] OJ C364/01 ("the Charter").
[7] Article 52(3) EU Charter.
[8] Council of Europe, European Convention on Human Rights, ETS No. 5, as amended by Protocols Nos. 11 and 14 ("ECHR").
[9] McGonagle, p. 24.
[10] Handyside v. U.K. (1976), ECHR 5493/72, §49.
[11] Observer & Guardian v. the United Kingdom (1991), ECHR 13585/88, §60.
[12] Olander, O., Schreckinger, B., & Kern, R. (2022, December 15). Twitter suspends journalist accounts without explanation, angering lawmakers and those affected. POLITICO. <https://www.politico.com/news/2022/12/15/twitter-suspends-journalists-musk-00074261>
[13] Keller, D., & Leerssen, P. Facts and where to find them: empirical research on internet platforms and content moderation. In N. Persily & J. A. Tucker (Eds.), Social media and democracy: the state of the field, prospects for reform (pp. 220-251), p. 236.
[14] Cain Miller, C., Conger, K., & Isaac, M. (2025, January 23). Instagram and Facebook Blocked and Hid Abortion Pill Providers' Posts. The New York Times. <https://www.nytimes.com/2025/01/23/technology/instagram-facebook-abortion-pill-providers.html>
[15] Regulation (EU) 2024/1083 of the European Parliament and of the Council of 11 April 2024 establishing a common framework for media services in the internal market and amending Directive 2010/13/EU (European Media Freedom Act) ("EMFA").
[16] See Chapter 1.
[17] Article 33 DSA.
[18] Handyside v. U.K. (1976), ECHR 5493/72, §49.
