The proposed Regulation to Prevent and Combat Child Sexual Abuse has been a political and legal hot topic since its introduction by the European Commission in 2022. Discussions within the Council of Ministers of the European Union are currently stalled because of disagreements among several member states over the proposal's controversial privacy implications.

Friday, Sept. 28, could have been an important day for the future of European citizens' privacy in electronic communications. The Justice and Home Affairs configuration of the Council of Ministers ("the Council"), the EU's co-legislator, originally had on its agenda one of the most controversial legislative proposals ever introduced by the European Commission: the "Regulation on preventing and combating sexual abuse of children."
The Council, currently led by the Spanish presidency, was to adopt its institutional position on the proposal before negotiating it with the European Parliament. PONT Data & Privacy has learned from internal sources within the Council that the proposal has been removed from Friday's agenda (1). This decision was reportedly made because of the controversial nature of some of the bill's provisions; the ministers decided to take additional time to consider adjustments to the text. German news organizations netzpolitik.org and heise.de have since reported the same (2). It is not clear whether the Council will resume the debate at its next meeting, expected around Oct. 20.
The Council's hesitation is no surprise. The proposed regulation has generated debate throughout the Union because of the delicate interests at stake. The proposal aims to create a legal framework for combating (online) child sexual abuse based on two pillars. First, it imposes various obligations on online service providers to detect, report, remove and block child sexual abuse material (CSAM) and the solicitation of children ("grooming") on their services. Second, it creates an EU Child Sexual Abuse Center that would support law enforcement agencies in reviewing content reported by service providers. That first pillar has come under fire in recent months for its potentially heavy intrusion into citizens' privacy.
According to the draft text, the service providers in scope, including interpersonal communication services (such as WhatsApp, Telegram and Signal), are required to assess and mitigate the risk that their services may be used for online child sexual abuse, and to report that risk to the competent authorities. In turn, competent authorities may request a court or an administrative agency to order that surveillance technologies be installed and operated to detect CSAM on those services. Such detection orders may only be issued when there is a "significant risk" that the service will be used to disseminate CSAM, and when the interests the order seeks to protect outweigh the negative consequences for the rights and interests of all parties affected. The proposal provides a number of safeguards to ensure that any interference with privacy is limited to what is "strictly necessary" in terms of the measures taken, the scope of the risk and the duration of the order. Providers are also required to embed such safeguards in their internal procedures, for example by ensuring human oversight of detection activities.
The EU Home Affairs Commissioner presented the proposal as an urgent step to prevent and combat child abuse that nevertheless takes into account the privacy of European citizens, including children. The proposal has nonetheless drawn criticism from all sides over the privacy implications of the detection framework. The European Data Protection Supervisor ("EDPS"), for example, warns that the regulation "could become the basis for de facto generalized and random scanning of the content of virtually all types of electronic communications of all users in the EU/EEA" (3). The European Parliament's LIBE Committee has questioned the effectiveness of the proposal as a matter of principle, pointing to inherent weaknesses in the problem statement and in the (technological) approach proposed for prosecuting offenders. According to the LIBE Committee, any general obligation of data retention or surveillance (currently prohibited under EU law) would be illegitimate in light of the objectives pursued (4).
The European Parliament's rapporteur for the legislation has prepared a report containing nearly two thousand amendments. Yet even that report would merely "restrict," rather than rule out, the monitoring of encrypted conversations for CSAM. A study by the Institute for Information Law (IViR) in Amsterdam argues that the proposed measures are disproportionate and ill-suited to the stated purpose, in violation of the EU Charter of Fundamental Rights (5). Further criticism has come from a wide range of stakeholders, including civil society, child protection experts, technologists, national authorities and the technology sector (6).
This is not the first time a European Commission bill has been widely criticized as mass surveillance. The infamous Data Retention Directive, adopted in 2006, met with similar opposition; the Court of Justice of the European Union later annulled it, in 2014 (7). The adoption of the "Regulation on addressing the dissemination of terrorist content online," which requires hosting providers to comply with removal orders for terrorist content within one hour, also raised concerns, albeit to a lesser extent. None of these laws, however, ever targeted such an indeterminate range of citizens across the whole of their private communications, and on the basis of such a low threshold, as the proposed regulation does.
Controversy within the Council of Ministers is therefore to be expected. After the last round of negotiations in July, the Spanish Council presidency drafted a compromise text, which received the support of nine member states. Other representatives, including those of Germany, Austria and Poland, remain too concerned about the fundamental rights implications of the anticipated measures, such as the scanning of encrypted communications (8). A blocking minority of at least four member states representing more than 35 percent of the EU population would suffice to prevent the Council from approving the proposal. Negotiations will continue.
Meanwhile, the UK Parliament is likely to pass a similar law in the near future, despite international criticism, although that bill is aimed primarily at online platforms (9). Now it is the EU legislature's turn.
(1) https://www.consilium.europa.eu/en/meetings/jha/2023/09/28/
(2) https://www.heise.de/news/Widerstand-aus-Deutschland-Abstimmung-im-EU-Rat-zur-Chatkontrolle-geplatzt-9310335.html; https://netzpolitik.org/2023/internes-protokoll-eu-rat-verschiebt-abstimmung-ueber-chatkontrolle/
(3) The maximum duration of a detection order is 24 months for known CSAM and 12 months for grooming.
(4) https://edpb.europa.eu/system/files/2022-07/edpb_edps_jointopinion_202204_csam_en_0.pdf
(5) https://www.europarl.europa.eu/RegData/etudes/STUD/2023/740248/EPRS_STU(2023)740248_EN.pdf
(6) https://www.ivir.nl/publicaties/download/CSAMreport.pdf
(7) https://edri.org/our-work/most-criticised-eu-law-of-all-time/
(8) http://www.vorratsdatenspeicherung.de/images/DRletter_Malmstroem.pdf
(9) https://netzpolitik.org/2023/internes-protokoll-eu-staaten-starten-endspurt-zur-chatkontrolle/
(10) https://www.gov.uk/government/news/britain-makes-internet-safer-as-online-safety-bill-finished-and-ready-to-become-law
