“Chat Control” faces a setback as Germany joins the opposition, forming a coalition of states capable of halting the law. The proposed legislation, formally titled the Child Sexual Abuse Regulation, would require messaging platforms such as WhatsApp, Signal, and Instagram to scan private communications for child sexual abuse material (CSAM). This Friday, all member states are expected to declare their positions on the regulation, setting the stage for the Council vote on 14 October. Even with four opinions still missing, it is already clear that a blocking minority has formed.

Council negotiations on the draft law, originally put forward by the Commission in 2022, have once again exposed sharp divisions. According to the latest data, 15 member states back the regulation, among them Denmark, Ireland, Spain, Italy, France, Hungary, Latvia, Lithuania, Malta, Portugal, Bulgaria, Croatia, and Cyprus.

But opposition has grown in recent weeks, with eight countries against the proposal: Austria, Belgium, the Czech Republic, Finland, the Netherlands, Poland, Luxembourg, and most decisively, Germany. Berlin’s decision to reject the measure has shifted the balance, creating a blocking minority capable of preventing the regulation from advancing.

“These are indeed very serious intrusions into privacy, so the question remains as to the extent of the intrusion” – German Federal Ministry of Justice

Explaining Germany’s position, a representative of the Federal Ministry of Justice highlighted the serious privacy implications of the proposed regulation, noting that “these are indeed very serious intrusions into privacy, so the question remains as to the extent of the intrusion”, and that Berlin “could not fully support the Danish position”.

So far, four governments (Estonia, Greece, Romania, and Slovenia) have not declared their positions.


Unfinished business from 2022

This regulation has a complex legislative history dating back to 2020. It began as a temporary framework allowing the voluntary detection and reporting of CSAM; in May 2022, the European Commission proposed a permanent regulation obliging service providers to identify, report, and remove illegal content, even on end-to-end encrypted platforms.

Since then, both the European Parliament and the Council have amended and debated the draft. This led to some changes, such as introducing independent audits of detection tools, scope limitations, and consultative forums for victims. Council negotiations have repeatedly stalled under successive presidencies, with compromises remaining elusive until the Danish presidency in 2025 pushed the proposal forward.

Scientific community calls regulation “not feasible”

Just days before the Council meeting, 587 researchers from 34 countries sent a joint letter to EU lawmakers warning that the regulation threatens digital privacy and security. While the letter welcomed some improvements, such as accelerated voluntary reporting, it stressed that the regulation is “not feasible” and fails to address fundamental technical and security flaws.

The scientists criticised the proposed client-side scanning, a method that examines messages on the user’s device before they are encrypted, arguing that it “undermines the functionality of E2EE by accessing the private data through the detecting mechanism and introduces a single point of failure”. Narrowing the scope to images and URLs does little to solve the problem, they add, as detection algorithms are highly unreliable.
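
To make the mechanism concrete: in its simplest form, client-side scanning compares a hash of the content against a blocklist on the user’s device before encryption takes place. The sketch below is purely illustrative; the function names and hash list are hypothetical, and real proposals involve perceptual hashing and machine-learning classifiers rather than plain SHA-256:

    import hashlib

    # Hypothetical blocklist of known-CSAM digests pushed to the device
    # (placeholder values; real systems use perceptual hashes, not SHA-256).
    KNOWN_HASHES = {"placeholder_digest_1", "placeholder_digest_2"}

    def client_side_scan(plaintext: bytes) -> bool:
        # Runs on the sender's device BEFORE encryption -- this is the
        # step the researchers say undermines end-to-end encryption.
        return hashlib.sha256(plaintext).hexdigest() in KNOWN_HASHES

    def send_message(plaintext: bytes, encrypt, report) -> bytes:
        if client_side_scan(plaintext):
            report(plaintext)       # content crosses the E2EE boundary here
        return encrypt(plaintext)   # encryption happens only after inspection

The objection is structural: whatever the detector’s quality, the inspection hook sits on the plaintext side of the encryption step, which is why the letter describes it as a single point of failure.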

“Existing research confirms that state-of-the-art detectors would yield unacceptably high false positive and false negative rates, making them unsuitable for large-scale detection campaigns” – Joint statement of scientists and researchers

The letter also warns against the law’s reliance on AI and machine learning. Algorithms cannot reliably distinguish illegal content from benign material, such as consensual teen messages or family photos, and are easy to evade: “To the best of our knowledge there is no machine-learning algorithm that can perform such detection without committing a large number of errors… and that all known algorithms are fundamentally susceptible to evasion”.
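
A rough back-of-the-envelope calculation shows why even small error rates become a problem at messaging scale. The figures below are assumptions chosen for illustration, not numbers from the letter:

    # Illustrative base-rate arithmetic; all figures are assumptions,
    # not numbers taken from the researchers' letter.
    daily_messages = 5_000_000_000   # assumed EU-wide daily message volume
    false_positive_rate = 0.001      # assumed 0.1% false-positive rate

    false_flags = daily_messages * false_positive_rate
    print(f"{false_flags:,.0f} innocent messages flagged per day")
    # -> 5,000,000 innocent messages flagged per day

Because the overwhelming majority of scanned traffic is innocent, even a seemingly low false-positive rate translates into millions of legitimate messages flagged for review every day.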

Beyond technical concerns, the letter highlights wider societal risks. Mandatory detection could expand to other types of content in the future, creating “unprecedented capabilities for surveillance, control, and censorship” and a high risk of “function creep and abuse by less democratic regimes”. 

Privacy concerns vs. child protection

A key point of contention remains client-side scanning, which critics describe as intrusive and prone to error. Digital freedom advocates warn that mandatory scanning could lead to mass surveillance, and statistics from Germany and Ireland already show high false-positive rates in automated CSAM detection.

Patrick Breyer, a digital rights activist leading the “Fight Chat Control” movement and a former Member of the European Parliament, stressed that the law cannot be applied “without breaking encryption” and that “the myth that exempting encrypted services would solve all problems has now been proven wrong”. In his view, the law threatens to end the privacy of digital correspondence in Europe.

On the other side, child protection organisations argue that the regulation is essential to combat CSAM and online grooming, noting that encryption can conceal criminal activity. The European Child Sexual Abuse Legislation Advocacy Group emphasised that the regulation is meant to protect victims and prevent further abuse, urging lawmakers to keep safeguards for children at the centre of the law.

Council vote in October

With member states having declared their official positions, attention now turns to the formal vote, scheduled for 14 October, which would adopt the Council’s negotiating mandate. That step would formally clear the way for trilogue discussions with the European Parliament and the European Commission, where differences and amendments can be addressed.