The European Union awaits a decisive vote on the “Chat Control” regulation, formally known as the Child Sexual Abuse Regulation (CSAR). The law would mandate the scanning of digital communications to detect child sexual abuse material, raising concerns over mass surveillance and the privacy of European citizens.
From temporary measure to permanent law
The regulation has a long and contested history dating back to 2020. It began as a temporary measure allowing the voluntary scanning of child sexual abuse material (CSAM), which the European Commission moved to make permanent in May 2022. The permanent version would obligate companies to detect, report and remove illegal content, even on services that use end-to-end encryption, raising privacy and security concerns.
Since then, the European Parliament and the Council have continued to shape the regulation. The European Parliament has amended the draft to strengthen privacy protections, introducing independent audits for detection tools, limiting the scope of scanning, and creating a Victims’ Consultative Forum.
Meanwhile, Council negotiations to pass the law repeatedly stalled under previous presidencies, with Belgium and Poland unable to reach a compromise. After Denmark assumed the presidency in July, declaring online protection for minors a priority, its representatives pushed the proposal toward adoption. In the next phase, the Council is to finalise its position on 12 September and vote on the proposal in mid-October.
Member states divided
Council negotiations revealed a split among member states. Fifteen countries have backed the Danish compromise, while seven member states remain undecided, among them Germany. A smaller group of states actively opposes the law: Austria, Poland, the Netherlands and, most recently, Belgium and the Czech Republic.
Czech Prime Minister Petr Fiala expressed his position in a post on X last Tuesday, stressing that Prague would not support the monitoring of citizens’ private correspondence. “Protecting our children is important,” he wrote, “but we must achieve it differently. Not in a way that means breaching the privacy of millions of people.”
Messages scanned before sending
A key controversial aspect of the regulation is client-side scanning, which means that messages, images, videos, or files are examined on a user’s device before being sent. For example, a photo shared over WhatsApp could be scanned against a database of prohibited material and flagged for review by service providers or authorities even before being encrypted.
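The mechanism can be illustrated with a minimal sketch. Note the simplifications: deployed systems such as PhotoDNA use perceptual hashes that survive resizing and re-compression, whereas this hypothetical example uses a plain SHA-256 digest and invented function names purely to show where the check sits relative to encryption.

```python
import hashlib

# Hypothetical database of digests of known prohibited material,
# distributed to the device by the service provider.
KNOWN_HASHES: set[str] = set()

def client_side_scan(attachment: bytes, known_hashes: set[str]) -> bool:
    """Return True if the attachment matches the database.

    Crucially, this check runs on the sender's device *before* the
    message is end-to-end encrypted, which is what critics object to.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in known_hashes

def send_message(attachment: bytes, known_hashes: set[str]) -> str:
    if client_side_scan(attachment, known_hashes):
        return "flagged for review"   # reported before encryption
    return "encrypted and sent"       # normal end-to-end encrypted path
```

The point of contention is the ordering: because the scan precedes encryption, the encryption no longer guarantees that only sender and recipient ever inspect the content.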
Critics warn that this approach carries significant risks. Patrick Breyer, a former Member of the European Parliament and a vocal opponent of the “Chat Control” proposal, noted that the automated technology is error-prone, claiming that under this method German authorities received over 99,000 false reports concerning private chats and photos in 2024. Data from Ireland shows a similar pattern: only 852 of 4,192 automated CSAM reports in 2022 involved illegal content, highlighting the high rate of false positives.
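The Irish figures can be turned into a precision estimate directly; this short calculation only restates the numbers cited above and computes the share of automated reports that turned out to be wrong.

```python
total_reports = 4_192   # automated CSAM reports, Ireland, 2022
confirmed = 852         # reports that actually involved illegal content

precision = confirmed / total_reports        # share of reports that were correct
false_report_share = 1 - precision           # share of reports that were wrong

print(f"precision:          {precision:.1%}")          # about 20.3%
print(f"false report share: {false_report_share:.1%}") # about 79.7%
```

In other words, roughly four out of five automated reports in that dataset did not involve illegal content, which is the pattern critics point to.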
Some industry players, such as Tutanota, a German tech company, described the proposal as potentially leading to an “Orwellian world,” arguing that mandatory scanning could enable the mass surveillance of private communications.
The company highlighted that the exemption for government and military accounts raises concerns about fairness and accountability. “If Chat Control is passed, opaque AI algorithms will decide whether your personal messages and private pictures are flagged or leaked. This undermines online privacy for over 400 million EU citizens,” the firm stated.
Concerns over mass surveillance
Civil society organisations, cybersecurity experts, and privacy-focused groups have consistently opposed the regulation. Campaigns such as Stop Scanning Me and Fight Chat Control urge citizens to take action against the proposal, warning that mandatory scanning could lead to mass surveillance and weaken online privacy.
Patrick Breyer is concerned that the draft regulation could cover the surveillance of private communications. He argues that the Commission seeks to “oblige providers to search all private chats, messages, and emails automatically for suspicious content – generally and indiscriminately.” While the stated aim is to combat child sexual exploitation, Breyer argues the outcome would be “mass surveillance by means of fully automated real-time surveillance of messaging and chats and the end of privacy of digital correspondence”.
The result will mean mass surveillance by means of fully automated real-time surveillance of messaging and chats and the end of privacy of digital correspondence – Patrick Breyer, Digital Freedom Activist
The European Data Protection Board issued a statement drawing attention to provisions in the proposal that could allow for “general and indiscriminate monitoring of private communications”. The Board also criticised the inclusion of detection orders for new CSAM, noting that the “high error rates of these technologies are still concerning”, and urged lawmakers to ensure that any final regulation fully respects fundamental rights to privacy and data protection.
Child safety first
Child protection advocates argue that the regulation is essential to detect and prevent online abuse, noting that encryption can conceal criminal activity. More than 50 children’s rights NGOs represented by ECLAG (European Child Sexual Abuse Legislation Advocacy Group) welcomed the regulation, stating that “known, unknown CSAM and grooming must all be legislated upon in order to ensure no children are left unprotected online.”
At the core of this regulation is protecting victims from further victimisation and trying to prevent this crime from happening. – ECLAG
ECLAG praised the rapporteur’s emphasis on risk assessment, mitigation measures, and the inclusion of a Victims’ Consultative Forum, while also defending the role of voluntary detection as “a vital part of the existing ecosystem”. At the core of the proposal, the coalition underlined, is “protecting victims from further victimisation and trying to prevent this crime from happening”. It urged lawmakers to ensure that safeguards for children remain at the centre of the final regulation.
Next steps
The Council will formalise its position on 12 September and plans to hold a vote on 14 October, marking a decisive phase in the legislative process. If the Council approves the Danish compromise, trilogue negotiations with the European Parliament and the Commission will begin in early 2026.