Major tech companies will continue scanning private communications for child sexual abuse material despite the expiry of the EU’s temporary legal framework. Yet the European Commission has stayed silent on whether the practice complies with EU law.

Speaking at the Commission’s midday briefing on Tuesday, the spokesperson for home affairs repeatedly avoided answering whether ongoing detection by companies such as Google, Meta, Microsoft, and Snap remains lawful.

Instead, the spokesperson stressed that “proactive detection by companies is essential to protect children in the EU and beyond,” adding that “the protection of our children should not be subject to autonomous business decisions made by companies.”

Legal uncertainty, but scanning continues

The legal uncertainty follows the expiration on 3 April of the so-called “Chat Control 1.0” derogation, which had allowed companies to voluntarily scan private communications for known abuse material.

Just one day later, Google, alongside Meta, Microsoft, and Snap, issued a joint statement reaffirming their intention to continue these practices. “As EU institutions continue to negotiate an immediate, interim solution and durable framework, signatory companies (Google, Meta, Microsoft, and Snap) reaffirm their continued commitment to protecting children and preserving privacy, and will continue to take voluntary action on our relevant Interpersonal Communication Services”, the companies said in a press release published on 4 April.


The companies framed the expiry as a risk to child safety, warning that children in Europe could become “less protected from the most abhorrent harm” and blaming EU institutions for failing to agree on an extension.

A gap the EU failed to close

The European Parliament rejected a last-minute attempt on 26 March 2026 to extend the rules, just days before their expiry on 3 April. Talks with member states collapsed after 11 March, when MEPs backed a compromise that would have prolonged the framework until August 2027, but with stricter limits: detection would have to be targeted at specific suspects or groups, and scanning of end-to-end encrypted messages would be banned.

With no agreement in place, the EU now faces a regulatory gap, one that companies are filling on their own terms. The European Commission has yet to clarify whether such practices comply with EU law, even as negotiations continue on “Chat Control 2.0”, the proposal that would replace the temporary framework with a permanent one.