What is chat control and which jurisdictions are implementing it?

Checked on December 20, 2025

Executive summary

Chat Control is the shorthand used by critics for the European Union's proposed Regulation to Prevent and Combat Child Sexual Abuse (CSAR), a legal package that would obligate digital service providers to detect and report child sexual abuse material and that has at times been framed to include mandatory scanning of private, including encrypted, communications [1] [2]. The proposal has driven an intense political and technical fight throughout 2024–2025: Denmark's Council presidency led negotiations, governments split between support, opposition and "undecided," and late-2025 revisions removed or softened the earlier mandatory-scanning language even as debate continues [3] [4] [5].

1. What “Chat Control” actually means: a regulation to force detection and reporting

At its core, the CSAR proposal would create an EU-wide legal duty for providers to detect, report and help remove child sexual abuse material: measures that proponents say are necessary to protect children online and that critics say would legalise widespread automated surveillance of private communications [1] [2]. Early drafts centred on a provision that would have amounted to mass client-side scanning of messages and files (including images), an approach digital-rights groups branded "Chat Control" because it would require content to be scanned before or during encryption [3] [2].

2. Technical and legal flashpoints: encryption, client‑side scanning and precedent

Security experts, companies and the European Data Protection Supervisor warned that client-side scanning effectively weakens end-to-end encryption and creates new security vulnerabilities, and the European Court of Justice has previously held that permanent automatic analysis of private communications infringes fundamental rights; that tension has been central to opponents' arguments [3] [5]. Technology firms and cryptographers argued that mandated scanning would force providers either to break encryption or to build detection into devices and apps in ways that create exploitable backdoors [5] [3].

3. Who is pushing and who is pushing back inside the EU

The CSAR was tabled by the European Commission in May 2022 and became a major Council file under the Danish presidency in 2025, which pushed for rapid agreement and initially for stronger mandatory-scanning clauses [1] [3]. Member states have shifted stances: dozens reportedly supported the Danish draft at points, but others, including Germany after its change of government and several civil-liberties-minded countries and MEPs, have publicly opposed mandatory measures, producing a patchwork of backing, opposition and "undecided" positions ahead of multiple vote deadlines in 2025 [4] [6] [7].

4. The legislative rollercoaster in 2025: votes, shelving and edits

Throughout 2025 the proposal faced several procedural pivots: planned Council votes in October were delayed or became contentious, the Danish presidency softened its text, dropping the explicit mandatory-scanning clause to unlock agreement, and late-November drafts reportedly removed the mandatory-scanning wording, though critics say other clauses create de facto pressure on companies to scan [8] [9] [10]. Some reporting describes the November 26 Council text as an agreement that avoids forcing apps like Signal and WhatsApp to scan messages directly, while other sources say the new wording leaves "voluntary" or "risk mitigation" hooks that could functionally compel scanning by other means [9] [10] [5].

5. Which jurisdictions are implementing or poised to implement chat‑style rules?

Available reporting and advocacy trackers focus almost exclusively on the EU and its member states: the proposal is an EU regulation that, if adopted, would create obligations across the EU while leaving significant scope for national implementation choices, and late-2025 edits reportedly let individual member states decide whether to deploy scanning measures, effectively creating a patchwork rather than a uniform mandate [1] [10]. The sources reviewed here do not document implementations outside the EU; they do note a 2021 temporary derogation that allowed some scanning under EU rules, and that individual EU governments (and companies) might implement measures differently depending on national decisions [1] [10].

6. Where the debate stands and the open questions

As of late 2025 the legislative outcome remains contested and legally uncertain. Council negotiations have moved away from explicit mandatory client-side scanning, but critics warn that "voluntary" or "risk mitigation" language and enforcement levers (fines, market access) could produce the same practical pressure on platforms to scan, while supporters insist the changes balance child protection and encryption [5] [10] [9]. The material covered here does not settle whether a final, enforceable regime will require scanning inside end-to-end encrypted chats at scale; it shows only that the EU process has shifted, that member states are divided, and that technical and rights-based objections continue to shape negotiations [5] [4] [3].

Want to dive deeper?
What technical alternatives to client‑side scanning exist for detecting CSAM without weakening end‑to‑end encryption?
Which EU member states have publicly opposed or supported the CSAR/Chat Control at each key vote in 2025?
How have technology companies like Signal and WhatsApp signaled they would respond if national laws required scanning of encrypted messages?