EU Chat Control
Executive summary
EU governments have negotiated a contentious package known as the Child Sexual Abuse Regulation (nicknamed “Chat Control”) that aims to curb online child sexual abuse material (CSAM) but has been repeatedly criticised for risking mass scanning of private messages and weakening end‑to‑end encryption [1] [2]. In late November 2025 member states struck a political compromise that removed a mandatory encryption‑breaking clause and made scanning “voluntary” for states or providers, a change privacy groups say may still enable broad surveillance through pressure on companies [3] [4].
1. What “Chat Control” is and how it evolved
The proposal, formally the Regulation to Prevent and Combat Child Sexual Abuse (CSAR), was introduced in 2022 to force detection and removal of CSAM; critics dubbed it “Chat Control” because early drafts would have required providers to scan private communications, including encrypted messages, or to implement technical measures that undermine encryption [2] [5]. The file has been renegotiated under several rotating EU Council presidencies, stalled repeatedly, and reappeared in 2025 with fresh compromise language from the Danish presidency [6] [7].
2. The November 2025 political compromise and what changed
On 26 November 2025 EU member states announced a deal that, according to reporting, dropped a hard mandate forcing apps like Signal and WhatsApp to perform client‑side scanning of all end‑to‑end encrypted chats; the Council text instead made scanning effectively “voluntary,” a semantic shift that unlocked agreement among several countries [3] [8]. Tech and privacy observers interpret “voluntary” as a potential backdoor: companies may face commercial or regulatory pressure to implement scanning to avoid fines or access blocks, even without an explicit EU mandate [4] [9].
3. Privacy and security objections from experts and advocates
Digital‑rights groups, the European Data Protection Board and many technologists have warned that client‑side or mandatory scanning undermines encryption and creates systemic risks: automated detectors produce false positives, can expose sensitive populations, and set a precedent for state access to private communications [5] [10]. Advocacy groups argue the regime could normalise mass monitoring and incentivise surveillance architectures that persist beyond the temporary measures critics call “Chat Control 1.0” [10] [9].
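To make the false‑positive concern concrete, here is a minimal sketch of the kind of perceptual‑hash matching critics have in mind. Everything here is hypothetical: the hashes, the threshold, and the function names are invented for illustration, and real detection systems (such as PhotoDNA‑style matchers) are proprietary and far more sophisticated. The point the sketch shows is structural: fuzzy matching needs a non‑zero distance threshold to catch re‑encoded copies, and that same threshold is what lets unrelated content be flagged.

```python
# Illustrative sketch only: a toy Hamming-distance matcher of the kind used in
# perceptual-hash detectors. All hashes and the threshold are invented here.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash: int, blocklist: set[int], threshold: int = 10) -> bool:
    """Flag an image if its hash is within `threshold` bits of any listed hash.

    A non-zero threshold is needed to catch resized or re-encoded copies,
    but it is also what makes false positives possible: visually unrelated
    images can land inside the same Hamming radius.
    """
    return any(hamming_distance(image_hash, h) <= threshold for h in blocklist)

# Hypothetical blocklist of known-bad hashes (arbitrary 64-bit values).
blocklist = {0x9F3A_22C1_07E4_5D18, 0x0B77_6E02_C9A1_FF40}

# An innocent image whose hash happens to fall 8 bits from a listed hash
# is flagged even though the underlying pictures are unrelated.
innocent_hash = 0x9F3A_22C1_07E4_5D18 ^ 0b1111_1111  # 8 bits flipped
print(matches_blocklist(innocent_hash, blocklist))    # True: a false positive
```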
4. Government and child‑protection perspectives
Member states and child‑protection advocates argue that stronger rules are needed because online platforms have enabled widespread distribution of CSAM, and that new regulatory tools are required to detect, remove and investigate abuse; hence the Council’s renewed push to finalise the law after years of stalled negotiation [3] [1]. Proponents frame the compromise as balancing child safety with technical and legal safeguards by removing an explicit encryption‑breaking obligation [3].
5. What companies say and market consequences
Signal’s leadership warned publicly that if required to implement backdoors or client‑side scanning it would exit the EU market rather than compromise end‑to‑end encryption, underscoring a real commercial choice for privacy‑first providers [11]. Larger, U.S.‑based platforms have historically argued against technical obligations that would undermine encryption, citing security trade‑offs and feasibility concerns [12] [5].
6. The devil in semantics: “mandatory” vs “voluntary” scanning
Multiple sources highlight that the shift from mandatory to “voluntary” scanning in Council texts may be more a tactical change than a substantive safeguard: critics say “voluntary” can be functionally mandatory if non‑compliant services face severe penalties, market exclusion or regulatory hurdles [4] [9]. Others note the Council’s move removed the most controversial clause but left authorities broad powers to encourage or require provider action under different legal instruments [3] [2].
7. Political and procedural next steps and uncertainties
The dossier has repeatedly passed between the Commission, Council and Parliament; even after the Council’s political agreement in November 2025, the regulation must still be reconciled with the European Parliament’s position in trilogue negotiations and could be altered further, with Parliament and civil society continuing to push for narrower, targeted measures [1] [10]. Available sources do not detail the final text’s exact operational rules for detection orders or penalties.
Bottom line: competing priorities collide
EU capitals frame the regulation as necessary to stop child abuse online [3]; privacy advocates say it risks creating a permanent surveillance infrastructure and undermining encryption [10] [9]. The November 2025 compromise removed the explicit encryption‑breaking mandate but left a contested “voluntary” mechanism that critics say could achieve similar ends through coercion [3] [4]. The final legal text and how courts and markets react will determine whether the balance tips toward child protection tools that respect privacy or toward practices that erode encrypted communication across Europe [2] [12].