What exactly is proposed in the revised "chat control" proposal from December 2025?
Executive summary
The revised “Chat Control” (CSAR) proposal in late 2025 removes the Council’s earlier explicit requirement for mandatory, universal scanning of encrypted private communications, but it keeps obligations and language that critics say can recreate a scanning duty in practice, including requirements on providers to take “all appropriate risk mitigation measures” and new age‑verification/child‑protection duties (see Parliament and Wikipedia summaries) [1] [2]. COREPER approval in November 2025 advanced a Danish compromise text into trilogue despite strong warnings from privacy advocates, MEPs and civil‑society groups that the text still risks undermining end‑to‑end encryption and could lead, de facto, to device‑side or client‑side scanning [3] [4].
1. What the late‑2025 text actually changes — and what it keeps
The Danish revision stopped short of an explicit, blanket mandate to scan encrypted messages, replacing mandatory “detection orders” with a model in which scanning would remain voluntary for providers; yet it introduces obligations for providers to implement measures to prevent CSAM and to block or remove content and accounts, which opponents say can be read as effectively requiring scanning on devices or before encryption [5] [2]. COREPER’s narrow approval put that compromise forward to trilogue even though the European Parliament’s position, which emphasizes court oversight and safeguards for encryption, remains at odds with the Council text [3] [4].
2. Technical route and the encryption debate: explicit ban or back‑door?
The Parliament inserted protections for end‑to‑end encryption and sought judicially supervised, targeted measures; Council negotiators under the Danish presidency reframed the approach to avoid an explicit interception order but included broad “risk mitigation” duties that critics (including MEP Patrick Breyer) say could serve as a legal back door to client‑side scanning, a technique that analyzes content on the device before it is encrypted [4] [2]. Privacy bodies and tech groups have repeatedly warned that any regime compelling client‑side scanning undermines E2EE and creates security vulnerabilities [4] [6].
3. Where child protection meets age verification and exclusion risks
The revised text reportedly contains provisions tied to preventing minors’ exposure to CSAM that some reporting says would require age verification measures and could block under‑16s from using messaging apps — a move privacy advocates argue risks digital exclusion and forces platforms to collect sensitive personal data [7] [5]. Critics say these measures could criminalize minors’ behaviour (for example, non‑consensual or private sexting) and have disproportionate collateral effects; the sources document these as central points of contention rather than settled outcomes [7] [5].
4. Political dynamics: narrow approval, stalled consensus
COREPER’s close vote in late November 2025 advanced the Danish compromise into trilogue negotiations but did not signal broad consensus — several member states and the European Parliament have signalled opposition or reservations, and earlier presidencies failed to secure qualified majorities [3] [8]. Multiple sources note the proposal has repeatedly stalled under previous presidencies and faces organised pushback from civil society, tech companies and some national governments [8] [2].
5. Evidence, effectiveness and false‑positive risks
Parliamentary questions and civil‑society critics underline that the technical basis for automated scanning systems produces high false‑positive rates and that effectiveness at protecting children when deployed at scale has not been proven; these criticisms question whether the trade‑offs to privacy and encryption can be justified by demonstrated gains [1] [4]. The EDPS/EDPB and security experts have warned the proposal could lead to “de facto generalized and indiscriminate scanning” [9].
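The false‑positive concern is, at bottom, base‑rate arithmetic: when the content being searched for is extremely rare, even a highly accurate classifier flags far more innocent material than real matches. The sketch below illustrates this with assumed numbers (the message volume, prevalence, and error rates are illustrative assumptions, not figures from the proposal or the cited sources):

```python
# Illustrative base-rate arithmetic for automated scanning at scale.
# All numbers below are assumptions for the sake of the example, not sourced figures.

daily_messages = 10_000_000_000    # assumed messages scanned per day across the EU
prevalence = 1e-7                  # assumed fraction of messages that are actually CSAM
false_positive_rate = 0.001        # assumed classifier false-positive rate (0.1%)
true_positive_rate = 0.9           # assumed classifier recall

# Expected daily counts of correct and incorrect flags.
true_hits = daily_messages * prevalence * true_positive_rate
false_hits = daily_messages * (1 - prevalence) * false_positive_rate

# Precision: of everything flagged, how much is actually CSAM?
precision = true_hits / (true_hits + false_hits)

print(f"flagged per day: {true_hits + false_hits:,.0f}")
print(f"of which correct: {true_hits:,.0f} (precision ≈ {precision:.4%})")
```

Under these assumed parameters the system would flag roughly ten million messages a day, of which only about 900 are true matches, a precision of around 0.009%; every other flagged message belongs to an innocent user. This is the structure of the critics' argument: at communication‑network scale, even a small per‑message error rate overwhelms the rare true positives.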
6. Competing perspectives and what sources emphasise
Supporters frame the Danish compromise as a pragmatic step to empower providers and states to better detect CSAM while avoiding an explicit break of encryption [3]. Opponents — including MEPs, privacy NGOs and many technologists — argue the wording still permits or compels scanning, undermines encryption, risks mass surveillance and would have chilling effects on lawful speech and anonymity [4] [6] [9]. Reporting differs on whether the compromise meaningfully protected encryption or merely restated the same obligation in new language [5] [3].
7. Bottom line and open questions
The late‑2025 revision narrowed the Council’s explicit mandate for blanket scanning but left legally ambiguous duties that critics say will recreate scanning in practice; COREPER pushed the text forward despite clear disagreement from Parliament and civil‑society actors [3] [4]. Available sources do not mention final trilogue outcomes or a definitive legislative text that resolves the central technical and rights‑based disputes; those are the next pivotal facts to watch [3] [2].