Reject EU Chat Control: reject both mandatory and voluntary chat control
Executive summary
EU governments agreed a new negotiating position on the Child Sexual Abuse Regulation (CSAR), dubbed "Chat Control", on 26 November 2025. The position removes mandatory, blanket detection orders in favour of a framework that makes scanning "voluntary" for providers or subjects them to risk‑mitigation requirements [1] [2]. Privacy groups, scientists and some tech firms call the compromise a dangerous back door that still legitimises mass scanning via "voluntary" measures or risk‑mitigation obligations; supporters argue it keeps pressure on providers to fight child sexual abuse online [3] [4] [5].
1. What changed in the Council’s deal — mandatory scanning out, voluntary/mitigation in
After years of stalled talks, EU ambassadors adopted a common negotiating position on 26 November that drops the earlier language mandating detection orders and instead strengthens providers' obligations to apply mitigation measures or to opt into voluntary scanning schemes; the text lets national authorities require removal or blocking of content and creates an EU Centre to coordinate responses [2] [6] [4].
2. Why critics say “voluntary” is a wolf in sheep’s clothing
Digital‑rights groups, researchers and campaigners argue the new wording still risks de facto mass scanning: companies facing fines, market exclusion or enforcement pressure may adopt client‑side scanning or other technical measures to comply. Critics say nominally voluntary frameworks can become coercive in practice and so still undermine end‑to‑end encryption and anonymity [7] [8] [9].
3. Security and technical objections from scientists and cryptographers
More than 600 cryptography and security researchers warned that the measure endangers digital security and privacy: client‑side scanning introduces new attack surfaces, produces false positives that can harm innocent users, and weakens protections relied upon by journalists, activists and victims [10] [11] [12].
4. Political split inside the EU — Council vs Parliament vs member states
The Council’s position differs sharply from the European Parliament’s 2023 stance, which sought to exclude encrypted communications from detection orders. Member states remain divided: reporting shows close votes, and some governments (notably Germany at earlier stages) have opposed mandatory scanning, so the final outcome depends on trilogue negotiations [2] [13] [14].
5. The “legal forest” — fundamental rights and court precedent
Legal experts point to European Court of Justice rulings holding that indiscriminate content surveillance can violate the essence of privacy rights. Organisations that frame Chat Control as incompatible with EU fundamental rights stress that the draft still contains elements risking such a clash, even if the route to surveillance is now less explicit [11] [13].
6. What proponents say — protecting children and improving enforcement
Supporters within the Council and some industry groups argue that the regulation is aimed at preventing child sexual abuse online and that a compromise is needed so platforms and authorities can act more effectively, including setting up an EU Centre to help with reporting and takedowns [6] [4].
7. Stakes for companies and users — leave, litigate, or comply?
Encrypted‑service providers have issued stark warnings: some say they would exit the EU market rather than implement scanning, while trade groups urge negotiators to protect encryption and build workable safeguards. Whether firms litigate, withdraw or build mitigations will shape the regulation's practical impact [15] [16] [4].
8. What to watch next — trilogues, Parliament, and national pressure
The Council mandate now moves to trilogue talks with the European Parliament and Commission, where the Parliament's more restrictive position could force changes; timelines point to extended negotiations into 2026 and continued lobbying from civil society, scientists and industry [2] [5].
Limitations and competing viewpoints
Available sources document the Council’s 26 November position and the arguments on both sides, but they do not provide the final trilogue text or the outcomes of future votes; those remain to be negotiated [6] [2]. Sources disagree on the degree of risk: the Council and some industry statements frame the change as a responsible compromise to protect children [6] [4], while privacy advocates and scientists call it a legal and technical Trojan horse that still enables mass surveillance [7] [10] [17]. Which assessment you accept depends on which harms you prioritise, child‑safety enforcement gaps or systemic privacy and security risks; follow trilogue developments and national parliamentary debates closely [2] [14].