Does EU Chat Control require platforms to retain deleted or deactivated user content for scanning?
Executive summary
EU-level “Chat Control” debates in 2025 shifted away from a blanket, mandatory requirement to scan encrypted messages; the Danish Presidency’s late-2025 compromise makes scanning voluntary for Member States and drops the requirement to break end-to-end encryption [1] [2]. On the headline question, the available sources do not identify an explicit EU rule requiring platforms to retain deleted or deactivated user content for scanning. Civil-liberties groups, MEPs and technical experts nonetheless continue to warn that the draft still carries record-keeping and mass-scanning risks and that the proposal’s specifics remain contested [3] [4] [5].
1. What supporters proposed: mandatory scanning and broad retention fears
Early versions of the Commission’s Chat Control/CSAM regulation sought mandatory, system-level detection measures that civil society and some MEPs described as “widespread scanning of encrypted messages” and likened to mass surveillance and record-keeping [6] [3]. Opponents repeatedly highlighted that the Commission at times contemplated client-side or provider-side detection, that is, scanning performed on the user’s device or the provider’s servers. That approach would change how private communications are handled, and it raised alarms about normalising content scanning and about potential requirements to retain evidence or metadata [7] [5].
2. The Council compromise: voluntary scanning, not mandatory retention of deleted content
In late November 2025 the Danish EU Council Presidency presented a compromise that moved the Council position away from a mandatory scanning regime toward a voluntary framework in which individual Member States decide whether to implement scanning measures. The text therefore no longer forces platforms at EU level to break encryption or to scan all private messages universally [1] [2]. Reporting and Council materials describe this as a “softer” model that avoids an explicit requirement to break end-to-end encryption, although it leaves national approaches open [7] [2].
3. Do the available texts require platforms to retain deleted or deactivated content?
Available sources in the provided set do not cite an explicit, binding EU rule forcing platforms to retain deleted or deactivated user content solely for scanning. The debate documented in European Parliament questions and civil-society commentary centres on mass scanning, record-keeping concerns and the risk of surveillance, not on any specific retention clause in the Council compromise text [3] [4] [5]. The sources show worries that obligations could create de facto retention or record-keeping pressures, but they do not point to a concrete legislative provision mandating retention of deleted or deactivated content across the EU [3] [5].
4. Where the retention worry comes from: implicit pressures and national options
Opponents’ warnings stem from two dynamics. First, detection regimes (especially client-side or provider-side scans) create technical and legal incentives to store flagged material or metadata for investigation, effectively producing retention in practice. Second, the Council compromise’s shift to “voluntary” scanning for Member States leaves room for national laws to require different procedures, including retention, even if the EU text itself is presented as softer [7] [1] [2]. Thus, the retention concern is less about a single explicit EU mandate (none appears in current reporting) and more about indirect or national obligations that could follow [5] [6].
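To make the first dynamic concrete, the sketch below shows, in deliberately simplified Python, how a hypothetical client-side detection hook tends to persist data: once a message is flagged against a hash list, the flag is only actionable if content and metadata are stored somewhere, which is the de facto retention the sources warn about. Every name here (KNOWN_MATCH_HASHES, on_outgoing_message, evidence_store) is invented for illustration; no actual Chat Control mechanism, hash scheme, or platform API is implied.

```python
# Illustrative sketch only: a hypothetical client-side detection hook.
# All names and the exact-hash matching scheme are invented for
# illustration; no real Chat Control mechanism or platform API is implied.
import hashlib
import time

# Hypothetical list of digests a provider might push to the client.
KNOWN_MATCH_HASHES: set[str] = {"0" * 64}  # placeholder digest

def on_outgoing_message(content: bytes, sender_id: str,
                        evidence_store: list[dict]) -> bool:
    """Scan a message before encryption; return True if it may be sent."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in KNOWN_MATCH_HASHES:
        # A flag is only useful to investigators if something is kept,
        # so the detection step itself creates retention pressure: this
        # record survives even if the user later deletes the message or
        # deactivates the account.
        evidence_store.append({
            "sender": sender_id,
            "digest": digest,
            "timestamp": time.time(),
            "content": content.hex(),
        })
        return False  # block or report the flagged message
    return True
```

The design point is that retention here arises from the evidence-handling step, not from any clause saying “retain deleted content”; under the compromise, national implementing rules would decide how long such a store persists and who can access it.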
5. Competing viewpoints and who’s sounding the alarm
Digital-rights groups (for example the EFF and Germany’s Chaos Computer Club) and many MEPs frame the proposal as enabling mass surveillance and record-keeping that would contravene European Court of Justice precedents and fundamental-rights protections [8] [5] [6]. The Council Presidency and some member-state negotiators argue the compromise balances child-protection goals with encryption safeguards by making scanning optional and avoiding an outright requirement to break encryption [1] [2]. Both positions are present in the sources: civil society stresses long-term risks and precedents, while the Council texts describe a narrower, voluntary approach.
6. Practical implications for users and platforms today
Given current reporting, platforms are not yet uniformly compelled at EU level to retain deleted or deactivated content for mandated scanning. However, the political compromise leaves legal uncertainty, and national divergence is likely, which could produce retention obligations in certain Member States or through implementing rules [1] [2] [7]. Tech companies and privacy advocates already warn that any scanning regime, even a voluntary or conditional one, can create market pressures and technical workarounds that erode end-to-end protections over time [6] [7].
7. Bottom line and what to watch next
The most concrete recent change in the record is the Danish Presidency’s move to voluntary scanning in the Council compromise [1] [2]. Available sources do not show a final EU regulation that explicitly forces platforms to keep deleted or deactivated user content for scanning; they do document sustained fears that EU-level or national follow-up measures, implementation practices, or market pressures could lead to effective retention or record-keeping in practice [3] [5] [7]. Watch the coming trilogue negotiations and final drafting: that is where specific retention clauses or national opt-ins could be written into law or left out [9] [10].