How does the EU chat control regime define ‘existing data’ and what retention limits apply?

Checked on December 8, 2025
Disclaimer: Factually can make mistakes. Please verify important information or breaking news.

Executive summary

The EU’s so‑called “Chat Control” or Child Sexual Abuse Regulation (CSAR) has oscillated between mandatory, interception‑style scanning and weaker “voluntary” or consent‑based models; recent Council negotiation outcomes removed the explicit mandatory interception/scanning requirement while leaving mechanisms that critics say could still pressure providers to scan communications (see the Council compromise and EFF reporting) [1]. Parliamentary questions, civil‑liberties groups and tech trade sources document continuing disputes over whether the proposal forces client‑side scanning, metadata retention or new databases. The texts and summaries cited here contain no single unified statutory definition of “existing data” and no clear universal retention limit; the available sources do not mention either one applying across the regime [2] [3] [4] [1].

1. What the controversy is about: mandatory scanning versus compromise

The fiercest debate has been whether the EU law would require providers to break or weaken end‑to‑end encryption by mandating pre‑encryption/client‑side scanning of private messages. EFF reports that the most controversial element, a forced requirement to scan encrypted messages, was removed from the Council position, though the Council text still codifies a political stance that alarms digital‑rights groups [1]. Opponents (MEPs, EDRi, tech firms) call the regime “Chat Control” and argue it risks mass surveillance; supporters frame it as necessary to combat child sexual abuse material [5] [3].

2. Where “existing data” shows up in reporting — but not a single definition

Multiple stakeholder briefings and commentaries discuss “existing data” in relation to how providers must cooperate with authorities or preserve evidence, but none of the materials in this set quotes a legislative clause that defines the term across the proposal. Parliamentary questions and activists raise concerns about record‑keeping and databases, and some summaries mention retention and exchange of data through a proposed EU centre, but none of the linked summaries reproduces the statutory text defining “existing data” [2] [3] [6] [7]. In short, the available sources do not provide a unified legal definition of “existing data” in the CSAR text as presented here [2] [3].

3. Retention limits: contested, unclear and politically sensitive

Reporting and NGO analysis highlight fears of new record‑keeping obligations and databases to facilitate detection and law‑enforcement access, but the sources provided do not specify explicit numeric retention periods or a single retention ceiling applying to all providers under the proposal [7] [8] [4]. News coverage notes core shifts in October–November 2025 (removal of mandatory scans) and mentions “databases” and coordination mechanisms, yet the publicly cited summaries do not reproduce concrete retention limits; available sources therefore do not mention a specific retention limit in current reporting [8] [7].

4. Competing perspectives and implicit agendas

Civil‑liberties groups (EDRi, Fight Chat Control, Patrick Breyer) consistently frame the proposal as a disguised mass‑surveillance and encryption‑weakening agenda; they stress that any client‑side or provider‑side scanning undermines encryption and creates security risks [9] [4] [5]. EU Council and some Member State summaries argue the compromise balances child protection and privacy, highlighting that the most controversial mandatory‑scan language was removed; critics call that political sleight‑of‑hand because other clauses or “risk mitigation” duties could indirectly compel scanning [1] [6] [4]. These competing framings reflect different institutional incentives: child‑safety advocacy and law‑enforcement pressure on one side, privacy, security and market concerns on the other [1] [3].

5. What to watch next — text, amendments, and COREPER votes

The final legal effects depend on the exact legislative text and any implementing acts. COREPER and Council votes in late 2025, together with subsequent Parliament scrutiny, will determine whether the compromise stands or is amended; reporting records a close COREPER approval and intense NGO scrutiny, suggesting that concrete definitions and retention limits may appear only in the final consolidated text or delegated acts [8] [1]. Observers should monitor official EU legislative documents (COM proposals, Council compromise texts, Parliamentary amendments) for clauses on “existing data,” storage obligations, and precise retention timelines [2] [6].

Limitations: this analysis relies only on the provided documents; none of them reproduces the exact statutory wording that defines “existing data” or prescribes explicit retention periods, so this piece does not assert the existence of a specific legal definition or numeric retention limit beyond what those sources report [2] [8] [1].

Want to dive deeper?
What types of communications are excluded from the EU chat control 'existing data' definition?
How do member states interpret retention limits for previously stored private messages under the chat control rules?
What legal tests determine when data becomes 'existing' versus 'newly generated' under EU law?
How does the EU chat control regime interact with national data retention and deletion obligations?
What remedies can users pursue if their 'existing data' is retained beyond prescribed limits?