How does EU Chat Control treat content from accounts that have been deleted or deactivated on services like Discord?
Executive summary
The available sources describe EU “Chat Control” as a proposal to make platforms scan private communications for child sexual abuse material (CSAM), and they report that lawmakers have repeatedly debated, altered, and sometimes removed the most intrusive elements (for example, mandatory client-side scanning was reportedly removed from the Council position) [1][2]. None of the supplied documents explicitly addresses how the proposal would treat messages or accounts that have been deleted or deactivated on platforms such as Discord; on that specific question, the available sources are silent [3][4].
1. What the Chat Control debate is actually about — a quick primer
The EU debate branded as “Chat Control” centers on a draft regulation to combat CSAM by mandating detection and reporting measures inside private communications; critics say this amounts to mass scanning and endangers encryption and privacy, while proponents frame it as child-protection policy [5][1]. Over several iterations the proposal has been watered down by member-state resistance and political pushback; for example, the most contested requirement, forcing the scanning of end-to-end encrypted messages, was removed from the Council’s position, according to reporting by digital-rights groups [2][1].
2. Legislative flux matters — why the text you ask about may not yet exist
Multiple sources emphasise that the proposal has been rewritten repeatedly, that Council and Parliament positions differ, and that trilogue negotiations were expected to resume in late 2025 or early 2026, so precise operational rules (such as the treatment of deleted accounts) may depend on the final text produced in those negotiations [4][6]. Because the law’s scope and technical obligations shifted across versions, procedural details often remain unspecified in public summaries and advocacy materials [4][5].
3. What the public sources say about the core technical approaches
Reporting and advocacy documents repeatedly flag client-side scanning (analysis on the user’s device before encryption) and server-side scanning as the principal technical mechanisms under discussion; those mechanisms determine when and where content is inspected, retained, or reported [7][6]. But the public summaries do not break down downstream retention, deletion, or account-status edge cases, such as whether a platform must scan content from an account that was deleted before any scan took place, or how deactivated accounts are treated [6][7].
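To make the distinction concrete, here is a minimal, purely illustrative Python sketch of where each approach would inspect content. Nothing in it is drawn from the proposal or from any real detection system: the hash-set blocklist, the function names (`matches_blocklist`, `client_side_send`, `server_side_scan`, `report_to_provider`), and the toy “encryption” are hypothetical stand-ins for whatever technology a provider might actually be required to deploy.

```python
import hashlib

# Hypothetical blocklist of "known" content hashes. Real systems use
# perceptual hashing of media, not exact cryptographic hashes of text;
# this is only to show *where* the check runs, not *how* detection works.
KNOWN_HASHES = {hashlib.sha256(b"example-flagged-content").hexdigest()}


def matches_blocklist(payload: bytes) -> bool:
    """Stand-in detector: exact-hash lookup against the blocklist."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_HASHES


def report_to_provider(payload: bytes) -> None:
    # Placeholder for whatever reporting duty a final text might impose.
    print("match reported:", hashlib.sha256(payload).hexdigest()[:12])


def client_side_send(plaintext: bytes, encrypt):
    """Client-side scanning: the check runs on the user's device
    *before* the message is end-to-end encrypted."""
    if matches_blocklist(plaintext):
        report_to_provider(plaintext)   # detection happens pre-encryption
        return None
    return encrypt(plaintext)           # the server only ever sees ciphertext


def server_side_scan(plaintext: bytes) -> bool:
    """Server-side scanning: only possible where the provider can read
    the content, i.e. the service is not end-to-end encrypted."""
    if matches_blocklist(plaintext):
        report_to_provider(plaintext)
        return True
    return False


if __name__ == "__main__":
    toy_encrypt = lambda b: b[::-1]     # toy stand-in, not real cryptography
    client_side_send(b"example-flagged-content", toy_encrypt)
    server_side_scan(b"harmless message")
```

The sketch also shows why the point of inspection is what the encryption dispute turns on: once a message is end-to-end encrypted, only a client-side check can see the plaintext, which is precisely the mechanism the sources say was the most contested [7][6].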
4. Where the silence is significant — deleted/deactivated accounts are not discussed
On the specific point you asked about, how the regulation would treat content from deleted or deactivated accounts on services like Discord, the sources do not provide explicit guidance or examples. Advocacy pages, parliamentary questions, and explainers focus on whether scanning is allowed or mandatory and on the implications for encryption, not on lifecycle rules for user accounts or post-deletion data flows; therefore “not found in current reporting” is the only accurate statement from the available material [8][5].
5. Why that omission matters in practice
If a law mandates scanning at different architectural points (client, transit, server), the legal duty to scan or retain evidence would interact with platform data-retention and account-deletion policies. Transparency advocates warn that such requirements could compel services to hold or reconstruct material they would otherwise remove, an issue widely discussed in these sources, but none of the provided reports specifies how deleted or deactivated accounts are to be handled under the regulation [5][1].
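As a purely hypothetical illustration of that interaction, the sketch below shows an account-deletion handler that must choose between a platform’s normal erase-on-request behaviour and a preservation duty tied to an unresolved scan report. The `Account` fields, the `retention_required` flag, and the handler itself are invented for this example; the public Chat Control materials specify none of these mechanics.

```python
from dataclasses import dataclass, field


@dataclass
class Account:
    account_id: str
    deactivated: bool = False
    # Hypothetical: unresolved matches produced by a mandated scan.
    pending_reports: list[str] = field(default_factory=list)


def handle_deletion_request(account: Account, retention_required: bool) -> str:
    """Hypothetical deletion handler showing where a scanning or reporting
    duty could collide with a platform's 'delete on request' policy.
    Nothing here reflects actual Chat Control text, which is silent on this."""
    if retention_required and account.pending_reports:
        # A reporting obligation could force the provider to preserve
        # evidence tied to the account instead of erasing it immediately.
        return "retain: evidence preserved pending report resolution"
    # Default platform behaviour: purge content under its own retention policy.
    return "erase: account content deleted"


if __name__ == "__main__":
    acct = Account("user-123", pending_reports=["match-abc"])
    print(handle_deletion_request(acct, retention_required=True))   # retain path
    print(handle_deletion_request(acct, retention_required=False))  # normal erase
```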
6. Conflicting perspectives and implicit agendas to watch
Digital-rights groups (EDRi, EFF, campaign sites) present the regulation as an existential privacy threat and focus on technical overreach such as client-side scanning; industry and some governments argue that these tools are needed to protect children and that the law can be tailored to respect encryption [5][1]. Be aware that advocacy sites emphasise worst-case technical scenarios to mobilise public opposition, while government summaries emphasise compromise and protection goals; both positions are visible in the supplied sources [2][3].
7. What you can do next to close the gap
To obtain a definitive answer on deleted/deactivated accounts, consult the latest draft text of the regulation or the Commission/Council/Parliament position papers and technical annexes once they are published; the cited materials suggest that precise operational rules of this kind tend to emerge only late in negotiations [4][6]. In the absence of a final text, any definitive operational claim about deleted or deactivated accounts would be unsupported by the available reporting [4][5].