Does the latest draft of the EU's CSAM proposal ("chat control") include ID verification?

Checked on January 15, 2026
Disclaimer: Factually can make mistakes. Please verify important information or breaking news.

Executive summary

The latest available reporting indicates that the EU’s draft Child Sexual Abuse Regulation (the so-called “Chat Control” file) has moved away from breaking end‑to‑end encryption in some versions but still contains provisions requiring age verification or age‑assurance measures in key drafts and Council texts — a requirement described as “mandatory” in multiple news and expert summaries [1] [2] [3]. The legislative package is contested and still evolving, so any claim about the text applies only to the particular draft under discussion and the stage of inter‑institutional negotiation it has reached [4] [5].

1. The regulatory landscape: multiple actors, multiple drafts

The proposal originated with the European Commission in May 2022 as a regulation obliging providers to detect and report child sexual abuse content and related grooming; it envisaged an EU Centre and detection obligations that could cover both known and new CSAM [6] [7]. Since then the Council, the Parliament and successive Presidencies have produced competing compromise texts and positions: some Council drafts narrowed mandatory content scanning, some Parliament reports added other measures, and Presidencies such as Denmark’s have repeatedly revised compromise texts — meaning there is no single “final” draft yet but a string of related versions that differ on scope and safeguards [4] [5].

2. Age verification appears explicitly in several later drafts and critiques

Multiple independent sources and expert commentary identify mandatory age verification or age‑assurance as part of later Council or compromise drafts: an analysis quotes the Council text as “introduc[ing] mandatory age verification for users” in contexts such as downloading or accessing services deemed high‑risk for CSAM, and describes age checks for specific features or apps [1]. Industry and researcher objections similarly state that “the proposal includes mandatory age verification and assessment measures,” arguing these could be evaded and would have privacy and exclusionary consequences [2] [8].

3. Encryption and scanning: rollback on client‑side scanning, but age checks survive

Reporting documents a partial retreat from earlier drafts that would have forced decryption or mandated client‑side scanning of end‑to‑end encrypted content; some later Council positions and the Danish Presidency’s compromise removed compulsory interception of encrypted content while keeping other obligations for providers and introducing age verification in certain cases [1] [3] [9]. Civil‑society and technical experts see that as a trade‑off: encryption‑breaking provisions have been scaled back in some drafts, but mandatory age verification is framed as an alternative lever to protect minors [3] [9].

4. What “mandatory age verification” means in practice — contested and technically fraught

The draft language reported in several assessments suggests age verification would apply when users download “high‑risk” apps or access specific features, and would be framed as an obligation on providers to implement privacy‑preserving age‑assurance systems. Critics counter that such measures can be bypassed, undermine anonymity and exclude minors from services, and that the technology and the proportionality of enforcement remain unresolved [1] [2] [8]. Official documents also frame age assurance as part of a broader risk‑assessment regime for providers and tie it to other safeguards and to the EU Centre’s role, but operational details are left to later implementing rules [6] [5].

5. Bottom line and limits of reporting

Based on the supplied sources, the most recent prominent drafts and Council compromise texts reported by experts and outlets do include mandatory age‑verification/age‑assurance clauses or provisions that would require providers to implement such checks in specified circumstances [1] [2] [3]. However, the package is still under negotiation across the EU institutions, with significant disagreement about scope, technical feasibility and rights safeguards [4] [5], and the supplied material does not include a single consolidated final text adopted by all co‑legislators — therefore the presence, scope and form of “ID verification” in the final law may still change as trilogue talks continue [4].

Want to dive deeper?
What specific Council and Parliament compromise texts contain the age verification clauses in the CSAM proposal, and where can the exact wording be found?
How would different age‑verification technologies (document checks, third‑party age‑services, device‑based attestation) work technically and what are their privacy risks?
Which EU bodies (EDPS, EDPB, national data protection authorities) have issued legal opinions on mandatory age verification in the CSAM regulation and what were their conclusions?