What specific data retention obligations do UK and EU age‑verification laws impose on platforms like OnlyFans?

Checked on January 16, 2026

Executive summary

UK and EU age‑verification rules do not mandate a single retention schedule. Instead, they consistently require strict data minimisation, purpose limitation and the avoidance of unnecessary storage: platforms must design age checks that confirm age without creating long‑lived identity stores, and regulators (Ofcom and the ICO in the UK; the EDPB and the Commission in the EU) have urged or proposed technical models that avoid retaining personally identifiable data after verification [1] [2] [3]. Enforcement complements these privacy constraints with heavy fines for non‑compliance, while industry providers claim “no retention” practices; those claims have become focal points for privacy debate and watchdog scrutiny [4] [1].

1. Legal framework and the core obligation: minimise what you keep

Both UK and EU frameworks subordinate age‑verification to existing data‑protection principles: platforms must adopt privacy‑preserving, data‑minimising methods and retain only what is strictly necessary to prove age or to meet demonstrable compliance requirements, rather than keeping identity data by default [1] [3]. The EU’s work on a harmonised, privacy‑preserving age verification approach and the EDPB guidance emphasize that age‑assurance must be proportionate, auditable and minimise personal data collection [2] [3]. In the UK, Ofcom’s codes and ICO guidance require technical accuracy while warning platforms against storing unnecessary personal data [5] [1].

2. Practical consequence for platforms like OnlyFans: no free‑for‑all data lakes

For subscription adult platforms that rely on age checks, the net result is that they cannot lawfully build a permanent identity repository as a byproduct of AV. The regulatory design prefers techniques that confirm “over‑18” status without retaining the underlying ID or biometric templates, and Ofcom/ICO guidance recommends design choices that purge, or avoid storing altogether, identifying information beyond what the law requires [1] [6]. Industry vendors likewise pitch reusable tokens or selective‑disclosure systems so platforms can rely on a proof of age without holding the underlying ID [7] [3].
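To make the token model concrete, here is a minimal, purely illustrative sketch of the flow the vendors describe: a verifier checks an ID once and issues a short‑lived signed attestation carrying only an “over‑18” claim, so the platform can validate the attestation without ever seeing or storing the document. All names and the HMAC key arrangement are assumptions for the sketch; real deployments would use asymmetric signatures so the platform holds only a public key.

```python
import hashlib
import hmac
import json
import secrets
import time

# Hypothetical sketch of a selective-disclosure age token.
# The verifier signs a minimal claim; the platform stores no ID data.

VERIFIER_KEY = secrets.token_bytes(32)  # held by the AV provider only


def issue_age_token(over_18: bool, ttl_seconds: int = 3600) -> dict:
    """Issue a minimal attestation: no name, DOB, or document data."""
    claim = {
        "over_18": over_18,
        "exp": int(time.time()) + ttl_seconds,
        "nonce": secrets.token_hex(8),  # prevents trivial replay correlation
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}


def platform_accepts(token: dict) -> bool:
    """Platform-side check: verify signature and expiry, retain nothing else."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False
    return bool(token["claim"]["over_18"]) and token["claim"]["exp"] > time.time()
```

The design point is that the signed claim is the only artefact crossing the boundary: tampering with the claim invalidates the signature, and the platform's retained state can be limited to a boolean plus an expiry.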

3. What “no retention” or “minimal retention” means in practice — ambiguity remains

Although age‑verification firms and some reporting assert that providers “don’t retain data,” real‑world implementations vary. Some systems issue validation tokens after an initial ID or selfie check and claim not to keep the raw ID images; others log verification timestamps or hashed identifiers for audit and fraud‑prevention purposes. For these grey areas, EU guidance demands proportionality and auditability but leaves detailed retention rules to data‑protection law and sectoral codes [4] [3] [2]. Civil liberties groups such as the Open Rights Group warn that leaving technical and retention choices to vendors risks inconsistent privacy protections and creates regulatory gaps [8].
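The “hashed identifiers for audit” pattern mentioned above can itself be sketched. Under the assumption (not drawn from the sources) that the log only needs to show *that* a verification happened, a keyed hash pseudonymises the session and a purge routine enforces the retention window; every name here is hypothetical.

```python
import hashlib
import hmac
import secrets
import time

# Hypothetical sketch of a minimised audit log: it records that a
# verification occurred, keeping only a keyed hash and a timestamp
# rather than any raw identifier or ID imagery.

LOG_KEY = secrets.token_bytes(32)  # in practice, rotated per retention period

audit_log: list[dict] = []


def log_verification(session_id: str, outcome: str) -> None:
    """Append a pseudonymised entry; the raw session_id is never stored."""
    pseudonym = hmac.new(LOG_KEY, session_id.encode(), hashlib.sha256).hexdigest()
    audit_log.append({"who": pseudonym, "outcome": outcome, "ts": int(time.time())})


def purge_older_than(max_age_seconds: int) -> None:
    """Data-minimisation step: drop entries past the retention window."""
    cutoff = time.time() - max_age_seconds
    audit_log[:] = [entry for entry in audit_log if entry["ts"] >= cutoff]
```

Whether even this keyed hash counts as personal data (it is still linkable while the key exists) is exactly the kind of proportionality question the EU guidance leaves to data‑protection assessments.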

4. Enforcement, penalties and operational constraints

Regulators have teeth: Ofcom can fine platforms up to £18 million or 10% of global turnover, whichever is greater, for breaches of Online Safety Act duties, and the DSA/EDPB regime ties age‑verification obligations to broader compliance and risk‑mitigation duties for very large platforms, creating a strong incentive to adopt conservative retention policies [4] [3]. At the same time, EDPB and Commission blueprints for privacy‑preserving tools (such as the EU “mini‑wallet” temporary AV app) show that regulators prefer technical architectures that minimise data retention over architectures that rely on long retention to satisfy audits [3] [2].

5. Areas of dispute and limits of available reporting

Disputes center on how narrowly “necessary” should be defined. Privacy advocates argue for near‑zero retention of identifying data, while platforms and AV vendors stress the need for limited logs to combat fraud and meet audit obligations. Existing sources document both regulator preferences for minimisation and vendor claims of no retention, but they do not yield a single, binding retention timetable applicable to every platform or implementation [1] [4] [8]. Reporting also shows that EU pilot projects and UK codes aim to standardise privacy‑preserving designs, but final, uniform retention rules, especially for third‑party processors versus platform controllers, remain subject to technical guidance and case‑by‑case data‑protection assessments [2] [3].

Want to dive deeper?
How do EU Digital Services Act audits assess platform logs from age‑verification systems?
What technical architectures (zero‑knowledge proofs, tokens, mini‑wallets) let platforms verify age without storing IDs?
How have privacy regulators (ICO, EDPB) enforced retention breaches in age‑verification cases?