How long does Discord retain archived accounts reported to NCMEC without an investigation?

Checked on December 21, 2025


Executive summary

Discord’s public safety documentation and transparency reports say that accounts tied to child sexual abuse material (CSAM) are immediately disabled and reported to the National Center for Missing & Exploited Children (NCMEC), and that the offending content is removed from the service [1]. However, none of the Discord materials or reporting reviewed here specifies a fixed retention period for “archived” accounts reported to NCMEC that are not followed by an active criminal investigation; Discord has even told French authorities it has no written data-retention policy [2].

1. Discord’s stated workflow for child-safety reports

Discord makes clear that when its Trust & Safety team identifies potential child-safety concerns, the company conducts a review, disables offending accounts across many high‑harm categories, and files CyberTipline reports with NCMEC where appropriate [3] [1]. For CSAM specifically, Discord says it “immediately disable[s] and report[s] the account to NCMEC and remove[s] the content,” an operational rule of rapid removal and reporting upon discovery [1].

2. Public statements that suggest permanence for CSAM offenders

Multiple sources emphasize a zero‑tolerance approach: Discord’s own statements and testimony before the Senate Judiciary Committee say that users who upload CSAM are permanently banned from the service and reported to NCMEC, which then refers cases to law enforcement [4] [1]. Transparency reporting and third‑party coverage document large volumes of account reports to NCMEC and frequent use of PhotoDNA hash matching to flag media [5] [1].
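PhotoDNA itself is proprietary, so the sketch below uses the open-source imagehash Python library purely as an analogue, to illustrate the general technique of perceptual-hash matching against a list of known-flagged media. It is a minimal sketch under stated assumptions, not Discord’s or Microsoft’s implementation; the hash values, threshold, and file path are all hypothetical.

# Illustrative sketch of hash-based media flagging, analogous in spirit
# (not in algorithm) to PhotoDNA. Requires: pip install ImageHash pillow
from PIL import Image
import imagehash

# Hypothetical perceptual hashes standing in for a known-flagged-media list.
KNOWN_HASHES = [imagehash.hex_to_hash("d1d1d1d1d1d1d1d1")]

MAX_HAMMING_DISTANCE = 5  # hypothetical similarity threshold

def is_flagged(image_path: str) -> bool:
    """Return True if the image's perceptual hash falls within the
    Hamming-distance threshold of any known-flagged hash."""
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= MAX_HAMMING_DISTANCE
               for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(is_flagged("uploaded_media.png"))  # hypothetical upload path

In real deployments, matches against curated hash databases are routed to human review before an account is disabled and reported, consistent with the workflow Discord describes [1] [5].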

3. What the sources do not say — the retention gap

None of the reviewed Discord pages or transparency reports specifies how long Discord retains account records or “archives” of accounts after they are disabled and reported, particularly in cases that never progress to a formal criminal investigation. The public materials describe the actions taken (disable, report, remove content) but give no timeline for how long account data is retained after reporting [3] [1]. The absence of a stated retention window is corroborated by a French regulatory finding: during a CNIL probe into GDPR compliance, Discord told authorities it had no written data‑retention policy [2].

4. Regulatory context and why retention transparency matters

Retention practices intersect with legal obligations and oversight. NCMEC accepts reports and refers matters to law enforcement, but it does not itself set public timelines for how long platforms must keep archived data pending investigation [3]. Meanwhile, regulators such as France’s CNIL scrutinize platforms for unspecified retention periods and have fined Discord for GDPR breaches tied to retention and security, indicating that regulators expect documented policies and limits [2]. The public record therefore shows a tension: Discord describes decisive removal and reporting actions for abuse, but it has not publicly committed to a concrete retention timeframe for accounts that are reported but never investigated.

5. Practical takeaway and open questions

On the evidence provided, the direct answer is that there is no publicly stated duration for how long Discord retains archived accounts reported to NCMEC when no investigation follows. Discord documents immediate disabling and reporting for CSAM and asserts permanent bans for those offenders, but it does not publish a retention schedule for archived account data and has previously told regulators it lacked a written retention policy [1] [4] [2]. That gap raises legitimate questions: how long do account records (metadata, backups, or content copies) persist after reporting, what legal holds might extend retention, and does Discord’s internal retention vary by region or category of report? The provided sources do not answer those specifics [3] [2].

Want to dive deeper?
What legal requirements govern how long online platforms must retain user data after reporting CSAM to NCMEC?
Did the CNIL fine against Discord require specific changes to its data‑retention practices, and what were they?
How does NCMEC handle platform reports that do not result in law‑enforcement investigations?