Do I report lolicon and shotacon to NCMEC?

Checked on January 15, 2026

Executive summary

The National Center for Missing & Exploited Children (NCMEC) operates the CyberTipline as the United States' centralized reporting system for suspected online child sexual exploitation and related material, and it accepts reports from the public about images and content that depict or exploit children [1] [2]. If an image or file, whether a photograph, video, or a digital or graphic depiction, appears to sexualize or sexually exploit minors, submitting a report to the CyberTipline or contacting NCMEC's hotline falls within the organization's stated remit [2] [3].

1. What NCMEC says it does and how reports are handled

NCMEC describes itself as the nation's nonprofit clearinghouse for preventing child victimization and helping victims recover. It operates a 24-hour call center and the CyberTipline to collect reports of online enticement, child sexual abuse material, and related offenses, and it reviews those reports and shares them with appropriate law enforcement agencies or Internet Crimes Against Children task forces [4] [1] [5]. The CyberTipline explicitly lists child pornography, online enticement, unsolicited obscene materials sent to a child, and misleading digital images on the internet among its reportable categories [2] [3].

2. Where the public’s reports fit — what the CyberTipline accepts

Both the public and electronic service providers can report suspected online sexual exploitation to the CyberTipline, including images and "misleading words or digital images" on the internet, and NCMEC's resources emphasize that it processes broadly defined exploitative material and forwards it to law enforcement [2] [3]. Congress has recently expanded and clarified the reporting obligations and data-retention rules that govern CyberTipline material, strengthening the system's scope in both law and practice [5].

3. Applying that to “lolicon” and “shotacon” content — the practical test

NCMEC's public guidance focuses on whether material involves the sexual exploitation or sexualization of children and whether it constitutes child sexual abuse material or a related online offense, which suggests that the practical criterion for reporting is whether the content depicts or targets minors for sexual purposes [2] [1]. The reporting system's stated categories explicitly include "misleading words or digital images," a category the organization uses to capture online depictions that may be exploitative or used to groom or victimize children [2].

4. Legal nuance and limits of the reporting guidance in the available sources

The sources supplied outline what NCMEC collects and forwards to law enforcement, but they do not authoritatively define how drawn, animated, or fictional depictions (commonly referred to as "lolicon" or "shotacon") are treated under federal or state criminal law, nor do they state a blanket rule for every form of non-photographic sexualized content [5] [2]. So while the CyberTipline accepts reports of "digital images" and misleading images [2], these documents do not resolve the legal distinction between photographic child sexual abuse material and stylized or fictional depictions; that distinction varies by statute and prosecutorial practice, and it lies beyond what these NCMEC sources establish [5].

5. Clear, practical guidance based on NCMEC’s remit

Given NCMEC's mission and the CyberTipline's stated remit to receive public reports of online sexual exploitation and problematic images, reporting content that appears to sexualize minors, whether to the CyberTipline or to the 24-hour call center, is consistent with the organization's role as the centralized clearinghouse [1] [3] [2]. NCMEC also offers resources for victims and families, including a "Take It Down" program for images, indicating support for those harmed by exploitative content [6] [4]. Where there is uncertainty about the legality or nature of an image, the organization is positioned to review the report and, if appropriate, pass the matter to law enforcement [5] [2].

6. Alternative viewpoints and institutional context

Advocates and some legal scholars debate how fictional sexualized depictions should be treated and how to balance enforcement against free expression; the provided NCMEC materials establish only that the center accepts reports of digital images and forwards suspected child sexual exploitation to investigators, without detailing how every borderline case is adjudicated [5] [2]. The organization's partnerships with law enforcement and electronic service providers reflect an institutional agenda of maximizing detection and reporting of material seen as harmful to children [1] [7].

Want to dive deeper?
How do U.S. federal and state laws differ in treating explicit drawn or animated depictions of minors?
What evidence exists about whether fictional sexualized imagery correlates with contact offenses against children?
How does NCMEC's 'Take It Down' service work and who is eligible to request content removal?