Child sexual abuse material detection

The process of detecting and removing child sexual abuse material (CSAM) from online platforms using image hashing.

Fact-Checks

5 results
Jan 28, 2026
Most Viewed

How are online platforms obligated to respond when users are reported for CSAM?

Online platforms in the United States are legally required to report apparent child sexual abuse material (CSAM) to the CyberTipline once they obtain actual knowledge, and recent legislation has expanded what must be r...

Jan 31, 2026
Most Viewed

What specific technical models for client‑side scanning (CSS) have been proposed and what are their documented security risks?

Client‑side scanning (CSS) schemes proposed in recent years fall into a small set of technical models—app‑level hash matching, OS‑level scanning, perceptual‑hash plus ML classifiers, and “narrowed” or voluntary implementations—...
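To make the first of those models concrete, here is a minimal sketch of app‑level perceptual hash matching in Python. It uses a simple average hash compared by Hamming distance against an on‑device blocklist; the blocklist, threshold, and function names are hypothetical, and deployed proposals rely on purpose‑built perceptual hashes such as PhotoDNA, PDQ, or NeuralHash rather than anything this simple.

```python
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: downscale to an 8x8 grayscale thumbnail and
    set one bit per pixel depending on whether it is above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical on-device blocklist (illustrative values only; real lists are
# curated by clearinghouses and are not public).
BLOCKLIST = {0x0F0F0F0F0F0F0F0F}


def flag_before_upload(path: str, threshold: int = 10) -> bool:
    """App-level check: flag the image if its hash is within `threshold`
    bits of any blocklisted hash."""
    h = average_hash(path)
    return any(hamming(h, known) <= threshold for known in BLOCKLIST)
```

The Hamming-distance threshold is what lets a perceptual scheme tolerate re-encoding or resizing, and it is also the source of the false-positive and evasion risks the fact-check discusses.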

Feb 4, 2026

When an NCMEC report is filed on an image, is the original image shared? If not, how does law enforcement obtain the image?

When a report is filed with NCMEC’s CyberTipline or entered into its hash‑sharing systems, the organization’s public documentation and partner guidance show that what is routinely distributed to online pla...

Jan 18, 2026

How do automated hash‑matching systems work and what are their limits in identifying CSAM?

Automated hash‑matching systems detect known child sexual abuse material (CSAM) by converting media into compact "hashes" and comparing them to databases of verified CSAM hashes, enabling fast, scalab...
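As a minimal sketch of the matching step that excerpt describes, assuming a simple exact-match design: a platform computes a digest of each uploaded file and looks it up in a set of hashes of previously verified material. The hash set below is a placeholder, and production systems typically add perceptual hashes so that re-encoded or resized copies still match, which is where the limits discussed in the fact-check arise.

```python
import hashlib

# Placeholder hash set; real lists of verified hashes are distributed to
# platforms by clearinghouses and are not public. The value below is just
# the SHA-256 digest of an empty file.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def sha256_of_file(path: str) -> str:
    """Stream the file through SHA-256 to avoid loading it all into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known(path: str) -> bool:
    """Exact match only: changing a single byte produces a different digest,
    which is why platforms also use perceptual hashes for near-duplicates."""
    return sha256_of_file(path) in KNOWN_HASHES
```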

Jan 14, 2026

How have deepfakes been used in medical misinformation campaigns and what forensic tools detect them?

Deepfakes have moved from novelty to a weapon in health disinformation campaigns, impersonating clinicians, fabricating endorsements for supplements, and even altering medical images—tactics documente...