What federal and state prosecutions of AI‑generated CSAM occurred between 2023 and 2025, and where can their court records be found?
Executive summary
Between 2023 and 2025, a small but consequential set of prosecutions tied to AI‑generated or AI‑modified child sexual abuse material (CSAM) reached federal and state courts. High‑profile federal matters include a 2023 prosecution referenced as U.S. v. Tatum and a 2025 guilty plea by Justin Ryan Culmo involving thousands of AI‑generated images; reporting also identifies at least two other prosecutions and a contested 2025 district‑court ruling on possession charges. Public records for these matters exist in the courts that handled them and are referenced in government press releases and news reporting, though the available sources do not provide a single consolidated docket list.
1. Known federal prosecutions and pleas: who was charged and what reporting shows
Federal prosecutions tied explicitly to AI‑generated or AI‑modified CSAM that are documented in the reporting include U.S. v. Tatum, cited in law‑industry summaries as an example of prosecutors arguing that AI‑modified imagery based on real victims could be charged under existing statutes, and the February 2025 guilty plea of Justin Ryan Culmo, whose case was publicized by ICE/Homeland Security Investigations; the agency reported that it involved possession of roughly 8,500 AI‑generated images among a larger cache of illicit material. Reporting also notes a November 2023 federal conviction of a Pittsburgh sex offender for possession of modified CSAM depicting child celebrities, and a separate November 2023 Charlotte case in which a child psychiatrist was sentenced after using web‑based AI to alter images of clothed minors into CSAM. Both incidents were described in federal agency alerts and news summaries as prosecutions involving AI‑altered material.
2. State prosecutions and contested rulings: limited but growing enforcement and a pivotal Wisconsin decision
State prosecutions are less comprehensively catalogued in the provided reporting, but the sources note several developments: multiple states have enacted statutes criminalizing AI‑generated CSAM, and state prosecutors are increasingly empowered to bring charges under those laws. Separately, in early 2025 a U.S. district court in Wisconsin dismissed a possession charge tied to AI‑generated CSAM on constitutional grounds while allowing other charges to proceed, and federal prosecutors have appealed that ruling. Reporting frames the decision as potentially pivotal for how private possession and production of AI imagery will be treated going forward.
3. Where the court records and official filings can be located according to reporting
Public‑facing sources identify the cases by defendant and agency, so readers and researchers can locate the court dockets. The ICE press release names Justin Ryan Culmo and summarizes the charges, providing the identifying details necessary to retrieve filings from the responsible court; coverage of the Wisconsin district‑court ruling and subsequent appeal likewise identifies the case and the judge's order, pointing to a federal district‑court docket for the opinion and appeal documents. The federal convictions and sentences reported in agency and news accounts, such as the November 2023 Pittsburgh conviction and the Charlotte sentencing, are cited in government advisories and news articles that name the defendants and sentences, which can be used to find the underlying court records. The sources do not publish consolidated links to every docket.
4. Legal contours, competing narratives, and gaps in public reporting
The reporting shows a legal landscape in flux: prosecutors have used existing federal statutes and obscenity law to bring charges where imagery is photorealistic or adapted from real victims, states have moved to criminalize AI‑created materials, and at least one federal judge has signaled First Amendment limits on possession charges. These conflicting threads will be resolved in appellate courts and future cases. At the same time, the available reporting does not yield a comprehensive, court‑by‑court catalogue of every prosecution from 2023–2025, and legal analyses disagree about whether AI‑only imagery without a real, identifiable child will always be prosecutable; coverage cites both prosecutorial successes and judicial pushback.