Which U.S. states criminalize the mere creation or solicitation of AI‑generated non‑consensual sexual images?
Executive summary
Most U.S. statutes, and the new federal TAKE IT DOWN Act, criminalize publishing or distributing non‑consensual intimate imagery rather than the mere offline creation or private solicitation of AI‑generated sexual images. A subset of states, however, has adopted broader language or standalone laws that come closer to criminalizing creation, or the facilitation of creation, via AI tools [1] [2] [3].
1. Where federal law draws the line: publication, not private creation
The TAKE IT DOWN Act, enacted in May 2025, makes it a federal crime to “knowingly publish” intimate images, including digitally forged “deepfakes,” without consent, and requires covered platforms to adopt notice‑and‑removal procedures by May 19, 2026. That framing signals a federal focus on dissemination and platform responsibility rather than on mere offline generation or private solicitation [1] [2] [4].
2. Most state laws mirror the federal focus on sharing and distribution
Many state statutes historically extended child‑sexual‑abuse and revenge‑porn rules to cover computer‑edited or digitally reproduced material, criminalizing the distribution or disclosure of non‑consensual intimate images while stopping short of criminalizing private creation. Legal trackers and advocacy groups report that dozens of states have enacted such measures, aimed primarily at publication and distribution [3] [5] [6].
3. A smaller group of states goes further: examples in the reporting
Reporting and legal analyses identify several states whose statutes or recent bills explicitly reach conduct beyond mere posting:

- Texas adopted an AI law prohibiting the development or distribution of an AI system whose “sole intent” is producing child pornography or sexually explicit deepfakes, effectively targeting creators and tools [1].
- Pennsylvania has built on earlier statutes to criminalize non‑consensual distribution of AI‑generated sexual imagery and recognizes disclaimers as a defense in limited circumstances [7].
- Washington’s recent law was described as an outlier for criminalizing AI‑generated sexual images involving adults in ways most states had not yet adopted [8] [7].
- California’s penal provisions and recent bills add AI‑generated intimate images to non‑consensual imagery bans and include an explicit statute criminalizing the creation or sharing of sexually explicit deepfakes involving a real person [9] [10].
- Louisiana enacted standalone deepfake and non‑consensual intimate image laws [11].

Together, these sources show that certain states explicitly target creators, tools, or the act of producing synthetic sexual images, in addition to distribution [1] [7] [11].
4. Important nuances: intent, identity, and the word “publish”
Across jurisdictions the burden and reach differ: some statutes require proof of intent to harass, humiliate, or profit; others penalize any non‑consensual sharing; and several child‑protection laws criminalize fabricated CSAM whether or not a real child is depicted. Many statutory texts, however, turn on terms like “publish,” “distribute,” or “use an online service,” leaving a legal gap between private creation or solicitation and public dissemination [12] [3] [1].
5. What the reporting cannot conclusively map: no definitive state checklist in these sources
The provided reporting identifies representative states that have moved beyond publication bans and points to trackers and compilations for granular, state‑by‑state status, but it does not supply an exhaustive, authoritative list of every state that criminalizes mere creation or solicitation absent publication. Specialized trackers (Orrick, Public Citizen) and direct review of state codes are necessary for a definitive inventory [6] [5].
6. Bottom line and where to look next
The prevailing legislative trend criminalizes publication and assigns platform responsibility, while a narrower set of state laws and recent enactments in Texas, Pennsylvania, Washington, California, and Louisiana (as reported) extend liability toward the creation, tooling, or solicitation of AI‑generated non‑consensual sexual images. The precise boundaries vary by statute, and many states still focus on distribution rather than private generation [1] [7] [8] [9] [11]. For anyone needing a definitive list, consult up‑to‑date state trackers and the statutory texts cited by Public Citizen and the AI Law Center to verify current language and effective dates [5] [6].