State laws against NCII
Executive summary
State laws against nonconsensual intimate imagery (NCII) form a patchwork: many states criminalize or create civil liability for distributing intimate images without consent, but counts and scopes vary across trackers and legal guides [1][2]. Federal action has recently tightened the landscape—Congress created a federal civil cause of action in 2022 and passed the TAKE IT DOWN Act in 2025, which imposes platform notice-and-removal duties and criminal prohibitions for certain conduct [3][4].
1. How many states have NCII laws — the contradictory counts
Advocacy and civil-rights trackers report different totals because states vary in whether their laws are criminal statutes, civil remedies, or both. EPIC counts 34 states plus D.C. with criminal or civil liability for sharing sexually explicit images without consent, while some law firms and legal guides state that as many as 49 states and D.C. have revenge-porn criminal statutes; the gap reflects differences in definitions and recent enactments [1][2].
2. What the state statutes commonly criminalize and where they differ
State laws typically make it illegal to distribute an image of an identifiable person in a sexual context when the image was expected to remain private and the distributor knew or should have known there was no consent to distribution; penalties range from misdemeanors to felonies depending on elements such as intent to harm, the victim's age, or repeated conduct [5][1]. Some statutes specify that prior consent to the image's creation, or prior consensual sharing, does not establish consent to further distribution, while others add elements such as intent to harass, coerce, or sexually exploit [5].
3. Minors, child-pornography overlap, and special state rules
Where victims are under 18, most states treat the posting of intimate images as child sexual-abuse material and prosecute under strict CSAM statutes rather than “revenge porn” laws, producing uniformly severe penalties; states also differ on whether revenge‑porn statutes apply to images created by AI or deepfakes [2][6].
4. The federal layer: civil remedies, SHIELD, and TAKE IT DOWN
At the federal level, Congress created a private civil cause of action for victims of disclosed intimate images in 2022 as part of the VAWA reauthorization, and a 2019 SHIELD-related amendment criminalized certain image exploitation tied to stalking statutes; both additions gave victims federal pathways in select cases [3][7]. The TAKE IT DOWN Act, passed in 2025, added a nationwide framework that requires covered online platforms to implement a written notice-and-removal process and to remove nonconsensual intimate images or deepfakes within set timeframes, and it criminalizes certain nonconsensual publication conduct [4][8][9].
5. What TAKE IT DOWN actually requires of platforms and users
Under TAKE IT DOWN, covered platforms (sites or apps that host user-generated content or regularly publish NCII) must create an accessible process through which an identifiable person, or an authorized representative, can submit a written notification containing an identification of the depiction, a good-faith statement that it is nonconsensual, and contact information. Platforms must then act "as soon as possible, but not later than 48 hours" after receiving a valid request, and they have until May 19, 2026, to implement the full process [4][9][8].
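The notice requirements and the 48-hour clock described above can be sketched as a small validation model. This is an illustrative sketch only: the class, field, and function names are assumptions for exposition, not terms drawn from the Act's text, and real compliance logic would be far more involved.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# "as soon as possible, but not later than 48 hours" after a valid request
REMOVAL_DEADLINE = timedelta(hours=48)

@dataclass
class TakedownNotice:
    """Hypothetical model of the written notification the Act describes."""
    depiction_identifier: str   # identification of the intimate depiction (e.g., a URL)
    good_faith_statement: bool  # good-faith assertion that the depiction is nonconsensual
    contact_info: str           # how the platform can reach the requester
    submitted_at: datetime      # when the platform received the notice

def is_valid(notice: TakedownNotice) -> bool:
    # A valid notice identifies the depiction, asserts nonconsent in good
    # faith, and includes contact information.
    return (bool(notice.depiction_identifier)
            and notice.good_faith_statement
            and bool(notice.contact_info))

def removal_deadline(notice: TakedownNotice) -> datetime:
    # The outer bound on the platform's response window for a valid notice.
    return notice.submitted_at + REMOVAL_DEADLINE
```

For example, a valid notice received at noon on January 1 would have a removal deadline of noon on January 3; a notice missing any required field would fail validation and not start the clock.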
6. Gaps, state-federal friction, and continuing legislative work
TAKE IT DOWN does not uniformly preempt state laws; many states are extending statutes to cover AI-generated deepfakes or passing standalone laws (e.g., Louisiana), and trackers show ongoing state work on intimate-deepfake bills, creating a layered regulatory map in which a federal baseline coexists with divergent state penalties and remedies [6][10]. Both critics and supporters of the federal law warned it could still leave victims without speedy removal in practice, and raised questions about enforcement, platform burden, and free-speech tradeoffs, observations flagged by legal commentators and associations weighing its benefits and shortcomings [11].
7. What to watch next
Expect states to keep refining definitions (consent, deepfakes), expanding civil damages and criminal classifications, and coordinating with federal takedown rules; meanwhile, complementary proposals like the DEFIANCE Act (civil damages against creators of forged intimate images) and state deepfake trackers signal that legislators view NCII as an evolving area where technology outpaces existing statutes [12][10].