Wisconsin nonconsensual intimate imagery laws
Executive summary
Wisconsin criminalizes the capture, reproduction, possession, distribution, and exhibition of intimate representations made without the subject’s consent, and its statute expressly covers AI‑generated “synthetic intimate representations” that are realistic enough to depict a person (942.09) [1] [2]. Recent federal and state actions — including the federal Take It Down Act and Wisconsin’s moves against AI‑generated child sexual abuse material — add removal obligations for platforms and new criminal exposure for synthetic content [3] [4] [5].
1. What the statute actually says: scope and key definitions
Wisconsin Statutes section 942.09 defines “intimate representation” and makes it a felony to capture an intimate image without consent where the subject had a reasonable expectation of privacy; it also criminalizes making reproductions of, possessing, or distributing such images when the actor knows or has reason to know the image was captured without consent [6] [1]. The statute explicitly includes “synthetic intimate representation,” defined as a technologically generated depiction using a person’s face, likeness, or other distinguishing characteristics that is so realistic a reasonable person would believe it shows the identifiable person — meaning deepfakes fall within the statute’s reach [2].
2. Conduct covered and penalties: capture, reposting, and possession
Under the statute, Wisconsin treats a range of conduct as criminal: the initial nonconsensual capture of intimate representations in situations of reasonable privacy; creating reproductions of intimate images known to have been captured without consent; and possessing, distributing, or exhibiting images known to be nonconsensual — all of which the legislature has framed as felonies when the actor knows or has reason to know of the lack of consent [6] [1]. Separate but related statutes prohibit secret observation or recording in places with a reasonable expectation of privacy (e.g., bathrooms, dressing rooms), which prosecutors can charge alongside 942.09 in many cases [7] [8].
3. How consensual images that are later shared are treated
Wisconsin’s statute reaches not only surreptitious recordings but also images that were created consensually and later shared without permission: the law criminalizes distribution and exhibition of intimate representations when the distributor knows the image was obtained or shared without consent, reflecting the state’s recognition that consensual creation does not confer indefinite distribution rights [1]. Civil remedies also flow from the criminal definitions, meaning victims can pursue civil claims built on the same statutory language—a dual track the Wisconsin Bar has highlighted [1].
4. AI, deepfakes, and enforcement challenges
The statute’s explicit inclusion of “synthetic intimate representation” brings deepfakes squarely within Wisconsin law, but enforcement and evidentiary questions remain: prosecutors must show both that the representation is realistic enough to be identified as the person and that the actor knew or should have known there was no consent to its creation or distribution [2] [6]. Meanwhile, national policy is moving toward platform accountability: the federal Take It Down Act mandates notice-and-removal procedures for covered platforms by May 19, 2026 and imposes criminal prohibitions on publishing certain intimate images and deepfakes, creating a compliance overlay that will affect how platforms handle Wisconsin incidents [3] [4].
5. Recent state action and open questions
Wisconsin’s attorney general joined a multistate push urging AI company xAI to stop enabling “undressing” prompts that generate nonconsensual intimate imagery, signaling prosecutorial concern about platform capabilities and the speed of AI abuse [9]. The state also criminalized AI-generated child sexual abuse material through 2024 legislation, showing that lawmakers are supplementing 942.09 with child-protection statutes aimed at synthetic content [5]. What remains less clear from the record is how courts will interpret the knowledge and reason-to-know standards for synthetic images, and how victims will practically prove attribution and intent in complex AI cases; those are matters of prosecutorial policy and judicial interpretation not fully addressed in the statutory text reviewed here [2] [6].