How does the Lanham Act apply to deepfaked celebrity endorsements of health products?

Checked on February 1, 2026

Executive summary

The Lanham Act’s Section 43(a), which covers false endorsement and false designation of origin, is the primary federal tool for celebrities to challenge AI-generated or “deepfaked” endorsements that falsely imply their approval of a product, including health products, because it targets misleading commercial associations that confuse consumers [1] [2] [3]. Courts and commentators caution, however, that success depends on proving commercial use and a likelihood of consumer confusion, and on overcoming First Amendment and disclaimer defenses; other remedies, such as state right-of-publicity laws and defamation, often run in parallel [4] [5] [6].

1. What the Lanham Act actually forbids and why that matters for deepfakes

Section 43(a)(1)(A) bars false designations of origin and false endorsements, that is, commercial misrepresentations likely to make consumers believe a person or brand endorses or is affiliated with the goods or services. That prohibition is directly relevant when a deepfake places a celebrity’s image, voice, or persona alongside a product pitch [2] [3]. Legal practitioners and law-review scholars have argued that the Act is “well-suited” to deepfakes because it targets the principal mischief of false association and supports federal claims beyond the patchwork of state publicity statutes [7] [5].

2. The elements plaintiffs must prove and common defenses defendants raise

To prevail, a plaintiff typically must show commercial use, a false or misleading representation, and a likelihood of consumer confusion that the celebrity endorsed the product; courts ask whether the deepfake is likely to lead reasonable consumers to believe in an endorsement or sponsorship [4] [2]. Defendants can and do invoke First Amendment protections for expressive uses (parody, satire), argue that disclaimers defeat confusion, or contend that the deepfake is noncommercial or not materially misleading; scholars and firms note that these defenses can blunt Lanham Act claims, especially where dignity harms exist without market confusion [4] [5] [8].

3. Why health products raise special stakes under the Act

Health products are particularly sensitive because an allegedly false endorsement touching on efficacy or safety is material to purchasing decisions, which sharpens the likelihood-of-confusion analysis and opens the door to Lanham Act false advertising claims under Section 43(a)(1)(B) alongside false endorsement claims [8] [3]. Regulators and commentators warn that deepfaked celebrity ads for health services or insurance have already been used in scams, illustrating both the consumer harm and the commercial character that strengthen a Lanham Act theory [9] [10].

4. Procedural avenues and strategic advantages of a Lanham Act claim

A federal Lanham Act suit can yield injunctive relief and damages, and it supports contributory-liability theories against platforms or marketers that facilitate distribution; commentators highlight that trademark-based claims can sidestep the limits of state publicity law and that Section 230 does not categorically shield websites from intellectual-property-based claims [7] [4]. This federal avenue explains why some public figures pursue trademark or persona protections and why plaintiffs sometimes couple Lanham Act suits with right-of-publicity or other state law claims to box in defendants [11] [12].

5. Limits, open questions, and evolving legal terrain

Scholars and firms emphasize unresolved questions: how courts will treat subtle AI-generated resemblances versus obvious impersonations, when disclaimers suffice, and how First Amendment expressive-use doctrines will apply to commercial deepfakes; no settled federal rule fully resolves these tensions, and emerging legislation and trademark filings are reshaping the field [4] [5] [11]. Reporting and law-review work stress that while the Lanham Act is powerful, it is not a catchall; outcomes will turn on fact-specific consumer-confusion analyses and on how courts weigh commercial harm against free expression [2] [6].

6. Practical takeaways for celebrities, platforms, and consumers

For celebrities and their lawyers, the Lanham Act offers a pragmatic federal route to stop and remedy commercial deepfaked endorsements, especially where consumer deception around health claims is likely; for platforms, the risk of contributory liability counsels in favor of notice-and-takedown processes and transparent labeling; for consumers, the harms documented in reporting show why vigilance about celebrity health endorsements online matters [1] [7] [9]. The reporting reviewed also cautions that remedies will vary case by case and that complementary state publicity laws and potential federal statutes may change the enforcement landscape going forward [12] [11].

Want to dive deeper?
How have courts applied Section 43(a) of the Lanham Act in deepfake endorsement cases since 2022?
What are the differences between Lanham Act false endorsement claims and state right-of-publicity claims for celebrities?
How do First Amendment defenses succeed in commercial deepfake and parody cases involving health products?