
Fact check: How are artists able to protect their works from the ever-evolving AI models that train on them without consent or permission?

Checked on January 16, 2025

1. Summary of the results

Artists currently have both technological and legal options to protect their work from unauthorized AI training, though these solutions are not perfect. Technological solutions include:

  • Using watermarks and digital signatures
  • Opting out of AI training platforms
  • Implementing image-cloaking tools like Glaze and Nightshade [1]

However, these technological defenses are temporary and can potentially be bypassed by sophisticated techniques [2].
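To make the watermarking option listed above more concrete, here is a minimal, hypothetical sketch that stamps a visible copyright notice onto an image using the Pillow library. The file names and watermark text are placeholders, and this is only an illustration of the simplest protective measure; cloaking tools such as Glaze and Nightshade work quite differently and are not reproduced here.

```python
from PIL import Image, ImageDraw, ImageFont

def add_visible_watermark(src_path, dst_path, text="© Artist Name"):
    # Open the original artwork and create a transparent overlay layer
    image = Image.open(src_path).convert("RGBA")
    overlay = Image.new("RGBA", image.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)

    # Draw semi-transparent text in the lower-right corner
    font = ImageFont.load_default()
    margin = 10
    bbox = draw.textbbox((0, 0), text, font=font)
    text_w, text_h = bbox[2] - bbox[0], bbox[3] - bbox[1]
    position = (image.width - text_w - margin, image.height - text_h - margin)
    draw.text(position, text, font=font, fill=(255, 255, 255, 128))

    # Merge the overlay with the original and save the result
    watermarked = Image.alpha_composite(image, overlay)
    watermarked.convert("RGB").save(dst_path, "PNG")

# Example usage (placeholder file names)
add_visible_watermark("artwork.png", "artwork_watermarked.png")
```

A visible mark like this mainly deters casual reuse; as noted above, it does not prevent an image from being scraped into a training set, which is why artists also turn to opt-outs, cloaking tools, and legal remedies.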

2. Missing context/alternative viewpoints

The original question overlooks several crucial aspects:

Legal Developments:

  • A significant class-action lawsuit is currently proceeding in California against four AI companies, involving 10 visual artists [3]
  • Notable artists like Kelly McKernan and Karla Ortiz are actively fighting against companies like Stability AI, Midjourney, and DeviantArt [4]
  • Adobe has proposed the Federal Anti-Impersonation Right (FAIR Act) to create legal mechanisms against commercial impersonation of artistic styles [5]

Fundamental Challenges:

  • The issue extends beyond just protection, involving:
      • Dilution of originality
      • Income displacement
      • Copyright ownership disputes
      • Legislative lag
      • Potential distortion of artistic perception [1]
  • The Copyright Office maintains that human authorship remains crucial, requiring artists to disclose their AI tool usage for copyright protection [6]

3. Potential misinformation/bias in the original statement

The question assumes artists are purely defensive, when in reality they are taking proactive measures:

  • Artists are actively engaging in legal battles, with a class-action lawsuit demanding permission, compensation, and consent [7]
  • The scale of the problem is larger than implied - the LAION dataset alone contains 5 billion images being used for AI training [3]

Stakeholder interests:

  • AI companies benefit from unrestricted access to artistic works
  • Artists seek both protection and compensation
  • Technology companies like Adobe are positioning themselves as mediators by proposing legislation [5]
  • Legal firms benefit from the ongoing litigation between artists and AI companies