A class action lawsuit filed by artists who allege that Stability, Runway and DeviantArt illegally trained their AIs on copyrighted works can move forward, but only in part, the presiding judge decided on Monday. In a mixed ruling, several of the plaintiffs’ claims were dismissed while others survived, meaning the suit could end up at trial. That’s bad news for the AI makers: Even if they win, it’s a costly, drawn-out process where a lot of dirty laundry will be put on display. And they aren’t the only companies fighting off copyright claims — not by a long shot.
The Lawsuit’s Core Issues
The lawsuit, filed in January 2023, centers on the use of copyrighted images to train AI models without the original artists’ consent or compensation. The plaintiffs argue that this practice constitutes copyright infringement and violates their rights as creators.
Key points of the lawsuit include:
- Alleged unauthorized use of millions of copyrighted images
- Lack of proper attribution or compensation for artists
- Potential devaluation of human-created art
- Questions about the legality of AI training methods
Judge’s Mixed Ruling
U.S. District Judge William Orrick’s decision allows some claims to proceed while dismissing others:
- Dismissed: Claims under the Digital Millennium Copyright Act (DMCA), among others
- Proceeding: Copyright infringement claims against Stability AI, Midjourney, Runway and DeviantArt
- Proceeding: Trademark-related (Lanham Act) claims against Midjourney
This mixed ruling means that while some aspects of the case have been narrowed, the core issues regarding copyright infringement in AI training will be examined in court.
Implications for the AI Industry
The decision to allow parts of the lawsuit to proceed has significant implications for the AI industry:
- Legal Precedent: This case could set a crucial precedent for how copyright law applies to AI training data.
- Transparency in AI Development: Companies may be forced to disclose their training methods and data sources, potentially revealing closely guarded trade secrets.
- Financial Risk: Even if they ultimately win, AI companies face substantial legal costs and potential damages.
- Industry-Wide Impact: Other AI companies using similar training methods may face similar lawsuits or need to reevaluate their practices.
Broader Context: AI and Copyright
This lawsuit is part of a larger debate about AI and copyright:
- Getty Images vs. Stability AI: Getty has sued Stability AI for allegedly scraping millions of images from its database.
- Authors vs. OpenAI: Several authors have filed class action lawsuits against OpenAI, alleging their books were used without permission to train the models behind ChatGPT.
- Music Industry Concerns: The music industry is grappling with similar issues as AI-generated music becomes more sophisticated.
Potential Outcomes and Industry Response
As the case proceeds, several outcomes are possible:
- Settlement: Companies might opt to settle out of court to avoid prolonged legal battles and negative publicity.
- New Licensing Models: The industry might develop new ways to license and compensate artists for use of their work in AI training.
- Regulatory Intervention: Governments might step in to create new regulations governing AI and copyright.
- Technical Solutions: AI companies might develop training methods that avoid copyright issues, for example by relying on licensed or public-domain data.
Conclusion
The judge’s decision to allow parts of the lawsuit to proceed marks a significant moment in the ongoing debate about AI and copyright. As the case moves forward, it will likely shape the future of AI development and the rights of artists in the digital age. Regardless of the outcome, this lawsuit underscores the need for clearer guidelines and ethical standards in the rapidly evolving field of artificial intelligence.