Bartz v. Anthropic is a class action copyright infringement lawsuit filed in August 2024 in the Northern District of California, in which three authors (Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson) sued Anthropic for downloading and using millions of copyrighted books from pirate libraries to train the large language models (LLMs) behind Claude. The case resulted in a $1.5 billion settlement, the largest copyright settlement in American history.
Key Facts
- Case: Bartz v. Anthropic, N.D. Cal.
- Judge: William Alsup (Phillip Burton Federal Building, San Francisco)
- Filed: August 2024
- Plaintiffs: Andrea Bartz, Charles Graeber, Kirk Wallace Johnson
- Settlement: $1.5 billion
- Per-work payment: ~$3,100 gross (before deductions such as attorneys' fees and costs)
- Works covered: ~482,460 titles with ISBNs/ASINs registered at the U.S. Copyright Office
- Attorneys' fees: 25% ($375 million requested)
- Named plaintiff awards: $50,000 each
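As a back-of-envelope sanity check (plain arithmetic, not taken from the settlement documents), the headline figures above are mutually consistent: the ~$3,100 per-work figure is roughly the $1.5 billion fund divided by the ~482,460 covered works, and the 25% fee request comes to $375 million.

```python
# Back-of-envelope check of the settlement figures listed above.
settlement_fund = 1_500_000_000   # $1.5 billion fund
covered_works = 482_460           # titles with ISBNs/ASINs on the works list
fee_rate = 0.25                   # attorneys' fee request (25%)

gross_per_work = settlement_fund / covered_works
requested_fees = settlement_fund * fee_rate

print(f"gross per work: ${gross_per_work:,.0f}")   # about $3,109
print(f"requested fees: ${requested_fees:,.0f}")   # $375,000,000
```

Actual per-claimant amounts will differ once administration costs, service awards, and the author/publisher split are applied.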
What Anthropic Did
Anthropic downloaded over 7 million copyrighted books from shadow libraries — Library Genesis (LibGen) and Pirate Library Mirror (PiLiMi) — and used them alongside legally purchased, scanned copies to train the language models behind Claude. The plaintiffs alleged this constituted mass copyright infringement.
The Split Ruling
In June 2025, Judge Alsup issued a pivotal split ruling on summary judgment:
- Fair use for legally acquired books: Training AI on books acquired through legal means was ruled "quintessentially transformative" and protected as fair use.
- No fair use for pirated books: Downloading and retaining pirated copies was ruled not fair use and constituted infringement.
This distinction — that how training data is acquired matters as much as what is done with it — became the legal foundation of the case. In August 2025, Judge Alsup certified the class action, defining the class as all beneficial or legal copyright owners of reproduction rights in books identified in the pirated datasets. At statutory damages of up to $150,000 per work, Anthropic faced theoretical liability exceeding $70 billion.
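The theoretical exposure cited above follows directly from multiplying the statutory maximum for willful infringement by the number of works at issue; a one-line check, assuming the ~482,460-work count that ultimately defined the settlement list:

```python
# Statutory-damages ceiling: up to $150,000 per willfully infringed work.
statutory_max = 150_000
covered_works = 482_460   # works ultimately on the settlement list

max_exposure = statutory_max * covered_works
print(f"${max_exposure:,}")   # $72,369,000,000 (exceeds $70 billion)
```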
The Settlement
Anthropic agreed to pay $1.5 billion into a settlement fund. The default split is 50% to authors and 50% to publishers for each covered work. Settlement administrators located 66% of authors and 97% of publishers.
Timeline:
- September 2025 — Preliminary approval granted
- November 2025 — Direct notice mailing began
- January 7, 2026 — Opt-out deadline
- March 23, 2026 — Claims deadline
- April 2026 — Fairness hearing for final approval
What the Settlement Covers (and Doesn't)
Covered:
- Past acquisition, retention, and use of pirated works through August 25, 2025
- Anthropic must destroy all downloaded pirated works within 30 days of final judgment
Not covered:
- No future licensing scheme for AI training is created
- Claims based on AI output that might infringe copyrights are not released
- Anthropic's legal scanning of purchased books is not addressed
- No global regulatory precedent is set
Why It Matters
The case established that the provenance of training data is a legal boundary. Fair use protects transformative use of legally acquired material, but piracy is piracy regardless of downstream purpose. This has implications across the artificial intelligence (AI) industry: every company training on web-scraped or shadow-library data now operates under the precedent that acquisition method is subject to scrutiny.
The settlement does not resolve the broader question of whether AI training on lawfully acquired copyrighted works requires licensing. That question remains open across multiple ongoing cases involving other AI companies.
Related
- What Anthropic's Claude Mythos and My Divorce Have in Common (the pattern of agreements drifting at scale)
- Anthropic (the company)
- Claude (the model family trained on the works in question)
References
- The Bartz v. Anthropic Settlement: Understanding America's Largest Copyright Settlement — Kluwer Copyright Blog
- What Authors Need to Know About the Anthropic Settlement — The Authors Guild
- What to Know About the $1.5 Billion Bartz v. Anthropic Settlement — Copyright Alliance
- Anthropic settles with authors in first-of-its-kind AI copyright infringement lawsuit — NPR
- Official Settlement Website
