California's proposed "AI Copyright Transparency Act" (Assembly Bill 412) aims to solve a legitimate problem with an impossible solution. That's not hyperbole: the bill's requirements are literally impossible to implement.
Let me walk through what this bill does and why both AI startups and tech giants should be concerned.
What A.B. 412 actually requires
The bill requires AI companies to:
- Track and document every copyrighted work used to train their models
- Create a public portal where copyright owners can ask if their work was used
- Respond within 7 days with a "comprehensive list" of how their works were used
- Pay $1,000 per day for each violation if they fail to comply
- Keep these records for the life of the model plus 10 years
This sounds reasonable only if you've never built an AI system. Modern AI models are trained on billions of documents and images. No one, not even the biggest tech companies, has the capability to track every piece of content at that scale.
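To make the scale problem concrete, here is a rough back-of-envelope sketch. Every number in it is an assumption chosen for illustration, not a measurement of any real system, but even generous figures show why per-item tracking and per-request matching break down at training-corpus scale:

```python
# Back-of-envelope: what per-item provenance tracking implies at
# training-data scale. All figures are illustrative assumptions.

TRAINING_ITEMS = 3_000_000_000   # assume ~3B documents/images in a corpus
BYTES_PER_RECORD = 500           # assume URL + hash + license metadata per item

# Storing one metadata record per training item
index_size_gb = TRAINING_ITEMS * BYTES_PER_RECORD / 1e9
print(f"Provenance index alone: ~{index_size_gb:,.0f} GB of metadata")

# The index is the easy part. Answering "was my work used?" means matching
# an arbitrary claimant's work against every item: near-duplicate detection
# across billions of records, per request, inside a 7-day statutory window.
REQUESTS_PER_DAY = 1_000         # assume a modest request volume
comparisons_per_day = REQUESTS_PER_DAY * TRAINING_ITEMS
print(f"Naive matching workload: {comparisons_per_day:.1e} comparisons/day")
```

The point is not the exact figures; it's that even the cheap half of the problem (storage) is nontrivial, and the hard half (matching every claimant's work against the full corpus on a deadline) grows multiplicatively with request volume.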
Burden of proof
The real problem is who gets hurt by this bill. Big companies like Google and Microsoft won't shut down; they'll hire more lawyers and compliance staff. It might cost them millions, but they can absorb it.
The startups are the ones who will get crushed. Imagine being a 10-person AI company and suddenly needing to:
- Build a complex data provenance system from scratch
- Create and staff a public request portal
- Respond to potentially thousands of inquiries within strict deadlines
- Face the constant threat of $1,000/day penalties for mistakes
This is existentially threatening for small companies. And that's before we even get to the legal fees when the lawsuits start.
The legal mess
Speaking of lawsuits, A.B. 412 creates a perfect environment for them. The bill establishes a new California-specific right for copyright holders that likely conflicts with federal copyright law.
Copyright law is deliberately federal precisely to prevent this kind of state-by-state patchwork. A.B. 412 attempts to regulate conduct that may be protected as fair use under federal law, setting up a direct preemption conflict.
Even if you think the bill is a good idea, it's hard to see how it survives legal challenges. But that won't stop the chaos while those challenges work through the courts.
The creators' valid complaint
But here's the thing: the creative professionals backing this bill have a legitimate grievance. Imagine being an artist and discovering an AI can perfectly mimic your style after being trained on your life's work - without your knowledge or consent.
These creators currently have no way to know if their work was used to train systems that might eventually replace them. That's genuinely troubling, and we should take it seriously.
The problem isn't the goal of the bill but its approach. Creators deserve some transparency, but the specific mechanism proposed is unworkable.
A better approach?
If we actually want to help creators without killing AI innovation, we need something more practical:
- Category-level disclosure rather than item-by-item tracking
- Realistic timeframes that acknowledge the technical challenges
- Safe harbors for companies making good-faith compliance efforts
- Graduated requirements based on company size and resources
Without these kinds of modifications, A.B. 412 would mainly succeed at moving AI development out of California.
What happens next
This bill is still moving through the California legislature. It's passed one committee and is pending in another. Content creators are pushing hard for it, while tech companies and civil liberties groups are fighting back.
If California passes this bill in its current form, expect immediate legal challenges. Meanwhile, companies would face an impossible choice between draining resources on compliance attempts and risking substantial penalties.
The more likely outcome is that federal lawmakers will eventually step in. The collision between AI and copyright is too important to leave to a patchwork of conflicting state laws.
A.B. 412 tries to solve a real problem with an impossible solution. The future of AI needs to balance creator rights with innovation, but this bill gets that balance wrong.
What we need instead is a practical approach to transparency that creators can use without crushing smaller companies. There's still time to fix this bill before it becomes law, but the clock is ticking.