
Copyright Ruling on AI Training

A New Battlefront for Creators

A silhouetted musician plays guitar while glowing music notes are siphoned off into digital lines, representing anger over stolen creativity. The image highlights the emotional impact of the copyright ruling on AI training.

An angry musician plays guitar while his music is siphoned off into the digital vaults of big tech companies that he feels are literally STEALING his work. No wonder he looks ANGRY.

Why Are Musicians Getting Angry?

The recent AI Copyright Ruling in a US court may have made headlines for its legal complexity, but for many musicians, authors, and creatives in the UK, it hit somewhere deeper: the heart. For them, this isn’t just about licensing or datasets — it’s about theft. Musicians say their work has been quite literally stolen from them.

In this post I hope to clarify what the recent copyright ruling on AI training means for UK musicians and creators — and why emotions, not just legal arguments, are driving the fight. This is the fourth part of my ever-expanding series on AI Safety. You may also find parts 2 and 3 of interest, as they also deal with copyright issues. Links are below.

What Was the Copyright Ruling on AI Training?

In June 2025, a California judge ruled that AI company Anthropic infringed copyright by downloading millions of pirated books to train its Claude language model. These works were pulled from unauthorised sources like LibGen and Books3 — and Anthropic admitted it knew they were pirated.

The court didn’t side entirely with the authors — in fact, the judge also ruled that AI training on copyrighted material could be legal under fair use, provided the materials are lawfully obtained and used in a transformative way. That part of the decision was a win for the AI industry.

A Divided Decision

The ruling is important because it’s the first to declare that training AI models on copyrighted works might not be illegal — at least in the US, and at least under certain conditions.

But it also came with consequences. Anthropic now faces the prospect of hundreds of millions in damages, and possibly more. That part of the case will proceed to trial.

What Does the Copyright Ruling Mean in the UK?

This is where things get complicated — and where UK readers may need to read between the lines.

Unlike the US, the UK does not have a general “fair use” doctrine. Instead, we rely on fair dealing, which is far more limited. You can copy works for research, news reporting, or parody — but not for massive-scale model training. That means the legal defence that partly saved Anthropic wouldn’t necessarily apply here.

And while there’s been no UK court test yet, the precedent is starting to form in public opinion — and in Parliament. That matters.

Creative Anger and Old Wounds

If you’re a musician in the UK, this might feel uncomfortably familiar. Think back to the early 2000s, when file-sharing exploded and album sales collapsed. At first, the industry didn’t know how to respond. Artists lost income. Then, slowly, new models emerged — Spotify, Bandcamp, Apple Music — and new paths to revenue were created.

With AI, we may be entering another one of those moments. But this time, it’s happening faster — and often without consent.

They didn’t ask. We didn’t agree. But the scraping began.

What Happens Next?

This ruling leaves three questions hanging:

  • Will UK or EU courts follow the US lead?
  • What counts as “lawfully obtained”?
  • Is fair use (or fair dealing) enough to protect creators — or does it need reform?

For UK creatives, this is the time to raise your voice — not just to complain, but to organise. Whether you’re a songwriter, novelist, visual artist or journalist, the next stage of this conversation will need you.

Sources and Further Reading
