Tony Blair’s Institute Publishes AI Report Advocating for Overhaul of UK Copyright Law: Key Insights Explained

MBW Reacts is a series of analytical commentaries from Music Business Worldwide written in response to major recent entertainment events or news stories. Only MBW+ subscribers have unlimited access to these articles.


Impact of AI on Music Copyright

An institute led by Tony Blair, the former Prime Minister of the UK, has released a report with recommendations that could significantly alter the handling of music copyright in the AI era.

The report, titled Rebooting Copyright: How the UK Can Be a Global Leader in the Arts and AI, presents a “progressive solution” that seems to favor AI progression at the expense of established creators’ rights.

The report’s authors are candid about its tilt toward large tech and AI companies, openly stating that “the progressive solution is not about clinging to copyright laws designed for a bygone era but rather about allowing them to evolve alongside technological advancements.”

Written by Jakob Mökander, Amanda Brock, Mick Grierson, Kevin Luca Zandermann, and Joseph Bradley, with a foreword by Professor Fernando Garibay of the Garibay Institute, the report asserts that “the UK can accommodate both leading-edge AI innovation and a thriving creative sector.”

However, its recommendations tend to align with the interests of AI developers, stating that “there are more effective methods for supporting creative industries in the digital landscape than restrictive copyright laws for AI-model training.”

Established by the former Prime Minister, the Tony Blair Institute positions itself as a non-partisan think tank aiming to devise “practical solutions” for intricate challenges, particularly in technology policy.

Yet, music rightsholders reviewing this report may find little solace in its outlook on the future of copyright.

Concerns for Music Rightsholders

At the heart of the report’s recommendations is support for the government’s proposal for a text and data mining (TDM) exception with an opt-out mechanism, which would drastically alter how music rights function:

“This would allow the training of AI models on publicly available data for any purpose, while providing rights holders more control over their preferences in regard to AI training,” the report claims.

In practical terms, this means AI companies could utilize music and other content freely unless creators actively choose to opt out.

The report indicates that the TDM exception is already aligned with EU policy, and it has backing from the Prime Minister through the AI Opportunities Action Plan (AIOP).

The report follows the stance of ChatGPT creator OpenAI, which has called for substantial revisions to copyright laws in the U.S. to enable AI companies to utilize copyrighted works without obtaining permission or compensating rightsholders.

OpenAI articulated these viewpoints in its response to the Trump administration’s request for information regarding a national AI Action Plan.

Both OpenAI and Google submitted comprehensive policy frameworks that could profoundly affect music rightsholders and other content creators.

Sir Paul McCartney, Paul Simon, and Bette Midler were among hundreds of Hollywood figures signing a letter opposing these proposals.

The report from the Tony Blair Institute arrives amid ongoing legal actions from music rightsholders against AI companies like OpenAI and Anthropic, alongside AI music generators Udio and Suno.

Here are five key aspects of the report that may raise concerns for music rightsholders:


1. The report’s suggested ‘AI-Preferences Standards’ may not adequately protect music copyrights

The first recommendation of the report advocates for the enhancement of “AI-preferences standards” for rightsholders.

According to the report, “the UK government should support internationally harmonised AI-preferences standards developed for an effective opt-out regime.”

It adds: “These standards must surpass the limitations of robots.txt (files utilized to manage web crawler access), granting rights holders greater control over the use of their content while incorporating pragmatic commitments from developers to trace and respect these preferences.”


While recognizing that existing systems like robots.txt are inadequate, the report’s solution still requires music rightsholders to actively block their works from being used, rather than mandating permission prior to use.

This represents a fundamental shift in how copyright has traditionally functioned. The report supports a “tools-not-rules” approach that favors technical methods over legal enforcement.

As highlighted in the report, “open-source tools can play a vital role in implementing these standards while fostering innovation. This was suggested for security management in AI at the recent AI Action Summit, through the launch of ROOST.”

For music rightsholders, who have traditionally depended on clear legal frameworks to safeguard their work, the shift to technical opt-out measures carries considerable risk. The report itself acknowledges fundamental challenges with this approach:

“The fundamental issue with opt-outs is the internet’s reliance on uniform resource locators (URLs), while copyrighted works do not adhere to that structure. For instance, a recording of a song involves rights related to songwriting, performance, recording, and more.”
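The mismatch the report describes is easy to see in practice. Below is a minimal sketch, using only Python’s standard library, of how a robots.txt-style opt-out works today: the signal attaches to a website and a named crawler, not to an individual work or the layered rights behind it, and it only has effect if the crawler checks and honors it. The crawler user-agent names are real, publicly documented AI crawlers; the site and URLs are hypothetical.

```python
# Minimal illustration of a robots.txt-style opt-out (not the report's proposed
# standard). GPTBot (OpenAI) and Google-Extended (Google AI training) are real,
# documented user agents; the site and URLs below are hypothetical.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

track_url = "https://example-label.com/catalogue/track-123"

# The signal is per-site and per-crawler: it says nothing about the separate
# songwriting, performance and recording rights attached to the work itself,
# and it only matters if the crawler chooses to fetch and respect the file.
print(parser.can_fetch("GPTBot", track_url))             # False: opted out
print(parser.can_fetch("Google-Extended", track_url))    # False: opted out
print(parser.can_fetch("UnknownNewCrawler", track_url))  # True: falls through to the catch-all rule
```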


2. Flaws in the report’s ‘Multi-Pillar Transparency Approach’

The second recommendation of the report promotes a multi-pillar “transparency approach,” emphasizing that “the UK government should implement policies incorporating pragmatic disclosures from AI developers, attributional transparency, and private regulatory scrutiny.”

For music rightsholders, the ambiguous commitment to “pragmatic disclosures” falls short of offering the detailed information necessary to effectively monitor and manage the utilization of their works in AI frameworks.

The report acknowledges the challenges of transparency approaches, recognizing the complications of providing comprehensive URL-level disclosures. According to the report, “maintaining a database containing tens of billions of URLs could be financially burdensome for smaller AI developers and could exacerbate market concentration.”

What raises concern for the music industry is that this rationale prioritizes convenience for AI developers over rightsholders’ need to know precisely which of their works are used in training so that those works can be licensed.

The report also mentions “attributional transparency,” which depends on tools that can identify training data from AI systems after the fact:

“Attribution tools will enable outsiders to discern training data in the outputs generated by AI models,” the report states. “These tools will improve and new ones will emerge as the industry evolves, allowing rightsholders who suspect developers are misusing their materials to investigate and potentially litigate.”

Transparency regarding AI training data is a critical matter for the global music industry. In the United States, the RIAA, NMPA, and several music organizations recently suggested in their joint AI Action Plan submission that AI companies maintain detailed logs of training materials and provide reasonable summaries of the works used in AI model development.


They also endorse the proposed TRAIN Act, which would establish a court-managed process for copyright holders to inquire into potential unauthorized use of their works.


3. A concerning suggestion for a “one-off exception” to license decades’ worth of content

The third recommendation of the report emphasizes the need to set standards for AI creativity.

According to the report, “To protect the creative industries, clear standards must be established regarding creativity and licensing in AI applications. The UK government should introduce a one-off exception permitting major rights holders to license content from the past 75 years for AI training, as advised in the AIOP.”


This proposal for a “one-off exception” aimed at licensing decades of content raises serious issues for music rightsholders.

The report recognizes the intricacies of music rights, illustrating that “the UK possesses a vast heritage of art and media that is not accessible on the open internet” and that “this body of work is valued in the billions but is entangled in complex rights issues. Each piece may involve numerous rights holders, each asserting intricate partial rights.”

Nonetheless, despite recognizing these complexities, the report suggests circumventing standard licensing protocols:

“Only governments can unlock archived content by granting a one-off exception to distribution that allows rights holders to relicense archived work for AI training without requiring explicit permission from all relevant rights holders,” the report declares.

For the music industry, especially for rightsholders interested in historical recordings, this poses a potential government-sanctioned override of their established rights.


4. Proposal for a “Centre for AI and Creative Industries” funded by consumers

The report’s final recommendation concerns support for the creative sector’s transition to the AI era:

According to the report, “the UK government should take a proactive stance in aiding the creative sector’s adaptation to the AI era.” It adds that “this can be realized through targeted funding and by establishing a new Centre for AI and Creative Industries (CACI).”

To finance this new Centre for AI and Creative Industries, the authors propose that consumers should bear a tax known as an “ISP Levy.”


The authors state: “The goal of the levy would be to support the transition of the creative industries into the generative-AI era in a socially progressive manner and to acknowledge the presence of bad actors in the scraping domain. It wouldn’t need to raise substantial amounts to achieve its objectives.

“Using the assumption of 116.1 million mobile-data subscribers (including machine-to-machine) and 28.5 million broadband subscribers, with average monthly fees of £20 and £50 respectively, a tax rate of 0.1 percent would produce total revenue of almost £45 million. To strive for a target revenue of £200 million, the tax rate may be raised to 0.44 percent, resulting in consumers paying only roughly 31p extra each month.”
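The arithmetic is straightforward to reproduce. The quick check below assumes, as the figures imply, that the revenue totals are annual and that the “roughly 31p” refers to a consumer paying one average mobile bill and one average broadband bill; on those assumptions it recovers the report’s approximately £45 million, just-under-£200 million, and 31p-per-month numbers.

```python
# Back-of-the-envelope check of the report's ISP-levy figures.
# Assumption (implied rather than stated in the excerpt): revenue totals are annual.
MOBILE_SUBS = 116.1e6     # mobile-data subscribers, incl. machine-to-machine
BROADBAND_SUBS = 28.5e6   # broadband subscribers
MOBILE_FEE = 20.0         # average monthly fee, GBP
BROADBAND_FEE = 50.0      # average monthly fee, GBP

# Total monthly subscription spend across both markets
monthly_base = MOBILE_SUBS * MOBILE_FEE + BROADBAND_SUBS * BROADBAND_FEE

def annual_levy_revenue(rate: float) -> float:
    """Annual revenue (GBP) raised by a levy of `rate` on subscription fees."""
    return monthly_base * 12 * rate

print(f"0.10% levy: ~£{annual_levy_revenue(0.0010) / 1e6:.0f}m per year")  # ~£45m
print(f"0.44% levy: ~£{annual_levy_revenue(0.0044) / 1e6:.0f}m per year")  # ~£198m, just under the £200m target

# Extra monthly cost at 0.44% for a consumer paying one average mobile
# and one average broadband bill:
print(f"~{(MOBILE_FEE + BROADBAND_FEE) * 0.0044 * 100:.0f}p per month")    # ~31p
```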


The report reassures that the tax being on consumers means there will be “no direct impacts on technology companies, apart from the minor revenue decrease implied by the small income effect and intermediate effects on ISPs.”

It notes that this “remuneration could also be distributed to those artists who choose not to opt out, as a reward for their contribution to AI development.”

However, the report’s own calculations raise questions about whether such compensation would adequately reflect the value of the music used by AI systems.

For perspective, global recorded music revenue alone, according to recent figures from IFPI, reached USD $29.6 billion in 2024.


5. The report equates AI training with human learning

A central argument in the report supporting its stance on AI model training draws a comparison between AI training and human learning.

The report asserts: “To argue that commercial AI models cannot learn from open content on the web would be akin to asserting that knowledge workers cannot benefit from insights they gain from reading the same content.”


The report further elaborates: “There are also superior methods to assist creative industries in the digital age than through restrictive copyright laws for AI-model training. The question is not whether generative AI will reshape creative industries (it’s already doing so) but how to ensure this transition benefits all stakeholders equitably.

“AI is already being integrated into creative workflows, streamlining routine tasks while fostering new forms of expression. Additionally, the economic impact will fluctuate across sectors and among individuals. Instead of striving to maintain outdated regulations from the 20th century, rights holders and policymakers should concentrate on cultivating a future where creativity is acknowledged and respected alongside AI innovation.”


The report goes on to discuss its human-AI learning comparison in further detail, providing the following example to bolster its argument:

“For instance, many individuals in the knowledge economy read news articles,” the authors state.

“They subsequently sell their general knowledge, which includes insights gleaned from reading these articles, as part of their work. They are not obligated to provide additional payment to the newspaper beyond what they may have already paid to access it.

“While they might cite the paper, this is not mandated by law. In a similar vein, artists often visit galleries without an entrance fee, exploring a variety of creative works.

“Was Tracey Emin expected to reimburse Louise Bourgeois for the transformative experience she had upon encountering her work at the Tate in 1995? The relationship between originality and imitation has always been ambiguous, historically spanning classical art to the present day.”
