
Latest proposals from the UK Parliament to protect creative industries against the risks posed by AI

At the end of August, the Culture, Media and Sport Committee (the "Committee") published a report, "Connected tech: AI and creative technology", exploring the impact of the development of AI on the UK's creative industries. The Committee is appointed by the House of Commons to scrutinise the spending and policy decisions of the Department for Culture, Media and Sport (DCMS) and its associated public bodies.

In the report, the Committee makes recommendations that favour creators, following an earlier proposal by the Intellectual Property Office (IPO) to change UK law in a way that would have diminished (or even eliminated) protections for rights holders in order to facilitate the use and development of AI.

So why did the Committee publish this report? What recommendations has it made to the UK Government? And how does this play into the regulation of AI in the creative industries? 

The backdrop to the Committee's report is the rapid technological advances in AI and, more specifically, generative AI (being AI that, according to the report, "generate[s] images, text and other types of media in response to prompts (such as ChatGPT, DALL-E and Midjourney)"). 

The development of generative AI systems relies heavily on the processing of vast datasets to train machine learning models. Machine learning is often described as the most exciting aspect of AI: algorithmic solutions that "learn" from data and continuously improve their outputs over time, without the need for direct human involvement. The more comprehensive the datasets used to develop the AI, the better the "inferences, predictions, decisions, recommendations" and, in the case of generative AI, the content it produces. However, to develop systems that create compelling content, developers often need to copy existing works that are protected under the Copyright, Designs and Patents Act 1988 ("CDPA") and that cannot be exploited for text and data mining (TDM) purposes unless permitted by licence or exception. Under the CDPA, the only exception permitting TDM is where it is conducted for "non-commercial research purposes" (section 29A).

However, in June 2022 the IPO proposed to change this long-established position by introducing "a new copyright and database exception which allows TDM for any purpose". If enacted, that approach would have enabled AI developers to conduct TDM on copyrighted works, create new and competing outputs using AI, and exploit them commercially. The move to allow commercial TDM understandably caused uproar within the creative industries. Jamie Njoku-Goodwin, CEO of the British music industry's trade body UK Music, for example, described the plans as a "green light to music laundering...[taking] music they do not own, use copies of it to train an AI, and then reap the commercial rewards with a legally 'clean' new song".

So where do things stand now? According to the Committee's report, the UK Government has responded positively to address the creative industries' ongoing concerns about the use of copyrighted works in the development of AI. The Committee has, among other things, recommended that plans for a broad copyright exemption for TDM purposes be dropped. This would no doubt be a positive development for the creative industries.

However, the report does raise questions about how best to regulate the use of AI within the creative industries. Where should the lines be drawn between intellectual property law and generative AI? And how do we ensure that creators are adequately protected whilst encouraging innovators to keep inventing? In the UK, we await the publication of a new code of practice on AI and copyright that will purportedly aim to strike a balance between increasing the availability of licences for data mining and enshrining protections for rights holders. And yet any new standards will apply within the UK only. How will AI regulatory frameworks be harmonised across the world to protect creators and foster innovation? Will the AI Safety Summit in November result in coordinated action being agreed by global political and business leaders? What is clear is that we are at a pivotal moment in AI development, and policymakers have a critical role to play in ensuring that domestic and global regulatory frameworks are fit for purpose.

The Committee's recommendation on TDM is worth quoting in full: "We recommend that the Government does not pursue plans for a broad text and data mining exemption to copyright. Instead, the Government should proactively support small AI developers in particular, who may find difficulties in acquiring licences, by reviewing how licensing schemes can be introduced for technical material and how mutually-beneficial arrangements can be struck with rights management organisations and creative industries trade bodies. The Government should support the continuance of a strong copyright regime in the UK and be clear that licences are required to use copyrighted content in AI. In line with our previous work, this Committee also believes that the Government should act to ensure that creators are well rewarded in the copyright regime."

Tags

ai, artificial intelligence, generative ai, machine learning, media, advertising, video games, tv, music