AI: creative industries grapple with technology’s implications for their IP

Neil Hodge, Friday 29 September 2023

Whether artificial intelligence (AI) can produce images, music, literature or other content better than a human may always be a matter of debate. What legislators must contend with – and soon – is the realisation that the technology can produce such content faster, more cheaply and by drawing on other people’s work.

Authors, musicians, actors, film-makers and others working in creative industries have woken up to the threat that generative AI – a type of AI program that creates content from a data set – poses to their work. In the US in September, a group of 17 prominent authors including George R R Martin, Jodi Picoult and John Grisham sued tech company OpenAI, which develops the large language model-based chatbot ChatGPT, for what they say is ‘systematic theft on a mass scale.’ The class action lawsuit claims that the company is illegally using their copyrighted work and that its chatbot could produce new work in their style, and that of other authors, without their consent and without any of the profit reaching them.

An OpenAI spokesperson said that the company respects ‘the rights of writers and authors, and believes they should benefit from AI technology.’ The spokesperson added that the company was having ‘productive conversations with many creators around the world, including the Authors Guild, and have been working cooperatively to understand and discuss their concerns about AI.’

Meanwhile, Hollywood writers ended their near five-month strike at the end of September. The industrial action was taken due to fears that film studios’ use of AI will lead to job and pay cuts. The Writers Guild of America’s 11,500 members will soon vote on whether to approve a three-year deal that offers pay raises and protections around AI use. The Screen Actors Guild is still seeking an agreement, having launched a parallel strike in July.

Legislators are very cautious not to hamper innovation and to not end up with regulations that would stop the next ‘spinning jenny’

Johan Hübner
Chair, IBA AI and Robotics Subcommittee

Legal commentators believe there’s bound to be an increase in legal disputes surrounding copyright and intellectual property (IP) infringements. Daniela De Pasquale, Chair of the IBA User Generated Content Subcommittee and a partner at Italian law firm Ughi e Nunziante, highlights that because generative AI is trained on the internet, where a significant amount of protected works is available, ‘we are going to see a lot of litigation to resolve disputes because legislation will always be behind the speed at which the technology evolves.’

De Pasquale says the first way to protect content creators is to have legislation in place that works, noting that current copyright laws remain ‘a very resilient tool’. A secondary method involves implementing contractual obligations to set out the limitations of how AI will be used, the extent of its use and what data is used in the process. The third option to consider is ‘soft law’ and the adoption of company guidelines, whereby the AI industry sets up fair practices and internal codes aimed at avoiding the unauthorised use of third-party content, she says.

At the centre of these measures is the need for more transparency about what data is being used to train AI systems and whether content creators have been informed and/or credited and remunerated for its use. De Pasquale believes the EU’s incoming AI Act should help on both counts as it’ll require detailed summaries of the copyrighted data used for the development of works to be made public.

Under the EU AI Act, companies deploying generative AI tools, such as ChatGPT or image generator Midjourney, will also have to disclose any copyrighted material used to develop their systems under a recently added transparency requirement. ‘The capability of AI to imitate language patterns, sounds, voices, images and music means that the rights of creative artists need to be protected,’ says De Pasquale. ‘The EU’s AI Act should provide artists and content creators with at least more information about how and where their work is being used, although it will still remain with the artists to assert their rights under copyright laws until content licences develop in this sector.’

Companies using – rather than developing – AI technology could also be at legal risk if the tools they’ve employed in the creation of a new product or service have drawn on copyrighted material. ‘Companies should not assume that the use of the AI, or its outputs, are risk free,’ says Diego Black, a UK-based partner at IP firm Withers & Rogers. ‘The availability of a product does not mean that the output can be created without consequence. For instance, if the AI is used to help design products, companies cannot be certain that the design would not infringe existing IP rights, such as patents.’ He adds that, as things stand, there are a number of unanswered questions around use and ownership, ‘so it is important that companies consider the liability associated with using such products.’

Legislating to protect copyright while also promoting AI development is set to remain problematic. While the EU hopes its transparency requirement will make tech developers think twice about the data they use to train AI systems, other jurisdictions are unsure about how to implement rules to regulate technology that’s evolved beyond what most legislators conceived it could do. Japan is just one country that must address the problem of whether to tighten rules after purposefully leaving them lax in order to stimulate AI development.

Takashi Nakazaki, Chair of the IBA Disputes and Rights Subcommittee and special counsel at law firm Anderson Mori & Tomotsune in Tokyo, says Article 30-4 of Japan’s Copyright Act stipulates that, in principle, third-party copyrighted works may be freely used for the purpose of machine learning to improve AI, including generative AI, without the right holder’s permission. This has led to prominent Japanese copyright jurists describing Japan as ‘a machine learning paradise’.

Nakazaki says the provision was introduced as an amendment to the legislation in 2018 and adds that ‘there was no notable opposition from the content industry at the time’. However, since the global boom in generative AI, content creators have recognised that the technology’s widespread use will probably have a significant impact on their business, and several industry associations have urged the Japanese government to narrow the situations in which Article 30-4 of the Copyright Act applies. The government is considering what – if any – changes might be necessary as it weighs up how enhancing copyright protection may affect the country’s AI development ambitions.

Johan Hübner, Chair of the IBA AI and Robotics Subcommittee and a partner at law firm Advokatfirman Delphi in Stockholm, sums up the problem succinctly when he says that ‘legislators are very cautious not to hamper innovation and to not end up with regulations that would stop the next “spinning jenny”’.
