Generative Artificial Intelligence (AI): Overview, risks and regulation

Generative Artificial Intelligence (AI) refers to a set of algorithms trained on large volumes of data to generate output such as images, text, music and videos. A popular example is ChatGPT, a large language model (LLM) developed by OpenAI.

Other examples of generative AI include DALL-E, which generates art from text descriptions. Generative AI has taken the world by storm and has fuelled an AI race, prompting further chatbot developments such as Google’s release of Bard and Baidu’s release of Ernie. This article explores the issues and regulation surrounding the application of generative AI.

Generative AI Case Studies: Overview, issues and regulation

1. ChatGPT

The best-known example of generative AI is ChatGPT, an LLM trained on large datasets to generate human-like text in response to prompts. While it may prove to be a useful tool for efficiency and productivity, it carries a risk of widespread misinformation. Even ChatGPT acknowledges its limitations, noting that “there’s currently no source of truth”. Such misinformation can be dangerous, as evidenced by the short-lived Galactica, an LLM developed by Meta that was taken down after three days because it was capable of generating incorrect, racist and dangerous information. Tweets reported that Galactica could generate Wikipedia-style entries on the benefits of suicide and the benefits of being white.

Such generative AI also raises questions surrounding intellectual property rights. For example, who owns the intellectual property rights to output generated by ChatGPT? We put this question directly to ChatGPT, and its answer was as follows: “I do not own the content that I generate. I am a machine learning model developed and owned by OpenAI, and the content generated by me is subject to OpenAI's license and terms of use”.

Such AI chatbots may fall within the scope of the UK’s Online Safety Bill. Lord Parkinson, a junior minister in the Department for Culture, Media and Sport, stated that “Content generated by artificial intelligence ‘bots’ is in scope of the Bill, where it interacts with user-generated content, such as on Twitter. Search services using AI-powered features will also be in scope of the search duties outlined in the Bill.”

It has been reported that the European Parliament is considering compromise amendments to the EU AI Act (the “Act”) to accommodate generative AI such as ChatGPT. It remains to be seen whether the Act will class such generative AI as “high risk” and whether it will attract any pre-market or post-market obligations.

2. AI-Generated Comic Art: Zarya of the Dawn

In September 2022, the U.S. Copyright Office (USCO) issued a registration for the comic book “Zarya of the Dawn” by Kris Kashtanova. The comic used images generated by an AI tool called Midjourney. Shortly after granting the registration, USCO retracted its decision and notified Kris Kashtanova that it might cancel the registration. In February 2023, USCO reissued the registration to cover only the text and the arrangement of elements in the comic book, but not the AI-generated images.

To clarify its position on AI-generated works, USCO published a position statement that there must be “some element of human creativity” for a work to be copyrightable. This means USCO will assess whether there is human authorship when deciding whether or not to grant a registration.

This raises the question: what does human authorship mean? USCO states that it will consider whether a work was created “with the computer [or other device] merely being an assisting instrument, or whether the traditional elements of authorship in the work (literary, artistic, or musical expression or elements of selection, arrangement, etc.) were actually conceived and executed not by man but by a machine”. The answer will depend on how the AI tool was used to create the work. Whether there is human authorship in a work will therefore be decided on a case-by-case basis.

Unlike many other jurisdictions, the UK currently grants copyright protection to computer-generated works, provided the work is an original literary, dramatic, musical or artistic work. Section 178 of the Copyright, Designs and Patents Act 1988 defines a “computer-generated” work as one “generated by computer in circumstances such that there is no human author of the work”. The United Kingdom Intellectual Property Office (UKIPO) is currently assessing responses to its consultation on whether copyright protection for such works should remain; the copyright position for text and data mining; and patent protection for AI-devised inventions.

3. Non-Fungible Tokens (NFTs)

An NFT is a unique, emerging type of digital asset whose ownership data is recorded on a blockchain. Each NFT has its own unique identification code and metadata. NFTs have been around for a few years, but they are gaining more traction as AI is integrated into their creation. An NFT artwork by a robot named Sophia sold for nearly USD 700,000.
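
For readers unfamiliar with how that identification code and metadata fit together, the sketch below shows a simplified, ERC-721-style view of an NFT record in Python. It is illustrative only; the contract address, token ID, owner and metadata values are hypothetical placeholders rather than real on-chain data.

# Illustrative sketch only: a simplified, ERC-721-style view of an NFT record.
# All addresses, IDs and URIs below are hypothetical placeholders.
import json

nft_record = {
    "contract_address": "0x0000000000000000000000000000000000000000",  # smart contract that minted the token
    "token_id": 42,  # the unique identification code within that contract
    "owner": "0x1111111111111111111111111111111111111111",  # wallet recorded on-chain as the current owner
    "token_uri": "ipfs://example-cid/42.json",  # points to the metadata file describing the asset
}

# Typical metadata referenced by token_uri, following the optional
# ERC-721 metadata convention of name / description / image fields.
metadata = {
    "name": "Example AI-Generated Artwork #42",
    "description": "Artwork generated by an AI tool from a text prompt.",
    "image": "ipfs://example-cid/42.png",
}

print(json.dumps({"record": nft_record, "metadata": metadata}, indent=2))

The point for present purposes is that the token itself records provenance and ownership, while the artwork and its description typically sit in the linked metadata, which is where any AI-generated content lives.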

One example of an AI tool used to generate NFTs is StarryAI, which generates NFT art from word prompts provided to it. Such application of AI to NFTs raises issues around intellectual property rights, namely who owns those rights, particularly where the AI software has greater involvement in generating the NFT artwork.

Current intellectual property laws do provide for NFTs. For example, NFTs are now covered in the 12th Edition of the Nice Classification, introduced in Class 9 as “downloadable digital files authenticated by NFTs”. However, the copyright ownership regime for NFTs generated by AI remains unclear. Would ownership lie with the author of the text prompts or with the developer of the AI tool? It is also questionable whether the human authorship requirement mentioned above would resolve this uncertainty.

Key takeaways

The main issues around generative AI include, but are not limited to, intellectual property rights, liability and confidentiality obligations.

  • The intellectual property rights protection regime differs across jurisdictions, so consideration needs to be given to any conflicting protection regimes. This is compounded by the fact that there is no comprehensive international treaty giving uniformity to the intellectual property regime for such emerging technologies. For the reasons set out above, the position on ownership of intellectual property rights in generative AI output remains unclear and clarification is needed. Indeed, the UK Budget 2023 (see our article on this here) states that the government will work with the UKIPO to clarify the rules around intellectual property rights. This is much welcomed, as clarity around intellectual property rights for generative AI will incentivise innovation in AI.

  • Confidentiality obligations may also be an issue when using generative AI. Prompts entered into an LLM such as ChatGPT may be retained and used to improve the model, so care needs to be taken not to breach any confidentiality obligations by including confidential information in a prompt. There is an option to “opt out” of having conversation data used in this way.

  • The law around generative AI is still unclear. While generative AI is proving to be of great use to society, it clearly carries risks. Hence the need for clear guidance and/or regulation to provide an accountability framework for these risks and to boost society’s confidence in adopting AI in everyday life.

If you have any questions or require advice, please reach out to Paddy Kelly or Carmen Yong in our Corporate & Commercial Department.