Artificial intelligence and copyright

Aug 13, 2025

Photo: Still from campaign video

The director of the Rights Alliance, Maria Fredenslund, participates in the Ministry of Culture’s expert group on copyright and artificial intelligence. Here, the Rights Alliance contributes to describing the challenges that artificial intelligence poses for copyright, as well as providing recommendations for possible solutions.

The expert group consists of selected representatives from rights-holder organizations, industry associations, and technical experts, who will work until the winter of 2024/2025.

Read the terms of reference for the expert group here

“If creative content is to be used for training artificial intelligence, this must—unlike the history of file-sharing services—be done with respect for the actors who enrich the world with art and valuable training data. This requires dispelling the myth that copyright hinders technological progress and individual freedom. The experience from combating illegal file sharing shows that this viewpoint only benefits those who profit from training AI services on unlawful content.”

Maria Fredenslund
Director of The Rights Alliance

The Rights Alliance works to protect copyright in connection with the development and use of artificial intelligence. Machines’ ability to act in a human-like manner depends on their access to human skills and inner life. For generative artificial intelligence to write flawlessly, produce realistic videos, and speak with empathy, the underlying AI model must be trained on large amounts of high-quality data.

This has led to a fundamental shift in the Rights Alliance’s efforts to protect Danish content. Literature, media content, music, film, and images have suddenly become key elements in the business models behind generative artificial intelligence, developed by the world’s leading tech companies.

The Rights Alliance has taken a leading role in ensuring that the development and use of artificial intelligence for creative content happens with respect for rights holders. We investigate and report the use of content in datasets that infringe on our members’ rights. We make visible how artificial intelligence challenges copyright. Furthermore, we focus politically on how artificial intelligence should be regulated so that rights holders’ content is protected, and the business foundation of the creative industries is preserved.

Below, you can read about our most important initiatives to ensure copyright protection against the training and use of artificial intelligence.


The Rights Alliance’s Main Focus Areas


1. Transparency in AI training

For rights holders to protect and enforce the copyright of their content, they need insight into whether their content has been used to train artificial intelligence.

That is why the Rights Alliance works to ensure sufficient transparency about which content AI services are trained on. This requires that policymakers, in both the Danish and the European context, oblige AI developers to document what their training data consists of and where the content was sourced.

Transparency about the origin and use of content was crucial when the Rights Alliance succeeded in having the Books3 training dataset (containing illegal copies of nearly 200,000 books) removed from several corners of the internet. However, rights holders’ increasing focus on transparency has led AI developers to become gradually less open, a trend we have also documented in a study of the transparency practices of various AI providers.


Read the Rights Alliance’s other recommendations for strengthening copyright in connection with the development and use of artificial intelligence here

2. The right to one’s own image, voice, and person

Creative performers increasingly experience that their face, voice, and identity are copied and misused in content generated with artificial intelligence. This includes, for example, actors, TV hosts, and musicians, who find that their personal characteristics – as well as their professional tools – are exploited without their consent to create new music, voice-overs, or fake images and videos that are spread across the internet.

The Rights Alliance has, among other things, uncovered how actor David Bateson’s iconic voice, best known from the video game Hitman, has been cloned and repeatedly shared on social media.

The misuse of personal characteristics in AI-generated content is a serious violation of the individual concerned. At the same time, the trend seriously undermines the foundation of creative performers’ professional livelihoods.

Codification of personality rights

The Rights Alliance works to ensure that personality rights are codified in the Copyright Act. The aim is to safeguard the individual’s right to their own image, voice, and other characteristics, and to ensure that online platforms are required to remove AI-generated content that violates a person’s personality rights.

In spring 2024, the Minister of Culture announced that the ministry will examine the need to introduce imitation protection into copyright law.


3. AI services must prevent users from uploading protected content

Providers of AI services should take responsibility for ensuring that the use of their services does not lead to copyright infringement. This means that AI services must comply with obligations under the EU’s Digital Services Act (DSA) in the same way as more traditional online platforms. Thanks to years of effort by rights holders, among others, online platforms such as Google and Meta now provide tools that can recognize and remove shared content that infringes copyright.

Similarly, AI service providers should be required to ensure that users of their services do not upload protected content without permission. The Rights Alliance is therefore working to ensure that authorities set clear requirements for AI services to implement safeguards in the form of content recognition tools that filter out illegal content.