
EU transparency obligations for AI providers have come into force: This is how rights holders are positioned

Aug 15, 2025 | Analysis, Artificial Intelligence


Rights holders have criticized the design of the code of practice for providers of AI models for general use and the transparency template for training data, which were published this summer. Here are the challenges we see for copyright enforcement.

Over the summer, the European Commission published the final version of the code of practice for providers of AI models for general use, as well as the final version of a transparency template that these providers must fill out with information about the training material they have used. Guidelines were also issued clarifying selected concepts in the AI Regulation itself. On August 2, the transparency obligations for providers of AI models for general use came into force.

The Danish Rights Alliance and other rights holder organizations have worked intensively to give rights holders a real opportunity to enforce their copyright. We have done this by participating in the working group on the code of practice and submitting comments on a draft transparency template issued by the EU AI Office at the beginning of the year.

Although these efforts have helped to tighten certain obligations, the final code of practice and template are far from what we wanted to ensure effective enforcement of copyright. We therefore agree with the criticism recently expressed by several rights organizations at the international level, as well as at the national level by our members Danske Forlag, Producentforeningen, and Koda, among others.

The transparency template provides limited enforcement options

While the idea behind the code of practice is that signatory AI providers can demonstrate their compliance with the AI Regulation, the hope was that the transparency template would give rights holders much-needed insight into AI providers’ training data, insight that is essential for enforcement. Here we summarize the main challenges we face in our work to enforce copyright.

Compliance with obligations expected in 2026 at the earliest

Although the copyright obligations in the AI Regulation have entered into force, we can expect concrete action from AI providers in about a year at the earliest. This is because the AI Office will not begin enforcing violations of the AI Regulation until August 2, 2026, and AI providers who have signed the code of practice have the same deadline to demonstrate compliance with the obligations. In addition, providers of AI models for general use that were placed on the European market before August 2, 2025, will only have to comply with the obligations from August 2, 2027.

History tells us that AI providers only comply with regulations when they are forced to do so through enforcement measures. We can already see that OpenAI has failed to publish training data for its latest GPT-5 model, even though the model was launched on the European market after August 2, 2025.

Insufficient transparency requirements for datasets, crawled domains, and illegal file-sharing services

According to the AI Regulation, providers of AI models for general use must prepare and publish a sufficiently detailed summary of the content used to train the model, in accordance with the AI Office’s template. Unfortunately, we must note that the template does not provide sufficient information for the effective exploitation and enforcement of copyright.

Below, we describe real-world examples where AI providers are not required to disclose sufficient information.

Where does this leave rights holders?

Due to the lack of intervention against AI providers until August 2, 2026, at the earliest, we do not expect any new developments from AI providers for the time being. Even after this date, rights holders will not have sufficient insight into whether their content is being used to train AI models for general use. This applies in particular to small and medium-sized rights holders and rights holders from smaller language areas such as Denmark. 

The low thresholds for data disclosure and the vague requirements for the content of the documentation mean that Danish rights holders will have limited insight into the use of their content. Our enforcement work therefore cannot be based on the limited information that AI providers are required to make available under the transparency template.

We and other rights holders therefore continue to face the daunting task of documenting which copyright-protected content is used to train AI models for general use, documentation that is crucial for effectively exploiting and enforcing those rights.

The Commission will regularly assess whether there is a need to revise the transparency template, including in light of practical experience and technological developments. The Commission may choose to revise the template before its enforcement powers take effect on August 2, 2026.

The Danish Rights Alliance will therefore continue to do what we can to highlight the significant challenges to effective copyright enforcement.