The rapid development of generative AI has caused grave concern in Europe’s creative sector. On Tuesday, Members of the European Parliament sounded the alarm, adopting a report that calls for tighter rules on how AI companies use copyrighted material.

The so-called own-initiative report, adopted by 460 MEPs with 71 voting against, contains a series of recommendations to protect copyrighted creative work from use by artificial intelligence. According to the report, the breakneck speed of technological advancement in AI “creates huge legal uncertainties for all parties involved”.

The report also took aim at the European Commission, warning that if the European legislator, "through its lengthy procedures and a lack of courage, continues to refuse to tackle the crucial issues head on, the EU and its actors will always be left at a disadvantage and further dependencies will be created".

The report insists that use of copyrighted material by generative AI must be fairly remunerated and that ways must be explored to allow compensation for past use—but not through a global licence for providers to train their systems in exchange for a flat-rate payment.

Rewriting the rules

MEPs also want AI providers to publish an itemised list of all copyrighted works used to train their models, along with detailed records of crawling activities. Failure to do so could be treated as copyright infringement, triggering legal consequences.

Among the other recommendations is that content fully generated by AI should not be protected by copyright, and that the European Union Intellectual Property Office (EUIPO) should maintain an opt-out register for rightsholders who wish to exclude their work from AI training.

However, some criticised the report, arguing it creates needless complexity when there is already a Copyright Directive and AI Act in place. In particular, the Copyright Directive includes a text-and-data-mining (TDM) exception that allows developers to train their models on publicly available material unless rightsholders specifically opt out.

Mixed reactions

"A right they actively exercise today," said Boniface de Champris, AI Policy Lead at CCIA Europe, referring to rightsholders' existing ability to opt out of text and data mining. He added that the report sends the wrong signal to innovators and could hold back Europe's digital competitiveness on the global stage.

“This risks generating fresh uncertainty, as EU rules already strike a careful balance. The report suggests requiring prior authorisation or broad licensing regimes, which would create new complexity and legal uncertainty,” Mr de Champris said.

MEP Axel Voss (EPP/DEU), the author of the report, said the text was all about clear rules. “Legal certainty would let AI developers know which content can be used and how licences can be obtained. On the other hand, rightsholders would be protected against unauthorised use of their content and receive remuneration. If we want to promote and develop AI in Europe while also protecting our creators, then these provisions are absolutely indispensable,” Mr Voss said.

His words were echoed by shadow rapporteur Tiemo Wölken (S&D/DEU). “This report is an important step towards establishing an AI licensing market that benefits rightsholders and AI companies. We must ensure that smaller creators, in particular, have the tools they need to defend their rights. Equally, European AI companies deserve legal certainty to make AI training in Europe viable,” Mr Wölken said.

The public interest case

Justus Dreyling, Policy Director of COMMUNIA, was less effusive, but nonetheless glad that the report “at least implicitly acknowledges that the existing TDM exceptions apply to AI training—earlier drafts had called this into question”.

Paul Keller, Director of Policy at Open Future, said the real challenge was not to replicate the TDM exemption “but to address what it leaves out—primarily fair remuneration and meaningful transparency. The adopted report moves on both of those fronts, though less decisively than the political moment requires.”

“Equally important is the positive framing of the space created for public-interest AI development under Article 3 of the Copyright Directive. Recommendation 4 calls on the Commission to ensure that activities conducted for scientific research or educational purposes—specifically by research organisations and cultural heritage institutions, and in the framework of non-commercial innovation—are not restricted. This is a meaningful acknowledgement that the report’s framework is not intended to sweep away existing protections for public-interest uses,” Mr Keller said.

Protect the news

Beyond the not-for-profit sector, the report also calls for added protection for news media.

The press and news media sector's work is regularly exploited by AI systems, the report says. News media outlets whose traffic and revenues are diverted by AI systems should be fully compensated, and they should also have the right to refuse the use of their content for training AI systems. MEPs insist that the aggregation of news content must preserve media pluralism and diversity of information, avoiding the selective processing of information or self-preferencing practices by gatekeepers that benefit their own AI services.

Mr Dreyling said he was disappointed with the final text. “The report contains several problematic provisions. In particular, it includes overly broad language on protecting press publishers’ and media content, which could be interpreted as excluding such content from being used for AI training and placing it above other copyrighted works,” he said.

Unsurprisingly, the Society of Audiovisual Authors welcomed the vote. "Today's vote confirms what Europe's screenwriters and directors have been saying for years: the current framework is failing them. GenAI companies have built billion-euro businesses on the works of audiovisual authors without asking, paying, or disclosing. The Parliament has now spoken with a clear majority. We call on the European Commission to swiftly introduce enforceable obligations that level the playing field," said SAA chair Barbara Hayes.

Geopolitical stakes

European sovereignty is once again front and centre. There are the usual concerns about the dominance of large US tech companies over the European creative sector—yet any new rules that hamper European AI development risk allowing those same companies to surge even further ahead, creating a difficult balancing act for Brussels.

“It is also not entirely clear what the European Commission should take away from this report. Own-initiative reports do not oblige the Commission to take legislative action, although they do carry political significance. At the beginning of this process, many expected the report to focus primarily on questions of remuneration. However, the final outcome doesn’t offer a particularly clear sense of direction in this regard,” Mr Dreyling concluded.