Report warns training methods for AI models are a “direct threat” to screen sector and sets out series of recommendations 

Generative AI companies have been training their models using 130,000 copyrighted film and TV scripts, largely without the permission of their creators and rightsholders, according to the British Film Institute (BFI). 

A BFI report titled AI in the Screen Sector: Perspectives and Paths Forward says the current “training paradigm” – where AI models are developed using copyrighted material without permission – poses “a direct threat to the economic foundations of the UK screen sector”. 

“As generative models learn the structure and language of screen storytelling – from text, images and video – they can then replicate those structures and create new outputs at a fraction of the cost and expense of the original works,” it says.  

“These learned capabilities can be used to assist human creatives, but AI tools may also be used to compete against the original creators whose work they were trained on.” 

Other sources of AI training data include YouTube videos and databases of pirated books, according to the report.  

As the government considers what legislation to put in place for its AI growth plan, creatives have been calling for an opt-in regime, which would force AI companies to seek permission and strike licensing deals before using their content.  

The report, which largely aims to outline how the UK creative sectors can thrive in the age of AI, offers a series of recommendations, which are outlined in full below.  

The recommendations include developing a robust licensing framework to address copyright concerns surrounding gen AI; future-proofing the creative workforce with more formal AI training; providing transparent disclosures to audiences when AI has been used in screen content; offering targeted financial support for the UK’s creative technology sector; and investing in accessible tools, training, and funding for independent creators, through the development of ethical AI products. 

Current industry adoption of AI includes the Charismatic consortium, backed by Channel 4 and Aardman Animations, which aims to create an AI prototype and publish research into how AI could support under-represented content creators and established producers to enhance storytelling in film and television. 

The BBC is piloting structured AI initiatives and the BFI National Archive is experimenting with AI for subtitling, metadata generation, and content classification. 

“AI has long been an established part of the screen sector’s creative toolkit,” said Rishi Coupland, the BFI’s director of research and innovation and co-author of the report.  

“Our report comes at a critical time and shows how generative AI presents an inflection point for the sector and, as a sector, we need to act quickly on a number of key strategic fronts. 

“Whilst it offers significant opportunities for the screen sector such as speeding up production workflows, democratising content creation and empowering new voices, it could also erode traditional business models, displace skilled workers, and undermine public trust in screen content.” 

The BFI recommendations in full 

Set the UK as a world-leading IP licensing market 
The UK is well-positioned to lead in this space, thanks to its ‘gold standard’ copyright regime, a vibrant creative technology ecosystem, and a coalition of creative organisations advocating for fair licensing practices… By formalising IP licensing for AI training and fostering partnerships between rightsholders and AI developers, the UK can protect creative value, incentivise innovation, and establish itself as a hub for ethical and commercially viable AI-supported content production. 

Embed data-driven guidelines to minimise carbon impact of AI 
Generative AI models, particularly large-scale ones, demand significant computational resources, resulting in high energy consumption and associated carbon emissions. Yet the environmental footprint of AI is often obscured from end users in the creative industries. Transparency is a critical first step to addressing AI’s environmental impact. 
With the screen sector in the vanguard of generative AI uses globally, it is ideally positioned to push the demand for carbon minimisation, and the UK screen sector should lead by example.

Support collaboration to deliver ethical AI products 
Generative AI tools must align with both industry needs and public values. Many models, tools and platforms have been developed without sufficient input from the screen sector (or, indeed, screen audiences), leading to functionality and outputs that are poorly suited to production workflows or that risk cultural homogenisation and ethical oversights. 
The UK should look to combine its strengths in AI and humanities research, and its reputation for merging technology and culture, to deliver responsible, ethical AI.

Enable more shared knowledge 
Across the UK screen sector, organisations, teams and individuals – especially SMEs and freelancers – lack access to structured intelligence on AI trends, risks, and opportunities. This absence of shared infrastructure for horizon scanning, knowledge exchange, and alignment limits the sector’s ability to respond cohesively to disruption. 
The BFI has proposed creating an ‘AI observatory’ and ‘tech demonstrator hub’ to address this urgent challenge… and provide hands-on experience of emerging tools and capabilities.

Develop the sector to build complementary skills 
Our research identifies a critical shortfall in AI training provision: AI education in the UK screen sector is currently more ‘informal’ than ‘formal’, and many workers – particularly freelancers – lack access to resources that would support them to develop skills complementary to AI. 
However, the UK is well-positioned to lead in AI upskilling due to its strong base of AI research institutions, a globally respected creative workforce, and a blending of technology and storytelling expertise. By helping workers transition into AI-augmented roles, the UK can future-proof its creative workforce and maintain its competitive edge in the global screen economy.

Drive increased public understanding of AI use in screen content
Transparency will drive audience trust in the age of generative AI… National institutions such as the BBC are already experimenting with fine-tuning AI models to reflect their editorial standards, and the BFI is deploying AI in archival work with a focus on ethical and transparent practices.  
These efforts demonstrate the UK’s capacity to lead in setting audience-facing standards and educating the public about generative AI’s new and developing role in content creation.

Unlock investment to propel the UK’s high-potential creative technology sector
There is a compelling opportunity and a pressing need for targeted financial support for the UK’s creative technology sector… the House of Lords has identified a “technology scaleup problem” in the UK, with limited access to growth capital, poor infrastructure, and a culture of risk aversion acting as barriers to expansion.

Empower UK creatives to develop AI-supported creativity
Generative AI is lowering traditional barriers to entry in the UK screen sector – enabling individuals and small teams to realise ambitious creative visions without the need for large budgets or studio backing. 
By investing in accessible tools, training, and funding for independent creators, and developing market-preferred, ethical AI products, the UK can foster a more inclusive and dynamic creative economy where AI enhances, rather than replaces, human imagination.


The report is published by the BFI as part of its role within the CoSTAR Foresight Lab. CoSTAR is the UK’s first national lab for creative industries’ research and development, funded by the government-backed UK Research and Innovation’s Infrastructure Fund. 

It is authored by Angus Finney, Brian Tarran and Coupland and draws on published reports and research, responses to public consultations, surveys of screen sector organisations and creative technologists, and interviews with key stakeholders. 