
UK Tech Adviser Claimed AI Companies Will Never Legally Have to Compensate Creatives


A special adviser to the UK’s Technology Secretary has claimed that artificial intelligence companies “will never legally have to” compensate rights holders for the training data used in their models.

Kirsty Innes posted the comments on X in February but has since deleted them. Other deleted posts from Innes, as reported by The Guardian, stated that the scraping of intellectual property “can continue to happen outside the UK, whatever our laws say.” She also admitted that this “might be a bitter pill to swallow for some.”

The adviser to Liz Kendall, Secretary of State for Science, Innovation, and Technology, made these posts seven months before taking the position. Innes previously worked for the Tony Blair Institute, a think tank focused on policy and governance reform, which has received hundreds of millions of pounds in donations from the co-founder of enterprise tech giant Oracle.

Oracle has recently inked a $300 billion partnership with OpenAI and is also collaborating on the US AI infrastructure project Stargate. OpenAI has been vocal in its disapproval of the UK government’s proposal to allow creators to opt out of having their IP used as AI training data.

Innes and Kendall declined The Guardian’s requests for comment.

UK government is still making up its mind on how to appease creatives

The UK government is currently reviewing the results of a consultation examining potential ways to balance the protection of rights holders with the advancement of AI innovation. One of the proposals discussed was to allow AI developers to train their models on creators’ online content by default unless rights holders explicitly opt out.

Bodies representing the creative industries have largely rejected this proposal, as it puts the onus on creators to exclude their content rather than requiring AI developers to seek consent. Tech companies say it would complicate identifying legally usable content for commercial AI training and that they’d rather have unrestricted access to all of it. Policy experts argue that allowing some creators to opt out would lead to biased models.

Meta’s former global affairs chief and former UK Deputy Prime Minister Nick Clegg said it was “implausible” to seek permission from every artist given the vast scale of data used to train AI models. He added that doing so when no other country does would “kill” the UK’s AI industry. US President Donald Trump has also called it “impractical.”

Back in May, sources told The Guardian that former Technology Secretary Peter Kyle had moved away from the idea of an opt-out system and was leaning more toward encouraging licensing agreements between AI companies and creators. The government is wary of making laws too restrictive for tech companies, as it still wants to attract their investment; therefore, it is primarily discussing the matter with representatives from US AI companies.

Labour’s stance seems to lean toward growth and innovation rather than protecting artists

Last week, ahead of Trump’s visit to the UK, more than 70 artists and organisations — including Sir Elton John, Kate Bush, Sir Mick Jagger, and Sir Paul McCartney, a vocal campaigner for artists’ rights — signed an open letter accusing the Labour government of failing to uphold international and UK human rights law in protecting copyright from AI misuse.

The government’s approach to AI appears to be more focused on economic growth than on mitigating risks. In January, Prime Minister Keir Starmer released the AI Opportunities Action Plan, which put innovation front and centre. The plan proposed creating a “copyright-cleared British media asset training dataset,” but many in the creative sector were disappointed by its silence on protections for rights holders.

In June, Parliament passed a data bill without amendments that would have required AI companies to disclose the copyrighted material used to train their models, effectively allowing such training to happen without rights holders’ knowledge. Members of the House of Commons argued that requiring transparency would discourage companies from developing and releasing AI products in the UK, claiming disclosure requirements would create undue burdens and expose proprietary data sources.

Disagreements over how to resolve the fundamental tension between innovation and creative rights are not exclusive to the UK. Anthropic, Meta, Perplexity, Stability AI, Midjourney, and OpenAI (many, many, many times) — as well as Microsoft — are among the AI developers that have faced legal action from artists, news outlets, and musicians for using their work without consent.

Meanwhile, companies like Microsoft and Cloudflare are looking to put the power back in creators’ hands by offering solutions that allow them to sell their content on a per-use basis.
