July 24, 2025

Whose voice is it anyway? Navigating performer rights in the age of AI


Recently, there have been increasing reports of artists discovering that their performances have been manipulated using AI, without their knowledge, to generate “synthetic media”. This is proving to be a challenge not just for the performers, but also for businesses that are unsure whether they have the necessary rights to use content created for them.

A notable example occurred earlier this year, when Scottish voiceover artist Gayanne Potter alleged that old voice recordings she had made for a Swedish company called ReadSpeaker had been used without her knowledge to create the voice of ScotRail’s AI announcer ‘Iona’. It is not clear whether ReadSpeaker was actually at fault or had done anything unlawful. However, the incident did prompt Scottish First Minister John Swinney to announce that ScotRail was “fixing” the issue.

This article discusses performers’ legal position under UK law, rights of personality, and the risks of synthetic media.

What the law currently says (and doesn’t)

In the UK, individuals have no standalone right of personality. Anyone wishing to claim that someone has used their image or likeness without consent would have to rely on a patchwork of legal rights, including copyright, privacy rights, passing off, defamation, and contract law.

A performer who wanted to bring a claim against a third party who used their performance to create synthetic media would likely need to rely on one of the following:

• Privacy and misuse of private information. A performer could potentially bring a privacy claim based on the tort of misuse of private information, particularly if the synthetic media was created in a way which intruded on the performer’s private life.

• Defamation. If the synthetic media portrays the performer in a damaging light which harms their reputation (e.g. deepfakes which show celebrities endorsing controversial political causes), a defamation claim may be possible.

• Data protection law. If a performer’s image or voice is used in the creation of synthetic media, this may be considered to be personal data. Under the UK General Data Protection Regulation (UK GDPR), performers have certain rights to object to the processing of their personal data; to have their personal data erased; and to make a claim in court for compensation for damage caused by unlawful processing. There has been some discussion around whether a performance might be deemed to be “special category data” under the UK GDPR on the grounds that it is biometric data. Special categories of personal data are afforded a greater level of protection under the UK GDPR, which would be helpful to performers. In particular, a data controller would typically require the informed consent of the data subject in order to process special category data. However, the ICO makes it clear that although a voice recording or an image of an individual’s face may be biometric data, not all biometric data is special category data: biometric data will only be deemed to be special category data if it is used for the purpose of uniquely identifying an individual. This is not necessarily the case when it comes to the use of performances in synthetic media.

• Passing off. Public figures, actors, celebrities, influencers, and other well-known figures may be able to rely on passing off. To show that passing off has occurred, the performer would need to show that they had goodwill in their likeness, image or voice (so they would need to be well-known), and there would need to have been a misrepresentation which led the public to believe there was a connection between the performer and a particular product or service.

• Performer rights. It may be possible to rely on performers’ rights under the Copyright, Designs and Patents Act 1988, which require a performer’s consent to the recording and copying of their performances.

• Copyright law. If copyright in the performance is owned by the artist or a third party, and the synthetic media was created using a substantial part of the copyright work without permission, this may infringe the rightsholder’s copyright.

None of these options provides performers with reliable, standalone means of protecting their performance. There is very little case law in the UK relating to these issues, and any remedy would likely need to be pursued through the courts – something which many performers are naturally reluctant to do.

Post-mortem rights

In the UK, it is even more challenging for the family or estate of a deceased performer to bring a successful claim to prevent use of their voice or image. Data protection rights only apply to living individuals, and it is not possible for the estate or family of a deceased person to bring a defamation claim on behalf of someone who has died.

Could we see a new era for standalone personality rights in the UK?

The idea has been mooted, but it seems unlikely that this would happen any time soon. The UK Government’s AI and Copyright Consultation (the “Copyright Consultation”), which closed in February, briefly discussed whether a new “right of personality” should be introduced in the UK to better protect an individual’s voice or image. This would presumably have some similarity to the rights of personality that currently exist in certain US states. However, at the moment, the issue has only been discussed at a high level, with no concrete legislative proposal as to what this might look like or how it might work in practice.

There would be some significant challenges with this approach – for example, defining the scope of ownership of those types of rights, as well as the risk of overlap with other existing intellectual property rights. Perhaps most challenging for legislators is how to foster a pro-AI culture of innovation whilst at the same time balancing individuals’ rights. Any legislative proposals would need to be aligned with the government’s plan to “shape the AI revolution”, as set out in its AI Opportunities Action Plan, published earlier this year.

The Copyright Consultation did not dismiss the concept of a standalone personality right, stating that: “The legal landscape here is complex and extends beyond intellectual property rights. The introduction of a new type of intellectual property protection for personality in the UK would be a significant step and requires careful consideration. However, the government recognises that some individuals wish to have greater control over whether content can be created that includes their personality or likeness and takes their concerns seriously.”

Synthetic media: handle with care

The uncertain legal landscape does nothing to help businesses, which may not even know whether content created for them is synthetic. There is a lack of clarity around whether (and, if so, how) businesses should routinely check that the necessary rights have been obtained from the performer in relation to synthetic media created by their own employees, or by third-party agencies they use to create content for them.

For international businesses, the challenge is even greater, due to the inconsistent nature of rights in different jurisdictions. For example, the US states of California and Tennessee both have a form of protection for rights of personality, as do France, Germany, China and India. What might be legally compliant in the UK may not be in other countries.

As well as being unclear from a legal perspective, there is also a reputational risk, as was demonstrated by the Gayanne Potter incident, which attracted significant negative media attention for ScotRail.

Looking ahead

What’s clear is that the law has not kept pace with the technological leap forward brought by AI. Legislative action is ultimately likely to be needed to address these issues. In the meantime, these are the key points:

• Performers need to be alive to this issue. They should review contracts and scopes of work carefully, and look to include provisions that expressly deal with the use of AI applied to their recordings and videos. If use of a performance to create synthetic media is included, they should consider whether the fees reflect this.

• If you are a business engaging a supplier to provide content, you need to be aware that the legal position is evolving. Make sure your contracts are robust and include appropriate warranties covering the use of synthetic media. Even if you were unaware that AI had been used in content created for you (e.g. because it was created by a creative agency), it will be your name and reputation in the spotlight.

• If you are a creative agency providing content, it’s important to understand the risks involved in using synthetic media. If you are intending to use a performance to create synthetic media using AI, make sure your contracts with performers and artists clearly articulate this, so that everyone knows where they stand. Make sure your end clients understand that you are intending to use AI to create synthetic media and be prepared that you may be asked to give warranties around its use.

• Be aware that the law in this area is rapidly evolving. Follow our legal updates to stay informed.

Meet the author

Contact Mike if you have any questions about performer rights in the age of AI.
