Digital replica expert sounds alarm on voice artists being ripped off by AI companies
Warning follows case of Gayanne Potter who is now competing with her AI clone
A LEADING EXPERT in digital replicas says creatives need to demand clauses in contracts that prevent their recorded performances from being used for AI model training and voice cloning. Dr Mathilde Pavis, legal adviser to performing arts trade union Equity, said she’d noticed a “troubling pattern” emerge among voiceover artists who’d agreed to lend their voices to AI projects only to later find they had been “cloned, commercialised and used far beyond what they agreed to”.
Mathilde is speaking out after several voiceover artists came forward to share their experiences of being digitally cloned without giving their full consent. They include Gayanne Potter, one of the creative industry’s most prolific and versatile voice actors, whose performances introduce news bulletins and grace commercials broadcast several times a day across the UK.
During the COVID pandemic, Gayanne worked on a project with ReadSpeaker, a Swedish text-to-speech specialist. Gayanne believed her voice was being recorded to help people with visual impairments, but recently discovered a cloned version was being used by an AI-generated announcer on Scotland’s train network, ScotRail, without her consent. “That left me feeling distressed, violated and very angry,” Gayanne told Charting Gen AI.
“As a creative industry we have been lured and misled into working with companies like this under the guise of ‘accessibility tools for people with visual impairments’. It’s disgusting.”
Gayanne turned to Equity for help. Liam Budd, Equity’s industrial official for recorded media, told Charting: “It is extremely exploitative for companies to use and commercialise voice recordings to create digital replicas of artists from contracts which pre-date the development of generative AI or were not drafted explicitly for this purpose.
“Gayanne is directly competing in a marketplace with a low-quality clone of her own voice that she claims was developed without her informed and explicit consent. Not only is this distressing for her, but it would also represent an infringement of data protection and other rights. The union is exploring all avenues that we can take to protect our members from the misuse of AI in relation to past, present and future contracts.”
ReadSpeaker marketing chief Roy Lindemann last month told Sky News that Gayanne’s concerns over the sale of her voice had been “comprehensively addressed” with her legal representative.
That legal representative is Mathilde, who told Charting: “Gayanne is the canary in the coalmine, and her story is no longer rare. Creators are being brought into projects under vague terms, with little transparency and no real understanding of how their voices or likenesses will be cloned, stored, and used.
“Being told after the fact that a contract gives someone the right to use your digital clone in perpetuity, without your fully informed consent, is not just unethical. It breaks trust and, in many cases, breaches data protection laws like GDPR.”
Mathilde said Gayanne’s case “should be a wake-up call to every creator, freelancer, or media worker signing contracts today”. “Always ask for an AI clause in your contracts explicitly excluding use of your recordings for training models or voice cloning. Don’t assume good intentions protect you. Only clear legal language does,” added Mathilde, founder of digital replica advisory Replique.
Gayanne’s advice to fellow voiceover artists is “don’t be silent”. “Speak to a lawyer, speak to Dr Mathilde Pavis. There are things that can be done. Companies like ReadSpeaker must be held accountable. If someone asks you to stop using their biometric data then the right thing to do is to stop. That’s the honest and ethical thing to do.”
She added: “We will not rest until this is resolved.”
◼️Have you had a similar experience? Let us know in the comments below.