


UK actors and Equity are battling the rise of AI-driven deepfake mods

Where performers' voices are used without permission.

Image: Jane Perry and David Menkin headshots | Image credit: Luce Newman-Williams / Michael Shelford

"Is my voice out there in the modding and voice-changing community? Yes. Do I have any control over what my voice can be used for? Not at the moment."

David Menkin most recently starred in Final Fantasy 16 as Barnabas Tharmr, following work in Lego Star Wars, Valorant, Assassin's Creed and more. And like many other voice actors in the games industry, he's concerned about the increasing use of AI.

"AI is a tool, and like any tool, it requires regulation," Menkin says. "We simply want to know what our voices will be used for and be able to grant or withdraw consent if that use is outside the scope of our original contract. And if our work is making your AI tool money, we want to be fairly remunerated."

That certainly seems a fair request. But many believe regulations around AI are far behind the speed at which the technology is progressing - something the UK actors' union Equity is now making moves to change.


The use of AI in game development is already a huge topic - as Eurogamer's Chris Tapsell reported from GDC, particularly in the art and script-writing spaces. For voice actors, it presents a range of problems - including the worrying trend of actors' voices being used in deepfakes without consent.

A tweet earlier this month from Skyrim modder Robbie92_ highlighted the issue of actors having their voices used by modding communities for pornographic content. "As a member of the Skyrim modding scene, I am deeply concerned at the practice of using AI voice cloning to create and distribute non-consensual deepfake pornographic content," they wrote, including a list of affected performers.

"Their performances have been nefariously repurposed, fed into AI cloning tools, and used to create deepfake voice clips of these performers reciting explicit dialogue that is pornographic in nature. I believe the creation and distribution of deepfake pornography is unabashedly evil, and that we as a community have a responsibility to act."

Some of these voice actors responded incredulously to the tweet, seemingly without knowledge their voices had been used. "What the shit?!" replied Mortal Kombat voice actor Richard Epcar.

The tweet specifically calls out ElevenLabs, an AI cloning tool whose terms of service require that users either created and own the files they upload, or have the written consent of every identifiable individual person in those files. ElevenLabs did not respond to Eurogamer's request for comment.

Clearly, consent is not always given.

Mods with AI-cloned voices are then hosted on the popular mod site Nexus Mods. When Eurogamer contacted Nexus Mods, we were directed to an article from April detailing the site's stance on AI-generated content in modding.

"If you as a content creator decide to include AI-generated content in your modding projects it doesn't break any of our community rules," it reads. "However, if we receive a credible complaint from a party who feels your work is damaging to them - this could be the voice actor or the body which owns the rights to the character/assets - we will remove the content if appropriate.

"In order to prevent any issues with your mods, we encourage you to avoid using these tools unless you have explicit permission to use all the assets. This is particularly true for AI-generated voice acting but also covers images, models, code and anything else that you can use this technology to create."

But this leaves the onus on actors themselves to dispute the use of their voices in unlicensed game mods - there is, at present, no wider regulation.

This is where Equity has stepped in. Last month, the union released its "ground-breaking AI toolkit to protect performers from a surge in unregulated technology".

The toolkit, created in partnership with intellectual property expert Dr Mathilde Pavis, sets out an "ethical use of AI" and provides templates for artists to enforce their legal rights.

Further, Equity has called for urgent government action against what it calls "performance cloning", defined as: "the process of creating a synthetic performance by recording, using or reproducing the performance, voice or likeness of an Artist by machine learning systems and equivalent technology".


"With use of AI on the rise across the entertainment industries, Equity is taking action and giving our members the tools they need to safeguard their legal rights," said Equity's industrial official for new media Liam Budd.

"We are proud to be leading the way by producing a ground-breaking template AI contract and setting out new industry standards. Whilst Equity will continue to engage with producers across the entertainment industries, the government needs to step in with robust measures to properly regulate the growing use of AI."

Back in March 2023, the Intellectual Property Office committed to producing a code of practice for generative AI by this summer. The government is also consulting on a pro-innovation approach to AI regulation, as outlined in a recent white paper.

For Equity, the Vision Statement in its toolkit sets out new industry standards, stating artists have the right to:

  1. consent (and not consent) for past, current and future performances
  2. licence their performance or likeness on a personal, non-exclusive, time-limited basis
  3. be identified and object to derogatory treatment of their performance and work
  4. transparent information
  5. meaningful consultation
  6. fair and proportionate remuneration
  7. equal access and treatment
  8. be engaged under a collectively bargained agreement

Menkin is part of a team of actors working with Equity to raise awareness in this area.

"Last year, we approached UKIE with clear examples as to why we need industry-wide regulation, updated copyright laws, a clearer intellectual property framework and codes of practice, but the response has unfortunately been lacking," he says.

Since then, motions to put more focus on games and AI were put forward at this year's Equity Conference and passed with majority votes.

"The work is being done, but legislation takes time," Menkin says. "In the meantime, we need to get educated and that's why Equity released the AI Toolkit, with guides on understanding the AI landscape and our rights, as well as templates for contracts when the language from our clients and employers isn't clear enough."

Similar initiatives are also launching around the world, including in America with SAG-AFTRA and the National Association of Voice Actors (NAVA).

Menkin is adamant about increased regulation for the use of AI in gaming.


"Games and AI believe in self-regulation," he says. "Companies often speak about diversity drives and codes of practice, but self-regulation rarely benefits those on the lower rungs of the ladder, especially sub-contractors like actors, writers, composers, etc. We are, more often than not, told that there is no option but to sign away all of our rights without any clear idea what exactly our work will be used for.

"We can no longer accept those terms, in the age of AI."

BAFTA award-winning voice artist Jane Perry, famed for her work in Returnal and Hitman, similarly believes a governing body is required to oversee the use of AI.

"I am aware of the support on offer via Equity. I think they are doing the best they can, within the legislative framework that exists here in the UK. We are not able to call for industrial action, as our American counterparts are," she says.

"I feel that whilst we are offered directives on what to watch out for in the contracts we sign, and are offered some guidance as to copyright laws etc, I feel what is missing, and this is crucial, is some sort of governing body that can effectively oversee the use of AI and to monitor its use."

She adds: "The use of AI has moved faster than our ability to create boundaries around it. Given the lack of resources at British Equity, there is no way they can operate as this governing body. It would be great for our government to act fast and provide assistance with this."

Perry remains concerned about the future of AI and what it will mean for voice actors. Part of her experience in gaming is interacting with the industry's passionate community - a connection that would become obsolete should AI voice acting take off.

"As an actor, it's a pleasure to answer these questions and address what seems to be a very compelling interest in what goes on behind the scenes," she says.


"I often wonder, when the time comes that 'my' characters are voiced by an AI generated version of my voice, who will answer these questions? The characters' choices and expression of hopes, fears and desires will not be informed by my own lived experience. I will not have the opportunity to search in my own heart and soul for how to play a scene, or how to set up and respond to the relationships established in the game's narrative.

"I will have had seemingly no input into character creation. And yet, it will be my voice, the sound of which resonates and chimes in relation to all that has happened to me over the course of my whole life. The sound of a voice has a biography in it, of which the listener may or may not be aware. And yet, it won't be me."

She adds that AI is "not all bad" and is suited to certain other jobs and industries. "But we have to question this desire to eat our own tail," she says.

"I get that it's fun and exciting to play with AI and to see how far we can get in terms of playing around with people's voices and faces. But let's be present to the hard fact that taking work away from humans will cause all manner of problems: financial issues being one of them, and having a sense of purpose being another. We are seeing this already. In the meantime, profit margins increase for already bloated corporations."

There is more work to be done, then. But Equity - and other unions - are beginning to appreciate the real danger to actors that AI represents. Yet until proper regulation is enforced, responsibility sadly lies with actors to protect themselves.
