Why the SAG-AFTRA contract might let AI kill voice acting


When SAG-AFTRA members cheered their acting union’s contract win, Jesse Inocalla’s voice didn’t rise in celebration with the rest. 

Despite the historic agreement, wrested from Hollywood studios after a 118-day strike, Inocalla was more worried than ever: to his eyes, a contractual time bomb was buried in that achievement. 

“Not to be too fearmongering about it, but there are definitely huge swaths of the industry that will be affected by this,” he said.

Inocalla was referencing one of the most hotly contested aspects of the contract: Generative artificial intelligence (AI), the technology behind everything from ChatGPT to the deepfake song Heart on My Sleeve with computer-generated voices of Drake and the Weeknd. It uses machine learning on vast amounts of data to produce high-quality original text, graphics, sounds, images and videos.

And while it has disrupted multiple art industries like music, visual arts and literature in the past two years, Inocalla says generative AI poses a special threat to performers like him.

“For a lot of voice actors, (generative) AI has been the canary in the coal mine that we’ve been shouting about for years,” he said.

WATCH | Deepfake Drake and The Weeknd song shocks music experts: 


A song said to be by Drake and The Weeknd racked up more than 600,000 streams on Spotify alone before one of the labels had it pulled. Claims that the music was generated by artificial intelligence are creating waves in the music industry.

Because of how voice actors are positioned in the industry, he says they’re the most at risk of being taken advantage of by movie studios using the contract’s new language. 

“A lot of companies are looking to make the bottom line, and if they can spend $100 on licensing Male Voice No. 3 off of (text-to-speech platform) ElevenLabs instead of paying $500 to a living, breathing voice actor … then they’re going to do that,” said Inocalla.

Voice clones and contracts

Those fears come from two places: practices already seen in the entertainment industry before the agreement was reached, and the language of the contract itself. 

Inocalla says it has already affected him.

While there have been rumblings of live-action actors being fully recreated to appear in productions (like Magic City Films’ never-realized 2019 plan to digitally resurrect James Dean for a role), deepfakes in general have popped up with increasing frequency, and it is still easier to recreate a voice than a person’s entire physical likeness. 

For that reason voice actors, whose jobs consist of creating thousands of publicly available examples of their performances, are often targeted by generative AI. Inocalla says he found a cloned version of his voice from an episode of My Little Pony on a fan website, while the cloned voices of actors from SpongeBob SquarePants have proliferated enough to create their own AI-based rap battle subgenre.

While those clones were not made by studios, Montreal voice actor Tod Fennell says they prove how easy it is to create artificial voice actors — and how little audiences protest the results. 

“As a voice actor, … you used to feel competition from other voice actors and you all kind of want to get better. Now I’m literally feeling the push from AI,” he said.

And because voice actors are much less likely to gain widespread recognition than actors whose faces are regularly seen, Fennell said, they are easier for studios to replace without risking backlash. 

“We’re trying to get to the next level and get really, really good so that the audience will hear the difference,” he said.

Fennell says that leads to worries some voice actors have with the contract itself. SAG-AFTRA’s tentative agreement with the Alliance of Motion Picture and Television Producers (AMPTP) breaks down the way studios can use generative AI into different categories. 

Contract outlines different AI categories

The three that could most directly affect them are employment-based digital replicas, independently created digital replicas and synthetic performers. 

The first is a digital likeness created from an actor already working on a project: a studio hires an actor to perform most of their scenes, then has a sort of computerized stunt double act in additional scenes for them. 

The second is a digital actor based on a performer who wasn’t actually hired for the project. Studios could use that to, for example, create an AI version of Bugs Bunny voice actor Mel Blanc to continue voicing Looney Tunes characters long after his death. While studios would need to bargain with his estate or the union to do that, Inocalla noted Canadian actor Eric Bauza — who took over many of Blanc’s roles in recent years — might never have had that opportunity had the technology been available. 

WATCH | Eric Bauza is the Canadian voice behind Bugs Bunny


Eric Bauza grew up in the Toronto suburb of Scarborough, Ont., but lives in Los Angeles, where he is the voice of Bugs Bunny, Daffy Duck, Tweety Bird and Marvin the Martian for Looney Tunes Cartoons. Bauza has also found a way to keep connected to his Canadian roots while living in the U.S.

The third category is much what it sounds like: fully synthetic performers, not made to resemble any specific person. 

Michael Duboff, an entertainment lawyer at Edwards Creative Law, said he and his colleagues have been preparing for a predicted onslaught of legal fallout caused by these rules. While the language puts up guardrails against generative-AI abuse that weren’t previously there, much of the contract is propped up by language instructing the two parties to “acknowledge the importance of human performance in motion pictures” and to act in “good faith” toward one another — requiring studios to, essentially, promise to behave nicely.

“How do you actually implement that, and protect that? Acknowledging something doesn’t really mean much at the end of the day if there isn’t a resulting action that comes from that,” Duboff said. “So I have no doubt that there will be more battles to be fought from this.”

And those battles could very well start around voice acting. While the contract states studios need to get consent from performers before creating digital likenesses of them — a requirement that Fennell and Inocalla say will have little effect in an industry where thousands of people audition for a single job — there are other loopholes. 

A summary version of the contract states that an actor’s consent isn’t needed when using generative AI to change “the voice of the performer to a foreign language.” Even beyond the risk of synthetic performers taking roles in animated movies, Fennell and Inocalla say that provision threatens a huge avenue of work for voice actors: dubbing. 

Renée Desjardins, an associate professor at Université de Saint-Boniface and a translation researcher, says that’s an even bigger possibility now that translated media is booming in the wake of hits like Squid Game and Parasite. 

She said the possibility of studios relying on some form of artificial intelligence instead of human voice actors would follow a trend translators have been observing for decades. 

WATCH | South Korean entertainment is huge in North America. Here’s how it happened: 


From Parasite to Squid Game, South Korean content is dominating the global media landscape. With Netflix announcing a massive $2.5B investment in South Korea, Andrew Chang discusses how the country has developed its media sector, and how it can be used as a diplomatic tool.

“There’s something a little paternalistic or infantilizing to suppose that AI will always be better and is what the audience wants,” she said, adding it’s a general trend professional translators have contended with since computer-based machine translation research began shortly after the Second World War.

“Big tech seems to think that translation is always a problem to solve, but never consults — or rarely consults — the language industry and the end users.”

Inocalla and Fennell say they are both afraid of that possibility. And while this contract does not specifically cover animated TV productions and video games — those agreements will be negotiated separately at a later time — similar wording there could represent further hurdles for them. 

And the more that acting roles are taken by AI, they both noted, the less likely any future strikes will be able to achieve what this one did.
