AI voice generators spark discussion about voice ownership


As AI speech generators grow in popularity, so do the concerns surrounding the technology.

Recently, there’s been an influx of videos starring prominent figures such as Joe Biden, Donald Trump and Joe Rogan, sometimes all together in conversation, doing things like playing video games. The audio for these videos is created on websites hosting text-to-speech programs, with the voices generated by AI software. These programs are fed audio clips of a person, analyze them to find patterns in that person’s speech, and build a model that mimics that person’s voice.

While these programs and the videos they enable have been popular, concern is rising about the ethics of the technology and the problems it creates.

On Feb. 10, actor Steve Blum tweeted that he had not given permission for these programs to copy his likeness and that he did not approve of the practice.

“Hey friends, I know AI technology is exciting, but if you see my voice or any of the characters that I voice offered on any of those sites, please know that I have not given my permission, and never will,” he said. “This is highly unethical. We all appreciate your support. Thank you.”

His sentiments were echoed by other actors such as Cristina Vee Valenzuela, SungWon Cho and Jennifer Hale. While some sites such as Uberduck and Storyteller have agreed to blacklist actors uncomfortable with having their likenesses on their sites, these agreements do nothing to address users who may have downloaded those voices before the sites stopped hosting them.

Additionally, such agreements are based solely on the good faith of the hosting sites, as the actual laws surrounding this practice vary around the world. According to Forbes, while many courts in the United States treat the right of publicity as personal property, actors living in states where the courts do not treat it this way have little to no recourse if their voices are cloned by AI.

Beyond just memes, though, this technology has started to become a direct issue in the voice acting field. Italian voice actors are currently on strike in part due to the encroachment of AI on their voices, according to Rodolfo Bianchi, head of Italy’s dubbing directors’ organization (ADID).

“We are forced to sign contracts in which we give away the rights to the use of our voice,” Bianchi said. “In current agreements, this also involves the use of our voice for artificial intelligence purposes.”

While these concerns may seem relevant only to voice actors, the issue has ramifications outside of acting. The more people use these programs, the better they become at synthesising new voices.

This applies to both the quality of the voices and the effort needed to create them; already, the amount of audio input a program needs to recreate a voice has been reduced to just 30 seconds. And as the technology improves, it has a real chance to hurt people, as was the case with Regina resident Ruth Card.

In March 2022, Card received a phone call from what sounded like her grandson. The caller claimed that he was in jail after rear-ending another car, and that if he could pay for the damage he caused, he would be released.

He needed his grandmother to get the money for him, as his wallet and phone had been confiscated. As Card was withdrawing the money, a teller informed her that the caller was likely not her grandson, but a scammer using AI to mimic her grandson’s voice.

Though this new technology is exciting, it currently lacks the safeguards and regulations needed to prevent people from abusing it. These abuses range from making celebrity voices recite hate speech, to calling for a fake draft, to tricking citizens into giving away their money.

While vocal impressions have existed for far longer, these programs are not tied to any one person, meaning they can be misused again and again even as individual bad actors get caught. Like other forms of AI-generated art, this technology will likely create far more problems than it solves until these issues are addressed.
