Narrators call for laws to prevent losing their voice

Jennifer Dudley-Nicholson

Thousands of fiction and non-fiction works are being turned into audiobooks using synthetic voices.

If you have noticed something different in the pitch, tone, cadence or emotion of audiobook narration lately, perhaps that is because the person reading your novel is not a person at all. 

Thousands of fiction and non-fiction tomes are being turned into audiobooks for major platforms using synthetic voices created by tech giants such as Apple, Amazon and Google. 

Some of these virtual voices have even been designed to narrate specific literary genres, like ‘Madison’ who was created to read romance novels and ‘Mitchell’ who specialises in self-development books. 

More than 5000 Australians work as voice actors or voice-over artists on projects like audiobooks. (AP PHOTO)

But while experts say the artificially intelligent creations can make some books more accessible, others warn they have the potential to steal work from humans in literature and other fields. 

New laws, they say, are urgently needed to protect voice actors from having their voices used without their consent or compensation. 

Australian Association of Voice Actors president Simon Kennedy says more than 5000 Australians are employed as voice actors or voice-over artists for everything from TV and radio commercials to animated TV shows and audiobook narration. 

But he says those jobs are increasingly being put at risk by the unethical use of artificial intelligence technology. 

“There are instances right now of people who are either losing work to generic, AI-generated voices and some who are losing work to synthetic clones of their own voice which have been obtained without their consent,” he said.

“At the moment, there’s no law against it because no Australian owns their voice and that is where we see a gaping hole in regulation.”

While using a virtual voice rather than a human to narrate an audiobook could save creators a one-off fee, he says, it also puts at risk the work of audio engineers, casting agents, and studios. 

Mr Kennedy says two young Australian voice actors recently reported losing work through the use of AI, including one man whose video narration was cloned by a rival production, and another man whose employer used samples of his voice to create a digital copy.

Using a virtual voice to narrate a book for an audio experience has effects beyond cost savings. (Lukas Coch/AAP PHOTOS)

“The producer of (an animated show) had employed this actor to do character work for the series over a number of episodes then – when they worked out how to duplicate their voice using AI technology – they continued to create episodes without him in the loop,” he told AAP. 

“They used his voice, continued to use the character he had created, but they cut him out of the whole process.”

Mr Kennedy says the association is advising actors to read contracts carefully to avoid accidentally signing their voice rights away, but warns Australia also needs laws to prevent voice theft. 

Regulations, he says, should ensure artists must give explicit consent before their voice can be re-used, provide control over how their voice is used, and ensure employers compensate them for potential loss of work. 

The call is similar to a warning issued by Australian Council of Trade Unions assistant secretary Joseph Mitchell, who told the Senate’s inquiry into Adopting Artificial Intelligence that, without regulation, the technology could delete jobs.

“The theft of voice, body and movement is something acutely felt by creative workers,” he said. 

“The ownership of their creative and cultural capital is paramount and must be protected by law.”

But major tech platforms are increasingly accepting digital voices and even providing them for others to use. 

Both Amazon and its subsidiary Audible list thousands of audiobooks that have been narrated using AI software. 

Apple has created five digital voices for use in the creation of audiobooks. (EPA PHOTO)

The listings note “this title uses virtual voice narration” and potential buyers can try a five-minute sample to judge whether it meets their listening standards. 

Google also provides a long list of virtual narrators to read e-books aloud, including three with an Australian accent, while Apple created five digital voices for use in the creation of audiobooks, each designed to suit different literary genres. 

In a statement, the company said the virtual voices were designed as a complement to human voice-over artists and the company “remains committed to celebrating and showcasing the magic of human narration”.

Even Project Gutenberg, a volunteer effort to digitise public domain books, collaborated with Microsoft and the Massachusetts Institute of Technology to give digital voices to more than 5000 titles.  

University of NSW AI Institute chief scientist Toby Walsh says using this technology to narrate books is fraught with ethical and moral considerations, as some people benefit while others miss out. 

Toby Walsh says Australia needs laws to ensure audiences are told about what they are listening to. (Julian Smith/AAP PHOTOS)

“On the one hand, it’s a positive for people because you can now offer audiobooks in any possible language anyone could want and that’s going to improve accessibility and remove the Tower of Babel that has hindered the communication of ideas,” he said.

“On the other hand, if you were someone who used to be paid to voice those sort of things, I’m somewhat fearful that your income will completely disappear.”

Prof Walsh says some issues related to AI-created audiobooks will be resolved with digital watermarks in future so listeners can be confident about their origin but, in the meantime, Australia needs laws to ensure audiences are informed and jobs are protected.  

“We’re used to believing the things that we hear, believing the things that we see, and we are now in a world in which the things that we see and things we hear are no longer true,” he said.

“There are new harms that AI is bringing and we need regulation and legislation to deal with it.”

Former Facebook Australia chief executive Stephen Scheeler, who recently addressed Flight Centre’s roadshow on ethical AI use, says AI-narrated audiobooks are another example of using the technology to “experiment in real-time” without considering all the consequences. 

Mr Scheeler says while Australian consumers and businesses are proving cautious about the use of AI so far, legal protections will be needed to ensure computers do not overtake creative content. 

“We need to be overly protective of human creativity and ingenuity, even at the expense of creating amazing AI tools,” he said. 

“Whether it’s good for business, it’s bad for humanity if we allow too much licence to use human intellectual property willy-nilly.”