Amazon Alexa unveils new technology that can mimic voices, including those of the dead


Propped atop a bedside table during this week’s Amazon tech summit, an Echo Dot was asked to complete a task: “Alexa, can Grandma finish reading me ‘The Wizard of Oz’?”

Alexa’s typically cheery voice boomed from the kid-friendly, panda-themed smart speaker: “Okay!” Then, as the device began narrating a scene of the Cowardly Lion begging for courage, Alexa’s robotic chime was replaced by a more human-sounding reader.

“Instead of Alexa’s voice reading the book, it’s the kid’s grandmother’s voice,” Rohit Prasad, Alexa’s senior vice president and head scientist, explained enthusiastically on Wednesday during a keynote address in Las Vegas. (Amazon founder Jeff Bezos owns The Washington Post.)

The demo was the first glimpse of Alexa’s newest feature, which — though still in development — would allow the voice assistant to replicate people’s voices from short audio clips. The goal, Prasad said, is to build greater trust with users by infusing AI with “the human attributes of empathy and affect.”

The new feature can “make [loved ones’] memories last,” Prasad said. But while the prospect of hearing a dead relative’s voice may be deeply moving, it also raises a myriad of security and ethical concerns, experts said.

“I don’t feel our world is ready for user-friendly voice-cloning technology,” Rachel Tobac, CEO of San Francisco-based SocialProof Security, told The Washington Post. She added that such technology could be used to manipulate the public through fake audio or video clips.

“If a cybercriminal can easily and credibly replicate another person’s voice with a small voice sample, they can use that voice to impersonate other individuals,” added Tobac, a cybersecurity expert. “That bad actor can then trick others into believing they are the person they are impersonating, which can lead to fraud, data loss, account takeover and more.”


There is also the danger of blurring the lines between what is human and what is mechanical, said Tama Leaver, professor of internet studies at Curtin University in Australia.

“You’re not going to remember that you’re talking to the depths of Amazon … and its data-harvesting services if it’s speaking with the voice of your grandma or your grandpa or a lost loved one,” Leaver said.

“It’s a bit like an episode of ‘Black Mirror,’” Leaver said, referring to the science fiction series that depicts dystopian futures shaped by technology.


Leaver added that the new Alexa feature also raises questions about consent — particularly for people who never imagined their voice would be uttered by an automated personal assistant after they die.

“There’s a real slippery slope there of using deceased people’s data in a way that is both creepy on one hand, but deeply unethical on the other, because they’ve never considered those traces being used in that way,” Leaver said.

Having recently lost his grandfather, Leaver said he empathized with the “temptation” of wanting to hear a loved one’s voice. But the possibility, he said, opens a floodgate of implications that society may not be prepared to shoulder — for instance, who holds the rights to the small snippets people leave behind on the internet?

“If my grandfather had sent me 100 messages, should I have the right to feed that into the system? And if I do, who owns it? Does Amazon then own that recording?” he asked. “Have I given up the rights to my grandfather’s voice?”


Prasad did not address such details during Wednesday’s speech. He posited, however, that the ability to mimic voices was a product of “unquestionably living in the golden era of AI, where our dreams and science fiction are becoming a reality.”


Should Amazon’s demo become a real feature, Leaver said, people may need to start thinking about how their voices and likenesses could be used after they die.

“Do I have to think about in my will that I need to say, ‘My voice and my pictorial history on social media is the property of my children, and they can decide whether they want to reanimate that in chat with me or not?’” Leaver asked.

“That’s a weird thing to say now. But it’s probably a question we should have an answer to before Alexa starts talking like me tomorrow.”
