Amazon’s Alexa May Soon Be Able to Sound Like Your Dead Loved One
Amazon’s Alexa technology has progressed to the point where it can mimic the voice of any person, living or dead, from less than a minute of recorded speech. Now the world’s largest online retailer wants to give you the ability to hear your dead loved ones’ voices through Alexa.

The feature is still in development, and Amazon hasn’t said when it might launch publicly, but its preview comes at a moment when the cutting-edge capabilities of artificial intelligence are under close scrutiny. Reactions on Twitter ranged from "creepy" to "morbid" to "no," as many online expressed unease at a feature that brings a voice back from the dead.

Big Tech companies are increasingly studying AI's impact on society. Microsoft recently announced that it was restricting the use of software that mimics a person's voice, saying the capability could be weaponized by those seeking to impersonate speakers and deceive listeners. For some people, the Alexa feature might actually help, much as we look back at videos of departed loved ones, but it clearly comes with serious ethical issues.