Alexa: Bedtime story from the afterlife
At its re:MARS conference, Amazon introduced a capability for Alexa that supports voice deepfakes. A deceased grandmother, for example, could read a bedtime story to her grandchildren through an Alexa device.
Alexa, Amazon's virtual assistant, has been available since 2014. Besides the Echo series of smart speakers with built-in microphones, there are various devices with displays that include the digital assistant.
At re:MARS, Amazon presented a new feature that works from brief audio recordings: these short speech samples can be turned into synthesized audio output, which makes a deepfake of a deceased person's voice possible, for example.
According to Rohit Prasad, Amazon's Senior Vice President and Head Scientist for Alexa, this function could allow a departed grandmother, for example, to tell her grandchildren a bedtime story via Alexa.
To use the function, an Alexa device requires a roughly one-minute voice recording of the person. It is not yet known when the new capability will be available to all Alexa users.
For some time now, Alexa devices have offered expanded privacy options. Voice recordings can be deleted automatically once Alexa has carried out the spoken request, provided the user has enabled the corresponding option in Alexa's privacy settings. Text transcripts of Alexa requests can also be viewed and deleted online at any time via the Alexa privacy portal.
According to Amazon, text transcripts of voice requests are deleted automatically after 30 days. Users can also remove their voice recordings at any time by saying "Alexa, delete everything I said."
Post by: Bryan C.