Genmar IT director James Moore writes for the Indie…
In a bizarre demo at Amazon’s annual re:MARS (Machine Learning, Automation, Robotics and Space) conference in Las Vegas in June, the company showed off a disturbing new feature of its virtual assistant Alexa – a young boy hearing a bedtime story read to him in the voice of his dead grandmother.
The feature was meant to showcase Alexa’s “human attributes”, says Rohit Prasad, Amazon’s head scientist for Alexa.
Prasad goes on to say that while AI (artificial intelligence) can’t eliminate “that pain of loss”, it can carry on the memories of the deceased.
Amazon claims its AI will be able to imitate someone’s voice after listening to just a minute of recorded audio.
This prompted many people to draw comparisons with Be Right Back, a 2013 episode of the TV series Black Mirror about a young woman, Martha, whose boyfriend, Ash, is killed in a car accident.
As she mourns him, she discovers a technology that enables her to communicate with an artificial intelligence imitating Ash. Using all of his past online communications and social media profiles, the service creates a virtual “Ash”.
Starting out with instant messaging, Martha uploads more videos and photos, begins to talk with the artificial Ash over the phone and eventually interacts with a full synthetic robot version of him.
So, while the technology on display is impressive and could no doubt provide comfort for some who have lost loved ones, it raises the question: where will this lead?
In a world where advances in deepfake technology are making it harder to distinguish between real and fake video content, it’s not a huge leap of imagination to suggest that creating new video footage of deceased relatives will soon be possible too.
Just as in the Alexa demo, deepfakes are made by collecting video and audio samples of a person and using an AI model to learn that person’s patterns and characteristics.
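To give a sense of how accessible this general technique has become, here is a minimal sketch of voice cloning using the open-source Coqui TTS library and its XTTS v2 model; the file names and spoken text are purely illustrative placeholders, and this is an assumed example of the broad approach rather than anything Amazon has published.

# A minimal voice-cloning sketch using the open-source Coqui TTS library (XTTS v2).
# The file paths and spoken text are illustrative placeholders only.
from TTS.api import TTS

# Load a multilingual voice-cloning model (downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesise speech in the voice heard in a short reference recording.
tts.tts_to_file(
    text="Once upon a time...",
    speaker_wav="reference_voice.wav",  # roughly a minute of recorded speech
    language="en",
    file_path="cloned_output.wav",
)

The point is not the specific library but how little code, and how little source audio, the general approach now requires.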
In recent years this has become a growing cyber-security threat, with criminals exploiting the technology to carry out sophisticated identity-theft attacks aimed at stealing money or gaining access to private data.
To find out more about Genmar, go to genmar.co.uk.