Due to human error, a German user of Amazon Alexa received access to voice recordings created by someone else. When the man asked Alexa for an archive of his own activities, he received a link to more than 1,700 audio files that were not his.
Amazon has already confirmed the incident and has determined it to be the only known case of this happening so far. "We resolved the issue with the two customers involved and took measures to further optimize our processes. As a precautionary measure we contacted the relevant authorities," said an Amazon spokesperson.
For anyone who refuses to allow digital assistants into their home over privacy concerns, this case offers a standout piece of supporting evidence. Amazon did quickly revoke access to the private user data after the incident was reported, but copies of the information still exist. From the downloaded recordings, a magazine was able to identify and contact a pair of individuals.
Human error was cited as the cause of the issue, but it is unsettling to know that a stranger could gain access to a variety of personal information and private conversations. Fortunately, neither party appears to have acted maliciously, but the story could easily have had a far less amicable ending.
End users of digital assistants should be aware that anything said in the vicinity of their devices can be uploaded to remote servers while the devices are active. Beyond trusting the service provider to protect or destroy that information in a timely manner, there is little users can do to keep personal information from being shared.
To delete any history, open the Alexa app on a smartphone and go to Settings > Alexa Account > History, where any or all queries can be removed. It may be easier to clear all history at once via a browser by navigating to the Alexa Privacy page.