Alexa transcripts may persist even after being deleted, Amazon reveals in letter to Senator
Transcripts may remain, even when "deleted"

By Shawn Knight
Bottom line: When you invite a device with a microphone into your home - be it a speaker, video camera or a smartphone - your reasonable expectation of privacy essentially goes out the window.
Delaware Senator Chris Coons sent a letter to Amazon CEO Jeff Bezos in May with regard to how the company handles Alexa voice transcripts.
Among his many inquiries, Coons wanted to know how long Amazon holds on to transcripts of user voice recordings, if users can delete the transcripts, what Amazon uses the transcripts for and if Amazon takes any measures to anonymize user identity related to transcripts.
Late last week, Coons received a reply from Brian Huseman, vice president of public policy at Amazon, and the responses are rather alarming. Huseman said customers can review, listen to and delete voice recordings associated with their account and can delete individual recordings, those from a particular timeframe or all of their recordings at their convenience. This implies that Amazon otherwise keeps recordings indefinitely.
"When a customer deletes a voice recording, we delete the transcripts associated with the customer's account of both the customer's request and Alexa's response. We already delete those transcripts from all of Alexa's primary storage systems, and we have an ongoing effort to ensure those transcripts do not remain in any of Alexa's other storage systems."
This statement suggests that even after a customer deletes a voice recording, a copy of the transcript may still exist somewhere in Alexa's other storage systems.
Huseman said Amazon may also store other records of customers' Alexa interactions, such as when interacting with an Alexa skill. Similarly, developers may also retain records of these interactions.
"For example, for many types of Alexa requests - such as when a customer subscribes to Amazon Music Unlimited, places an Amazon Fresh order, requests a car from Uber or Lyft, orders a pizza from Domino's, or makes an in-skill purchase of premium digital content - Amazon and/or the applicable skill developer obviously need to keep a record of the transaction."
Such is also the case for other types of requests, Huseman said, like setting a recurring alarm or asking Alexa to remind you of an anniversary. "Customers would not want or expect deletion of the voice recording to delete the underlying data or prevent Alexa from performing the requested task," he added.
Amazon said it uses transcripts and voice recordings to help train Alexa to better understand customer responses and to provide transparency with regard to what Alexa thought it heard and what it provided as a response.
Because customers can review their interactions through the Voice History feature, transcripts must remain associated with a customer's account and therefore are not anonymized.
In a statement to CNET, the lawmaker said Amazon's response leaves open the possibility that transcripts of user voice interactions with Alexa are not deleted from all of Amazon's servers, even after a user has deleted a recording of his or her voice. "What's more, the extent to which this data is shared with third parties, and how those third parties use and control that information, is still unclear," Coons added.
Masthead credit: Amazon Echo Dot by Charles Brutlag. Transcription image by Artur Szczybylo