Facepalm: An anonymous source has revealed that quality assurance auditors regularly hear parts of personal conversations and people having sex while listening to Siri recordings. While the snippets are only a few seconds long and are stripped of identifying data, they sometimes reveal private information such as health conditions.

Apple has consistently taken a hard stance on user privacy. Whether it is fighting law enforcement over encrypted data or lobbying against built-in backdoors, the company has publicly shown it has users' backs when it comes to privacy.

That is not to say that Apple has not had its slip-ups. The latest comes from an anonymous source who told The Guardian that contractors working on Siri frequently hear private discussions. Some of these recordings include medical information, drug deals, and people being intimate.

The reason is that a small number of Siri interactions are passed on to outsourced quality-control (QC) contractors. Their job is to grade these interactions on several criteria: whether the activation was deliberate or accidental, whether it was a request Siri could even handle, and whether its response was appropriate.

The recordings are "pseudonymized" (stripped of identifiable data) to protect the user's identity, but they can contain request-related information, including app data, contacts, and locations. Apple says it only uses this information to determine what happened after the command and whether Siri's response was appropriate.

"A small portion of Siri requests are analyzed to improve Siri and dictation," an Apple spokesperson told The Guardian. "User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities, and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."

Apple also noted that only a small "subset" of Siri interactions (less than one percent) is analyzed, and these snippets typically last only a few seconds.

A majority of the incidental recordings originate from accidental activations. In other words, the software misheard the wake phrase "Hey Siri" and started recording. This happens from time to time, as when UK Defense Secretary Gavin Williamson was heckled by his own phone to comedic effect.

The fact that humans listen to these interactions is not explicitly mentioned in Apple's terms of service, which is where the company seems to have dropped the ball. Apple also does not let users opt out of the sharing: Siri (and the sharing that comes with it) is either on or off.

Ultimately, nobody's privacy has been compromised, but Apple could have been more transparent about the QC process rather than letting the public find out through a whistleblower.

Image credit: Piotr Swat via Shutterstock