Voice assistants are growing in popularity, but the technology has seen a parallel rise in concerns about privacy and accuracy. Apple's Siri is the latest to enter this gray area. This week, The Guardian reported that contractors who review Siri recordings for accuracy, and to help improve the service, may be hearing personal conversations.
One of the contract workers told The Guardian that Siri does sometimes record audio after mistaken activations. The wake phrase is "hey Siri," but the anonymous source said the assistant could be triggered by similar-sounding words or even by the sound of a zipper. They also said that when an Apple Watch is raised and speech is detected, Siri activates automatically.
"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," the source said. "These recordings are accompanied by user data showing location, contact details, and app data."
Apple has said that it takes steps to prevent users from being linked to the recordings sent to contractors: the audio is not associated with an Apple ID, and less than 1% of daily Siri activations are reviewed. The company also imposes confidentiality requirements on those contract workers. We reached out to Apple for further comment and will update the story if we receive it.
(Score: 4, Funny) by AthanasiusKircher on Saturday July 27 2019, @03:00PM (2 children)
Not news to me that devices with cameras and microphones are listening in on people even when they're not "supposed to be."
BUT... from TFA:
Wow... uh, the "noise of a zipper"? That's pretty specific. And honestly more than a little creepy given the headline about Siri listening in on sex. I can just imagine the developer conversation:
"Gosh George, that's really good work with the hidden 'zipper activation' setting. You still need some tweaking to get Siri to properly detect the 'unhook bra strap' noise, though. We need that because the 'slurping and sucking' noise detection function is picking up too many false positives. Nobody wants to listen to people eat...."
(Score: 2) by Runaway1956 on Saturday July 27 2019, @05:30PM
Months later,
"George, how do we teach Siri to turn on at the sound of a zipper, but not at the sound of velcro being torn apart? NO ONE wants to listen to people taking their shoes off after a hard day at school/work!"
(Score: 0) by Anonymous Coward on Monday July 29 2019, @02:12PM
And yet there's an entire subsection of YouTube with exactly that, and people making a living at it