
posted by Fnord666 on Saturday July 27 2019, @01:49PM
from the I-heard-what-you-did-last-summer dept.

https://arstechnica.com/gadgets/2019/07/siri-records-fights-doctors-appointments-and-sex-and-contractors-hear-it/

Voice assistants are growing in popularity, but the technology has been experiencing a parallel rise in concerns about privacy and accuracy. Apple's Siri is the latest to enter this gray space of tech. This week, The Guardian reported that contractors who review Siri recordings for accuracy and to help make improvements may be hearing personal conversations.

One of the contract workers told The Guardian that Siri did sometimes record audio after mistaken activations. The wake word is the phrase "hey Siri," but the anonymous source said that it could be activated by similar-sounding words or with the noise of a zipper. They also said that when an Apple Watch is raised and speech is detected, Siri will automatically activate.

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," the source said. "These recordings are accompanied by user data showing location, contact details, and app data."

Apple has said that it takes steps to prevent users from being connected with the recordings sent to contractors. The audio is not linked to an Apple ID, and less than 1% of daily Siri activations are reviewed. It also sets confidentiality requirements for those contract workers. We reached out to Apple for further comment and will update the story if we receive a response.


Original Submission

 
  • (Score: 4, Funny) by AthanasiusKircher (5291) on Saturday July 27 2019, @03:00PM (#871941) Journal (2 children)

    There is something seriously wrong if this is news to anyone.

    Not news to me that devices with cameras and microphones are listening in on people even when they're not "supposed to be."

    BUT... from TFA:

    ...Siri did sometimes record audio after mistaken activations. The wake word is the phrase "hey Siri," but the anonymous source said that it could be activated by similar-sounding words or with the noise of a zipper

    Wow... uh, the "noise of a zipper"? That's pretty specific. And honestly more than a little creepy given the headline about Siri listening in on sex. I can just imagine the developer conversation:

    "Gosh George, that's really good work with the hidden 'zipper activation' setting. You still need some tweaking to get Siri to properly detect the 'unhook bra strap' noise, though. We need that because the 'slurping and sucking' noise detection function is picking up too many false positives. Nobody wants to listen to people eat...."

  • (Score: 2) by Runaway1956 (2926) Subscriber Badge on Saturday July 27 2019, @05:30PM (#872024) Journal

    Months later,

    "George, how do we teach Siri to turn on at the sound of a zipper, but not at the sound of velcro being torn apart? NO ONE wants to listen to people taking their shoes off after a hard day at school/work!"

  • (Score: 0) by Anonymous Coward on Monday July 29 2019, @02:12PM (#872632)

    Nobody wants to listen to people eat...."

    And yet there's an entire subsection of YouTube with exactly that, and people making a living at it.