
posted by Fnord666 on Saturday July 27 2019, @01:49PM   Printer-friendly
from the I-heard-what-you-did-last-summer dept.

https://arstechnica.com/gadgets/2019/07/siri-records-fights-doctors-appointments-and-sex-and-contractors-hear-it/

Voice assistants are growing in popularity, but the technology has seen a parallel rise in concerns about privacy and accuracy. Apple's Siri is the latest to enter this gray area of tech. This week, The Guardian reported that contractors who review Siri recordings for accuracy, and to help make improvements, may be hearing personal conversations.

One of the contract workers told The Guardian that Siri did sometimes record audio after mistaken activations. The wake word is the phrase "hey Siri," but the anonymous source said that it could be activated by similar-sounding words or with the noise of a zipper. They also said that when an Apple Watch is raised and speech is detected, Siri will automatically activate.
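The false activations described above follow from how wake-word detectors generally work: the device continuously scores incoming audio against an acoustic model of the trigger phrase and activates when the score crosses a threshold, so acoustically similar input (or, apparently, a zipper) can also cross it. The sketch below is a toy illustration of that thresholding idea, not Apple's actual detector; the scoring function, threshold value, and all names are invented for illustration.

```python
# Toy illustration of threshold-based wake-word detection and why it
# can misfire on similar-sounding input. The similarity metric and
# threshold are purely illustrative, not Apple's implementation.

WAKE_THRESHOLD = 0.8  # assumed trigger threshold

def wake_score(utterance: str) -> float:
    """Crude stand-in for an acoustic similarity score against 'hey siri'."""
    target = "hey siri"
    # character-overlap similarity, purely for illustration
    matches = sum(1 for a, b in zip(utterance.lower(), target) if a == b)
    return matches / max(len(target), len(utterance))

def is_activated(utterance: str) -> bool:
    """Trigger whenever the score crosses the threshold."""
    return wake_score(utterance) >= WAKE_THRESHOLD

print(is_activated("hey siri"))      # exact phrase triggers
print(is_activated("hey seri"))      # similar-sounding input can also trigger
print(is_activated("good morning"))  # dissimilar input does not
```

The point of the sketch is that a detector tuned to be sensitive enough never to miss the real phrase will inevitably accept some near-misses, which is exactly the trade-off the contractor's account describes.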

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," the source said. "These recordings are accompanied by user data showing location, contact details, and app data."

Apple has said that it takes steps to protect users from being connected with the recordings sent to contractors. The audio is not linked to an Apple ID and less than 1% of daily Siri activations are reviewed. It also sets confidentiality requirements for those contract workers. We reached out to Apple for further comment and will update the story if we receive it.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Saturday July 27 2019, @08:53PM (#872086)

    The problem is that this is built into phones, tablets, and I believe laptops now. The cameras and mics need some way of being shut off that can't be accidentally undone or overridden by third parties.