
SoylentNews is people

posted by FatPhil on Sunday July 21 2019, @06:44PM
from the Big-Brother-Is-Listening dept.

Google has confirmed that audio snippets, including background noises, are sent to humans for review, following an investigation into Dutch audio data that had been leaked.

Google said this work helps with developing voice recognition and other technology in its Google Assistant artificial intelligence system, which is used in its Google Home smart speakers and Android smartphones.

[...] Approximately 0.2 per cent of all audio snippets are reviewed by "language experts". Google's response to the leak of private user data is to review its safeguards to prevent future misconduct.

We just learned that one of these reviewers has violated our data security policies by leaking confidential Dutch audio data. [...]

Belgium's VRT NWS also reports: Google employees are eavesdropping, even in your living room, VRT NWS has discovered.

Google employees are systematically listening to audio files recorded by Google Home smart speakers and the Google Assistant smartphone app. Throughout the world – so also in Belgium and the Netherlands – people at Google listen to these audio files to improve Google's search engine. VRT NWS was able to listen to more than a thousand recordings. Most of these recordings were made consciously, but Google also listens to conversations that should never have been recorded, some of which contain sensitive information.


Original Submission

  • (Score: 3, Insightful) by fustakrakich on Sunday July 21 2019, @06:52PM (11 children)

    by fustakrakich (6150) on Sunday July 21 2019, @06:52PM (#869701) Journal

    How many times has it been posted? The mic is always hot. Of course the guy who says it is the lunatic mad hatter.

    So, will the reaction produce real results? How will we ever know?

    --
Politics and criminals are the same thing.
    • (Score: 3, Insightful) by darkfeline on Sunday July 21 2019, @10:08PM (10 children)

      by darkfeline (1030) on Sunday July 21 2019, @10:08PM (#869742) Homepage

Maybe people are surprised due to all of the blatant misinformation, such as your post. On none of the most common "smart" devices is the mic always hot. This can be trivially demonstrated by packet sniffing, and to date there has been zero evidence of it.

      1. These devices only record when triggered.
      1a. These devices may be triggered unintentionally. Obviously. We haven't invented mind reading technology yet.
      2. Recordings are used for training the backend AI.
      2a. This is covered in the ToS (but who reads those?)
      3. Someone with access to the data leaked them.
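The packet-sniffing argument above can be sketched as a toy analysis — hypothetical numbers and function names, not a real capture tool: an always-hot mic streaming audio would show up as sustained uplink traffic, whereas a wake-word device shows only short bursts.

```python
# Toy sketch (illustrative numbers): if a smart speaker's mic were always
# hot and streaming, its uplink would carry continuous audio-sized traffic;
# a wake-word device instead shows short bursts after activation.

AUDIO_STREAM_BPS = 16000  # ~16 kB/s, a rough rate for compressed voice audio

def classify_uplink(samples, window=10):
    """samples: per-second uplink byte counts captured for one device.
    Returns 'always-hot' if every window sustains audio-rate traffic,
    'bursty' if audio-rate traffic appears only intermittently,
    'quiet' otherwise."""
    windows = [samples[i:i + window] for i in range(0, len(samples), window)]
    hot = [sum(w) / len(w) >= AUDIO_STREAM_BPS for w in windows]
    if all(hot):
        return "always-hot"
    if any(hot):
        return "bursty"
    return "quiet"

# Idle keep-alive traffic with one short burst after a (simulated) wake word:
idle = [200] * 50    # ~200 B/s of heartbeat traffic
burst = [20000] * 10 # ~20 kB/s while a query is uploaded
print(classify_uplink(idle + burst + idle))  # -> bursty
```

In a real experiment the byte counts would come from a capture tool like tcpdump filtered to the device's MAC or IP; the point is only that "always hot" and "triggered recording" produce visibly different traffic shapes.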

      There's only a handful of rational reactions to this very old news:

      1. You are against "smart" devices at a philosophical level. Which is fine, but there's nothing to discuss in that case.
      2. Google et al are at fault for keeping the recordings (but how are they supposed to improve their AI?).
      3. Google et al are at fault for not instituting stronger access controls.
      4. The leaker is at fault for intentionally leaking user private data. (Imagine if a "whistleblower" "leaked" private health records to prove some third party kept health records.)

      --
      Join the SDF Public Access UNIX System today!
      • (Score: 0) by Anonymous Coward on Sunday July 21 2019, @10:53PM

        by Anonymous Coward on Sunday July 21 2019, @10:53PM (#869756)

        At least let your family know that what they say in their home may not be "private".

      • (Score: 2) by Runaway1956 on Monday July 22 2019, @12:03AM

        by Runaway1956 (2926) Subscriber Badge on Monday July 22 2019, @12:03AM (#869773) Journal

        There's only a handful of rational reactions to this very old news:

        1. You are against "smart" devices at a philosophical level. Which is fine, but there's nothing to discuss in that case.
        2. Google et al are at fault for keeping the recordings (but how are they supposed to improve their AI?).
        3. Google et al are at fault for not instituting stronger access controls.
        4. The leaker is at fault for intentionally leaking user private data. (Imagine if a "whistleblower" "leaked" private health records to prove some third party kept health records.)

        You forgot number 5.

        5. All of the above.

      • (Score: 1) by fustakrakich on Monday July 22 2019, @12:16AM

        by fustakrakich (6150) on Monday July 22 2019, @12:16AM (#869776) Journal

        :-) Yes, of course... Nothing to see here...

        Basically, I'm only against spying, on the "philosophical" level. The timid reaction, and even denial is more disturbing.

        --
Politics and criminals are the same thing.
      • (Score: 1, Insightful) by Anonymous Coward on Monday July 22 2019, @01:38AM

        by Anonymous Coward on Monday July 22 2019, @01:38AM (#869792)

        2. Google et al are at fault for keeping the recordings (but how are they supposed to improve their AI?).

        How about they hire testers? Then the testers can earn money at home by just talking to the robot. All these recordings will be available for review, since Google paid for them, and nobody would be worried too much.

      • (Score: 1) by Coward, Anonymous on Monday July 22 2019, @05:05AM (3 children)

        by Coward, Anonymous (7017) on Monday July 22 2019, @05:05AM (#869839) Journal

        How about:
5. Google et al are aiding and abetting illegal voice recordings. In many places (11 US states [dmlp.org]), it's illegal to record people without their consent, and I'm sure those Google microphones listen in on a lot of people who didn't give consent.

        • (Score: 2) by darkfeline on Monday July 22 2019, @08:30AM (2 children)

          by darkfeline (1030) on Monday July 22 2019, @08:30AM (#869864) Homepage

          That's the responsibility of the device owner. Should microphone manufacturers be liable if someone uses a purchased microphone to record voices illegally? The owner is responsible for informing any affected parties of the recording device. It's the same with security cameras in many places.

          The owner of the device (should have) read the ToS and thus has given consent by using the device after purchase instead of returning the product and requesting a refund.

          --
          Join the SDF Public Access UNIX System today!
          • (Score: 2) by Coward, Anonymous on Monday July 22 2019, @03:48PM

            by Coward, Anonymous (7017) on Monday July 22 2019, @03:48PM (#869988) Journal

            And what if the device malfunctions and triggers, even though nobody said the keyword? If a law is violated due to a device's design flaw, the manufacturer will be on the hook.

          • (Score: 0) by Anonymous Coward on Monday July 22 2019, @05:23PM

            by Anonymous Coward on Monday July 22 2019, @05:23PM (#870011)

            Should microphone manufacturers be liable if someone uses a purchased microphone to record voices illegally?

            A microphone doesn't record. The recording equipment it is connected to records. If you set up recording equipment to illegally record something, you are of course liable. The microphone supplier did nothing of that kind.

      • (Score: 0) by Anonymous Coward on Monday July 22 2019, @06:40AM

        by Anonymous Coward on Monday July 22 2019, @06:40AM (#869850)

        2. Recordings are used for training the backend AI.

        I like how this line does not contain the word "only".

      • (Score: 0) by Anonymous Coward on Monday July 22 2019, @03:45PM

        by Anonymous Coward on Monday July 22 2019, @03:45PM (#869985)

        2a. This is covered in the ToS (but who reads those?)

        That's why according to the GDPR permission for processing personal data as part of the ToS doesn't count. Permission has to be obtained separately. Remember this is about recordings of EU citizens, the GDPR applies.

  • (Score: 2) by JoeMerchant on Sunday July 21 2019, @07:11PM (2 children)

    by JoeMerchant (3937) on Sunday July 21 2019, @07:11PM (#869702)

    Married, with kids, for 22 years. If audio recordings of our bedroom have any value whatsoever...

    --
    🌻🌻 [google.com]
    • (Score: 1, Informative) by Anonymous Coward on Sunday July 21 2019, @10:52PM

      by Anonymous Coward on Sunday July 21 2019, @10:52PM (#869755)

      "If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged." ~ Cardinal Richelieu

    • (Score: 1, Troll) by Ethanol-fueled on Monday July 22 2019, @04:19AM

      by Ethanol-fueled (2792) on Monday July 22 2019, @04:19AM (#869830) Homepage

      Your wife has Black men over while you work. Google recordings of your bedroom while you are not around sound like a 2 Live Crew Album.

  • (Score: 1) by Coward, Anonymous on Monday July 22 2019, @05:16AM

    by Coward, Anonymous (7017) on Monday July 22 2019, @05:16AM (#869840) Journal

Was watching a show the other day where a guy said "OK (something that wasn't Google)". That triggered OK Google on my phone, which I thought I had disabled, but must have accidentally turned on from one of the endless nag screens that pop up, trying to convince people to enable Google Ass(istant). I disabled it (Ass and Voice) properly today.

  • (Score: 0) by Anonymous Coward on Monday July 22 2019, @02:43PM

    by Anonymous Coward on Monday July 22 2019, @02:43PM (#869965)

someday soon someone is going to invent the "ok google programming" language where ~5 "ok google enabled" smart phones with the right "names" in the address book are going to talk to themselves ... maybe forming the nexus of the true A.I. that will melt down the internet because all the "assistants" are calling and talking to each other ^_^
