
posted by chromas on Friday April 12 2019, @05:05AM
from the ceiling-cat-is-watching-you-masturbate dept.

Smart speaker recordings reviewed by humans

Amazon, Apple and Google all employ staff who listen to customer voice recordings from their smart speakers and voice assistant apps.

News site Bloomberg highlighted the topic after speaking to Amazon staff who "reviewed" Alexa recordings.

All three companies say voice recordings are occasionally reviewed by humans to improve speech recognition.

But the reaction to the Bloomberg article suggests many customers are unaware that humans may be listening.

The news site said it had spoken to seven people who reviewed audio from Amazon Echo smart speakers and the Alexa service.

Reviewers typically transcribed and annotated voice clips to help improve Amazon's speech recognition systems.

Amazon's voice recordings are associated with an account number, the customer's first name and the serial number of the Echo device used.

Some of the reviewers told Bloomberg that they shared amusing voice clips with one another in an internal chat room.

They also described hearing distressing clips such as a potential sexual assault. However, they were told by colleagues that it was not Amazon's job to intervene.


Original Submission

  • (Score: 3, Informative) by Arik on Friday April 12 2019, @05:13AM (1 child)

    by Arik (4543) on Friday April 12 2019, @05:13AM (#828488) Journal
    We're working on a way to monetize such interventions in the future.
    --
    If laughter is the best medicine, who are the best doctors?
    • (Score: 2) by Booga1 on Friday April 12 2019, @05:29AM

      by Booga1 (6333) on Friday April 12 2019, @05:29AM (#828492)

      I'd make a Clippy or OnStar joke here, but really feel like it would be insensitive.
      However, you may be correct in that they may decide to offer home or office security services as a paid package at some point.
Amazon Echo can already tie into your home's lights, sensors, and cameras, and Amazon knows your product delivery schedules. They might notice a package being stolen off your porch and notify you that the package they just dropped off was taken away.

  • (Score: 2) by Barenflimski on Friday April 12 2019, @06:02AM (6 children)

    by Barenflimski (6836) on Friday April 12 2019, @06:02AM (#828508)

Amazon's explanation, that they need to train the systems to do what people ask, is fairly reasonable. How do you make the stuff work if you don't model human speech? Is it really reasonable to ask that this stuff work by only sampling other people's anonymized information?

    • (Score: 4, Insightful) by bob_super on Friday April 12 2019, @06:36AM (4 children)

      by bob_super (1357) on Friday April 12 2019, @06:36AM (#828515)

      Reviewing a data set of somewhat recent anonymized information is the right way.

      This, on the other hand:

      They also described hearing distressing clips such as a potential sexual assault. However, they were told by colleagues that it was not Amazon's job to intervene.

This clearly implies that they were reviewing, potentially in real time, something traceable back to a source. ("Not my job" isn't "can't do it.")

      That is way beyond what "training" requires, and opens the door to a whole lot of abuse possibilities.

      • (Score: 0) by Anonymous Coward on Friday April 12 2019, @07:43AM (2 children)

        by Anonymous Coward on Friday April 12 2019, @07:43AM (#828522)

        Surely that makes them guilty of protecting offenders, possibly permitting the violence to continue, and hence potentially conspiracy to murder in some cases?

        And for the financial benefit of Amazon - who obviously did not want people to know they had invited "ms snoop" into their home. Alexa - the ultimate neighbourhood gossip!

        • (Score: 3, Insightful) by Anonymous Coward on Friday April 12 2019, @12:58PM

          by Anonymous Coward on Friday April 12 2019, @12:58PM (#828570)

          > Surely that makes them guilty of protecting offenders, possibly permitting the violence to continue, and hence potentially conspiracy to murder in some cases?

          How would the "listeners" distinguish between an actual assault vs. a domination session (possibly with a "safe word" that was agreed on at another time/location)? Forwarding this to the cops could easily turn into a swatting episode.

          And, if Amazon (or whoever) became known for interventions like this, how long before some bad actor figures out how to swat their enemy...pretending their call to the SWAT team is coming from Amazon?

        • (Score: 0) by Anonymous Coward on Saturday April 13 2019, @12:00AM

          by Anonymous Coward on Saturday April 13 2019, @12:00AM (#828797)

          Or they *gasp!* were engaging in some BDSM or even acting out a rape fantasy.

          But of course, strip *ahem* everyone of their rights first and worry about making everyone else submit second. Err.. the saying goes something like that, doesn't it?

          ---

Do you know what's happening? No? Then stfu and leave me alone. If I do so terribly that I end up dead, the police can subpoena the records to see how I got myself into that situation.

      • (Score: 0) by Anonymous Coward on Friday April 12 2019, @07:54AM

        by Anonymous Coward on Friday April 12 2019, @07:54AM (#828524)

        "Intervening" in this case may also mean filing a police report even though the suspected victim won't. So it's not necessarily real-time.

    • (Score: 2, Touché) by Anonymous Coward on Friday April 12 2019, @10:55AM

      by Anonymous Coward on Friday April 12 2019, @10:55AM (#828544)

It's not surprising that people listen to the recordings, but it would be surprising if everyone listening was paid by Amazon.

  • (Score: 1, Touché) by Anonymous Coward on Friday April 12 2019, @06:27AM (2 children)

    by Anonymous Coward on Friday April 12 2019, @06:27AM (#828514)

    The only way to stop this is voting en mass for real leaders who will raise taxes, come up with excuses to spy on us, and ban more shit until it gets better. Arise comrades, to the front.

    • (Score: 3, Insightful) by DeathMonkey on Friday April 12 2019, @05:47PM (1 child)

      by DeathMonkey (1380) on Friday April 12 2019, @05:47PM (#828685) Journal

      Seems a bit easier to just not purchase one of these devices. A bit cheaper, too.

      • (Score: 1, Insightful) by Anonymous Coward on Friday April 12 2019, @10:46PM

        by Anonymous Coward on Friday April 12 2019, @10:46PM (#828774)

Also, spread the word to others, so they also do not purchase.

  • (Score: 4, Insightful) by gtomorrow on Friday April 12 2019, @07:59AM (3 children)

    by gtomorrow (2230) on Friday April 12 2019, @07:59AM (#828525)

    All three companies say voice recordings are occasionally reviewed by humans to improve speech recognition.

This is nonsense. This unholy trinity* certainly has the resources to sample people's voices planet-wide out in the open, knowingly, not covertly as has been discovered. Google sends its Googlemobile around the world shooting pictures for their Street View! What would it take to code up a "Would you like to participate in a voice-recording survey?" page on their respective sites? NOTHING! This is clearly them (all three) using their own customers as paying guinea pigs, with bonus points for further desensitizing the masses to yet another privacy invasion. "Privacy? That's so passé!" Extra bonus points for providing entertainment for their voice-analyzer flying monkeys.

    And yet, it will continue...

    *hypocritical as I use all three's products...for better or worse.

    • (Score: 0) by Anonymous Coward on Friday April 12 2019, @03:13PM

      by Anonymous Coward on Friday April 12 2019, @03:13PM (#828627)

Google only charges their captive audiences when they know those people won't rebel because they are stupid. Look at the YouTube subscription model price increases. People who "cut the cable" only to deliver the same content over the same cable are experiencing the same price increases--the new boss is the same as the old boss, and Google is happy to make people think they are being rebellious by "cutting cable" while delivering the same TV over it...

Everything else is forced or free, for those not inclined to pay for the same stuff again. Street View's photography makes one both a captive audience and a forced participant--no one pays to be a guinea pig documented without an easy opt-out. Forced to train as a pig, yes, but no one shells out money for the betas.

      Somehow, this makes it ok for many people and the government, too.

    • (Score: 3, Interesting) by DeathMonkey on Friday April 12 2019, @05:50PM (1 child)

      by DeathMonkey (1380) on Friday April 12 2019, @05:50PM (#828687) Journal

      Seems like they could accomplish the same task by simply using anonymized data and a lot of us would be OK with it. Adding tricky bits to the training set is really the only way you can improve the results.

      • (Score: 3, Interesting) by gtomorrow on Saturday April 13 2019, @07:37AM

        by gtomorrow (2230) on Saturday April 13 2019, @07:37AM (#828903)

        Seems like they could accomplish the same task by simply using anonymized data and a lot of us would be OK with it. Adding tricky bits to the training set is really the only way you can improve the results.

Sorry about the late reply (life getting in the way) but, really? Anonymized data? Is there really such a thing? Those better versed in this subject than I am can confirm there's no such animal.
Secondly, where's the problem in asking up front, with no covert anonymized data involved, "Can we record your voice?" Just ask people across "this great land of ours" to say (for example) "the cars are parked in Harvard Yard." I'm certain MILLIONS of people would happily oblige. What tricky bits could there be other than gleaning personal data? I'm sick of Google/Amazon/add-who-you-will-to-the-list reading the newspaper over my shoulder.

  • (Score: 4, Insightful) by SomeGuy on Friday April 12 2019, @02:28PM (3 children)

    by SomeGuy (5632) on Friday April 12 2019, @02:28PM (#828608)

    And yet consumertards insist on brining these things in to their homes and letting these companies listen to every word. If you tell a consumertard they should not, they will bitch and whine about how they must have the latest toys, they have nothing to hide, and you should do the same.

    • (Score: 0) by Anonymous Coward on Friday April 12 2019, @05:19PM

      by Anonymous Coward on Friday April 12 2019, @05:19PM (#828673)

That reminds me, I really need to give mycroft.ai a try. Apart from Google's version that's built into my phone, I don't have anything like this in my house because I have no control over what it hears. One of the things I appreciate about my Roku is that it has a specific button to turn on the microphone. But, OTOH, that should worry me as I'm not sure how that's hooked up; it could well be just a software button that could be bypassed by crackers.

    • (Score: 3, Funny) by Anonymous Coward on Friday April 12 2019, @08:01PM (1 child)

      by Anonymous Coward on Friday April 12 2019, @08:01PM (#828725)

      And yet consumertards insist on brining these things in to their homes

      Those are the smart consumers, that are brining their smart speakers! Makes them last well into the early spring, and can be used as rations for sled dogs, if they are very very hungry.

      Now the Norse, however, have been soaking their smart speakers in lye, and then can be heard calling out, "Hey, Lutefisk, play NPR!"

      • (Score: 2, Touché) by Anonymous Coward on Friday April 12 2019, @10:56PM

        by Anonymous Coward on Friday April 12 2019, @10:56PM (#828778)

        And yet consumertards insist on brining these things in to their homes

        Those are the smart consumers, that are brining their smart speakers! Makes them last well into the early spring, and can be used as rations for sled dogs, if they are very very hungry.

        Now the Norse, however, have been soaking their smart speakers in lye, and then can be heard calling out, "Hey, Lutefisk, play NPR!"

        Where's my '+1 spelling nazi' mod when I need it?

  • (Score: 4, Informative) by corey on Friday April 12 2019, @09:58PM (1 child)

    by corey (2202) on Friday April 12 2019, @09:58PM (#828758)

    They also described hearing distressing clips such as a potential sexual assault. However, they were told by colleagues that it was not Amazon's job to intervene.

    Um, in Australia at least, I think this is illegal. If you are witness to a crime, you have a legal obligation to report it. Except if you're a confession Minister of course.

Big gray area, anyway.

    • (Score: 0) by Anonymous Coward on Friday April 12 2019, @11:04PM

      by Anonymous Coward on Friday April 12 2019, @11:04PM (#828784)

      Um, in Australia at least, I think this is illegal. If you are witness to a crime, you have a legal obligation to report it. Except if you're a confession Minister of course.

      We don't have that communist garbage here in the US! If you're not armed with a gun and ready to shoot at the slightest movement, it's your fault if you're a victim!

      And if you're a victim, you're worthy only of shame and derision!
