SoylentNews is people

posted by martyb on Thursday January 26 2017, @05:43AM   Printer-friendly
from the Dear-aunt,-let's-set-so-double-the-killer-delete-select-all dept.

Interesting story at vocativ.com

Demonic sounds are usually associated with evil spirits, but researchers have found a way to turn them into "hidden voice commands" for Android devices.

A group of Ph.D. candidates at Georgetown University and the University of California, Berkeley developed a series of voice commands that smartphone virtual assistants can recognize and execute, but that human ears cannot easily make out.

Some of the things these hidden commands can potentially do include sending a tweet, making a phone call, or even using Venmo to transfer money. Or, in a cyberattack scenario, a hidden command could open a website that automatically downloads malware, which then leads to hackers having full control of your device.

[...] A similar situation occurred earlier this month when a child accidentally ordered a $150 doll house from Amazon by simply asking Amazon's Alexa, "Can you play dollhouse with me and get me a dollhouse?"

Those assistants need to recognize who is speaking...


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Insightful) by Anonymous Coward on Thursday January 26 2017, @06:11AM

    by Anonymous Coward on Thursday January 26 2017, @06:11AM (#458835)

    Those assistants need to recognize who is speaking...

    Wrong! I want my phone to know LESS about me, not more. We are racing headfirst into the next era of insecurity without first solving the problems we already have with it. FFS, you shouldn't want this...

  • (Score: 0) by Anonymous Coward on Thursday January 26 2017, @06:16AM

    by Anonymous Coward on Thursday January 26 2017, @06:16AM (#458837)

    If I had to choose between giving up some personal info or having my kid accidentally order a $150 doll house from Amazon....

    • (Score: 4, Insightful) by fido_dogstoyevsky on Thursday January 26 2017, @10:04AM

      by fido_dogstoyevsky (131) <axehandleNO@SPAMgmail.com> on Thursday January 26 2017, @10:04AM (#458865)

      If I had to choose between giving up some personal info or having my kid accidentally order a $150 doll house from Amazon....

      You honestly don't see BOTH of those as harmful?

      --
      It's NOT a conspiracy... it's a plot.
    • (Score: 1, Interesting) by Anonymous Coward on Thursday January 26 2017, @03:21PM

      by Anonymous Coward on Thursday January 26 2017, @03:21PM (#458956)

It's probably a moot point: if Amazon refuses to pay for the return shipping, you just report the charge to the credit card issuer as the unauthorized transaction it is. Whatever their terms of service say, saying something in the presence of one of these devices does not create a contractual obligation, as the dollhouse incident demonstrates. A child that age cannot enter into a legal contract in any state I know of, and even if she could, children are not generally authorized to use their parents' credit cards. At an absolute minimum, Alexa should require affirmative confirmation before shipping anything out, to establish that there was intent and consent to order something.

      As it stands now, my guess is that this will either last until the chargebacks start rolling in or they can convince congress to drastically lower the standards for forming contracts.

      But, IANAL and this is going to be interesting to see play out.

  • (Score: 2) by sjames on Thursday January 26 2017, @06:53AM

    by sjames (2882) on Thursday January 26 2017, @06:53AM (#458838) Journal

The problem isn't what the phone knows about you; it's that it tattles.

Ideally, it should answer only to you for many of its functions (for example, ordering stuff on your credit card). If you allow 'guests' to use it, it should know to use safe search when a child asks a question.

    It should know where you work and that when it's there, it should avoid anything NSFW.

    • (Score: 2) by maxwell demon on Thursday January 26 2017, @07:59AM

      by maxwell demon (1608) on Thursday January 26 2017, @07:59AM (#458850) Journal

Also, if you order something on the website, you first have to log in with your password, and you are clearly shown that you are about to place an order (the latter may not apply if you enable one-click shopping, but then, you are not required to enable that). The same ought to be true for Alexa whenever it interprets anything as an order. For example, the dollhouse incident should have run as follows:

Alternative one: The kid is aware that it could, but should not, buy stuff through Alexa, and acts accordingly.

      Child: Can you play dollhouse with me and get me a dollhouse?
      Alexa: I can order you a dollhouse for 150 dollars. Do you want me to do that?
      Child: No.
      Alexa: OK. Nothing ordered.

      Alternative two: The kid doesn't know, or doesn't care, that it should not order stuff:

      Child: Can you play dollhouse with me and get me a dollhouse?
      Alexa: I can order you a dollhouse for 150 dollars. Do you want me to do that?
      Child: Yes.
      Alexa: OK. Please tell me your name.
      Child: Melinda.
      Alexa: Sorry, you are not in the list of persons allowed to order stuff.

Alternative three: The kid actually tries to circumvent security, but the parents are careful that she never gets to hear the password.

      Child: Can you play dollhouse with me and get me a dollhouse?
      Alexa: I can order you a dollhouse for 150 dollars. Do you want me to do that?
      Child: Yes.
      Alexa: OK. Please tell me your name.
      Child, using the mother's name: Alice
      Alexa: Alice, please tell me your password.
      Child: Uh … I have no idea.
      Alexa: Wrong password.

Note that at this point it could no longer be called an accidental purchase, even if the attempt had succeeded.
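The three dialogues above amount to a simple confirm-then-authorize flow. A minimal sketch in Python, purely illustrative (the authorized-user list, the password check, and the function names are hypothetical, not Alexa's actual behavior or API):

```python
# Sketch of a voice-ordering flow requiring explicit confirmation
# and caller authorization before an order is placed.
# The name/password scheme here is a hypothetical illustration.

AUTHORIZED = {"Alice": "correct horse battery staple"}

def handle_order_request(item, price, answers):
    """Walk the confirmation dialogue; `answers` supplies the
    speaker's replies in order: confirmation, name, password."""
    replies = iter(answers)

    # Step 1: explicit confirmation that an order is intended.
    if next(replies, "no").lower() != "yes":
        return "OK. Nothing ordered."

    # Step 2: the speaker must identify as an authorized person.
    name = next(replies, "")
    if name not in AUTHORIZED:
        return "Sorry, you are not in the list of persons allowed to order stuff."

    # Step 3: the claimed identity must be proven with a password.
    if next(replies, "") != AUTHORIZED[name]:
        return "Wrong password."

    return f"Ordered: {item} for {price} dollars."

# Alternative one: the child declines.
print(handle_order_request("dollhouse", 150, ["No"]))
# Alternative two: the child confirms but is not on the list.
print(handle_order_request("dollhouse", 150, ["Yes", "Melinda"]))
# Alternative three: the child claims to be Alice but lacks the password.
print(handle_order_request("dollhouse", 150, ["Yes", "Alice", "uh..."]))
```

The key design point is that each step fails closed: without an explicit "yes", a recognized name, and the matching password, nothing ships.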

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 5, Funny) by Webweasel on Thursday January 26 2017, @01:19PM

        by Webweasel (567) on Thursday January 26 2017, @01:19PM (#458898) Homepage Journal

        Homer Simpson: Hello, My name is Mr Burns. I believe you have a letter for me.
        Postman: OK Mr Burns, what's your first name?
        Homer Simpson: I don't know.

        --
        Priyom.org Number stations, Russian Military radio. "You are a bad, bad man. Do you have any other virtues?"-Runaway1956
        • (Score: 3, Funny) by Thexalon on Thursday January 26 2017, @02:31PM

          by Thexalon (636) on Thursday January 26 2017, @02:31PM (#458924)

          School Secretary: "So, you're saying Mary is sick, and can't come to school today?"
          Voice on Phone: "Yes."
          School Secretary: "To whom am I speaking?"
          Voice on Phone: "This is my mother."

          --
          The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 0) by Anonymous Coward on Thursday January 26 2017, @03:25PM

        by Anonymous Coward on Thursday January 26 2017, @03:25PM (#458958)

        The responsibility here isn't on the parents, the responsibility here is on Amazon for shipping a product that they know can be used like this. This isn't that much different from when Google got smacked for allowing children to buy apps without parental approval.

That strategy might work for older kids, but what about children who are old enough to talk but not old enough to understand that they shouldn't say those things around the device? And what about when your kid has friends over? Or when the TV is on?

This technology is a bad idea, designed as a cynical ploy to further undermine people's efforts to maintain a budget. It's not much different from those stupid one-click or Dash buttons, which skip the steps that would normally let people change their minds. Granted, in both cases they're opt-in and require at least minimal intent, but they're still aimed at making it easy for people to buy things they don't really want.

    • (Score: 2) by mcgrew on Thursday January 26 2017, @10:26PM

      by mcgrew (701) <publish@mcgrewbooks.com> on Thursday January 26 2017, @10:26PM (#459183) Homepage Journal

I'm paranoid and NEVER shop on my phone. Any e-shopping is done on my laptop, at home on my own network. I've had too many phones lost or stolen, and I don't trust Google.

      --
      mcgrewbooks.com mcgrew.info nooze.org
      • (Score: 2) by sjames on Thursday January 26 2017, @11:04PM

        by sjames (2882) on Thursday January 26 2017, @11:04PM (#459198) Journal

Since they currently do tattle and aren't all that tightly secured, that's a good idea.

  • (Score: 2) by mcgrew on Thursday January 26 2017, @10:21PM

    by mcgrew (701) <publish@mcgrewbooks.com> on Thursday January 26 2017, @10:21PM (#459181) Homepage Journal

    Funny, this trick wouldn't work on my Kyocera phone or my Samsung tablet; you have to tap the microphone icon before it listens. I wouldn't buy a device that was listening all the time.

    --
    mcgrewbooks.com mcgrew.info nooze.org