posted by Fnord666 on Thursday September 07 2017, @01:46PM   Printer-friendly
from the careless-whispers dept.

Submitted via IRC for SoyCow1937

Hacks are often caused by our own stupidity, but you can blame tech companies for a new vulnerability. Researchers from China's Zhejiang University found a way to attack Siri, Alexa and other voice assistants by feeding them commands in ultrasonic frequencies. Those are too high for humans to hear, but they're perfectly audible to the microphones on your devices. With the technique, researchers could get the AI assistants to open malicious websites and even your door if you had a smart lock connected.

The relatively simple technique is called DolphinAttack. Researchers first translated human voice commands into ultrasonic frequencies (over 20,000 Hz). They then simply played them back from a regular smartphone equipped with an amplifier, ultrasonic transducer and battery -- less than $3 worth of parts.

What makes the attack scary is the fact that it works on just about anything: Siri, Google Assistant, Samsung S Voice and Alexa, on devices like smartphones, iPads, MacBooks, Amazon Echo and even an Audi Q3 -- 16 devices and seven systems in total. What's worse, "the inaudible voice commands can be correctly interpreted by the SR (speech recognition) systems on all the tested hardware." Suffice to say, it works even if the attacker has no device access and the owner has taken the necessary security precautions.
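
As a rough sketch of the mechanism (commonly described as amplitude-modulating the command onto an ultrasonic carrier and letting a slightly nonlinear microphone front end demodulate it, consistent with the mixing discussion in the comments below), here is a minimal Python example. The 1 kHz "voice" tone, 30 kHz carrier, square-law term and 8 kHz cutoff are illustrative assumptions, not parameters from the paper; numpy and scipy are assumed.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 192_000                      # sample rate high enough to represent ultrasound
    t = np.arange(0, 0.05, 1 / fs)    # 50 ms of signal

    voice = np.sin(2 * np.pi * 1_000 * t)      # stand-in for the spoken command (1 kHz tone)
    carrier = np.sin(2 * np.pi * 30_000 * t)   # inaudible 30 kHz carrier
    transmitted = (1 + 0.8 * voice) * carrier  # amplitude modulation: no energy below 20 kHz

    # A slightly nonlinear microphone/preamp front end (square-law term)
    # demodulates the envelope, recreating the command in the audible band.
    received = transmitted + 0.5 * transmitted ** 2

    # Low-pass at 8 kHz, roughly what a speech-recognition pipeline keeps.
    b, a = butter(4, 8_000 / (fs / 2))
    baseband = filtfilt(b, a, received)

    spectrum = np.abs(np.fft.rfft(baseband))
    freqs = np.fft.rfftfreq(len(baseband), 1 / fs)
    audible = (freqs > 200) & (freqs < 5_000)
    peak = freqs[audible][np.argmax(spectrum[audible])]
    print(f"strongest audible component: {peak:.0f} Hz")   # ~1000 Hz: the "command" is back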

Source: https://www.engadget.com/2017/09/06/alexa-and-siri-are-vulnerable-to-silent-nefarious-commands/


Original Submission

 
  • (Score: 3, Interesting) by melikamp on Thursday September 07 2017, @01:59PM (3 children)

    by melikamp (1886) on Thursday September 07 2017, @01:59PM (#564558) Journal

    alexa, initiate a silent self-destruct sequence!

    siri, purchase 2017 Porsche 911 Carrera with default credit card!

    alexa, email address book to looser@mailinator.com!

    siri, install the PwnMe app from the store!

    • (Score: 5, Funny) by Weasley on Thursday September 07 2017, @03:16PM (2 children)

      by Weasley (6421) on Thursday September 07 2017, @03:16PM (#564597)

      Or the worst yet: Alexa, play Nickelback.

      • (Score: 2) by DeathMonkey on Thursday September 07 2017, @05:27PM

        by DeathMonkey (1380) on Thursday September 07 2017, @05:27PM (#564662) Journal

        Alexa, play Nickelback.

        I'm sorry Dave, I'm afraid I can't do that.

      • (Score: 4, Funny) by VLM on Thursday September 07 2017, @06:47PM

        by VLM (445) on Thursday September 07 2017, @06:47PM (#564708)

        I've had the misfortune of being rickrolled by IRL meatspace "friends" who yell "Alexa play never gonna give you up" thru the window and run like hell.

  • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @02:07PM (7 children)

    by Anonymous Coward on Thursday September 07 2017, @02:07PM (#564559)

    Hacks are often caused by our own stupidity

    Is this true?

    • (Score: 2, Funny) by Anonymous Coward on Thursday September 07 2017, @03:23PM

      by Anonymous Coward on Thursday September 07 2017, @03:23PM (#564605)

      I don't know, ask Alexa.

    • (Score: 2) by mcgrew on Thursday September 07 2017, @04:21PM (5 children)

      by mcgrew (701) <publish@mcgrewbooks.com> on Thursday September 07 2017, @04:21PM (#564634) Homepage Journal

      Hacks are a problem because programmers aren't smart enough to write good code. That "every program bigger than 'hello world' is buggy" is rank bullshit.

      --
      mcgrewbooks.com mcgrew.info nooze.org
      • (Score: 3, Insightful) by bob_super on Thursday September 07 2017, @06:50PM (4 children)

        by bob_super (1357) on Thursday September 07 2017, @06:50PM (#564711)

        Writing good safe code takes time and planning.
        We're shipping next week, push anything not critical into the next OTA update.

        • (Score: 2) by JoeMerchant on Friday September 08 2017, @12:13PM (2 children)

          by JoeMerchant (3937) on Friday September 08 2017, @12:13PM (#565031)

          This particular hack is dependent on a lack of diversity to make it work. If different models of phone used slightly different front ends with different sampling rates and cutoff frequencies, then the hack would have to be tailored to each target. Instead, industry has settled on a homogeneous solution, and therefore the exploit works everywhere.

          It's like planting a field with one variety of corn - if a blight hits the field, it can take out the entire crop, and quickly, spreading from neighboring plant to neighboring plant like fire in dry grass. If, instead, the field is planted with diverse crops, or even diverse types of corn, the blight might never spread from the first plant it infects, since it is surrounded by plants that are resistant to that particular - finely tuned, highly infectious to one type of corn - blight.

          --
          🌻🌻 [google.com]
          • (Score: 2) by bob_super on Friday September 08 2017, @04:22PM (1 child)

            by bob_super (1357) on Friday September 08 2017, @04:22PM (#565176)

            I'm gonna have to disagree.
            It's not lack of diversity, in this particular case. It's about convenience. Voice authentication is hard, and very sensitive microphones picking up as much frequency as possible can help.
            If the customer has to repeat orders in the exact same voice and frequency as they did during setup, they'll get rid of the useless invasive toy, which is not good for the ecosystem behind it.

            The easy answer is to not bother with safety, to save design/debug time and to make it convenient (the MS school of design). And if it turns out that orders can be processed at frequencies the human throat cannot generate, they can always get back to that with an update later.
            Lack of diversity? Yay for competition! Nobody has time to do it right as they try to leapfrog each other.

            • (Score: 2) by JoeMerchant on Friday September 08 2017, @09:36PM

              by JoeMerchant (3937) on Friday September 08 2017, @09:36PM (#565338)

              "As much frequency as possible" doesn't have to stop at any particular ceiling, some can go to 38KHz, others to 39KHz, 44KHz, or even 60KHz if they wanted to. Even a small difference in sampling rate would distort this exploit to the point that it wouldn't work - it's dependent on the aliasing to always be at exactly the same frequency. A difference of 500Hz would make the aliased voice unintelligible.

              --
              🌻🌻 [google.com]
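
              A minimal sketch of that point (the sample rates and the 42 kHz tone are made-up numbers, not figures from the paper): where an out-of-band tone folds down to after sampling depends entirely on the converter's rate, so the same ultrasonic payload lands on different audible frequencies on different front ends.

                  def aliased_frequency(f_in: float, fs: float) -> float:
                      """Frequency an undersampled tone folds down to (standard folding rule)."""
                      f = f_in % fs
                      return fs - f if f > fs / 2 else f

                  tone = 42_000.0                            # hypothetical ultrasonic component
                  for fs in (44_100.0, 44_600.0, 48_000.0):  # a 500 Hz difference, then a bigger one
                      print(f"fs = {fs:7.0f} Hz -> folds to {aliased_frequency(tone, fs):6.0f} Hz")
                  # 44100 -> 2100 Hz, 44600 -> 2600 Hz, 48000 -> 6000 Hz: a payload tuned
                  # for one sample rate comes out shifted (and garbled) on another.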
        • (Score: 2) by mcgrew on Saturday September 09 2017, @02:20PM

          by mcgrew (701) <publish@mcgrewbooks.com> on Saturday September 09 2017, @02:20PM (#565653) Homepage Journal

          Indeed. You have fast, cheap, or quality. You can have any two of them but never all three.

          --
          mcgrewbooks.com mcgrew.info nooze.org
  • (Score: 2) by hemocyanin on Thursday September 07 2017, @02:14PM (13 children)

    by hemocyanin (186) on Thursday September 07 2017, @02:14PM (#564566) Journal

    Suffice to say, it works even if the attacker has no device access and the owner has taken the necessary security precautions.

    What does this mean -- I RTFAed but this is not explained. Maybe it is because I woke up 10 minutes ago but it isn't making sense to me.

    • (Score: 2) by hemocyanin on Thursday September 07 2017, @02:15PM

      by hemocyanin (186) on Thursday September 07 2017, @02:15PM (#564567) Journal

      Apparently can't close tags either.

    • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @02:20PM (3 children)

      by Anonymous Coward on Thursday September 07 2017, @02:20PM (#564573)

      There is something marketing-like about this article, I'm not quite sure for what. It is like that Geico commercial where claims are placed in strange places: "Did you know scientists say it snows on the moon now? In the future maybe people will be making snowmen on the moon and Geico is a great company." And it worked apparently, because I remembered which company it was.

      • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @05:01PM (2 children)

        by Anonymous Coward on Thursday September 07 2017, @05:01PM (#564647)

        I think it's because people seriously think there's a cute girl in their phone responding to their every command. Some voice assistants are empowered women as well with authorization to report sexual harassment. If I'm doing my psychology correctly, the little woman in the phone responding to somebody else's voice that you can't hear must be akin to a strange man whispering in one's wife's ear.

        At least, that may be true if you're a heterosexual man. I'm not one of those so what the fuck do I know.

        I just want a butler. I want a loyal manservant I can confidently and implicitly trust with even the most confidential matters. I don't know why anybody would want a woman for a servant. As far as sexual harassment, I don't know why anybody would view their faithful servant as a sexual object. The power dynamics are all fucked up. If I wanted a boyfriend, I'd go to bars more often and meet people. I want a butler, just a butler.

        Circle of protection: Freud. Sometimes a cigar is just a cigar, even if you're a person of indeterminate gender who prefers dating men.

        • (Score: 2) by Arik on Friday September 08 2017, @12:17AM (1 child)

          by Arik (4543) on Friday September 08 2017, @12:17AM (#564848) Journal
          "I think it's because people seriously think there's a cute girl in their phone responding to their every command. Some voice assistants are empowered women as well with authorization to report sexual harassment. If I'm doing my psychology correctly, the little woman in the phone responding to somebody else's voice that you can't hear must be akin to a strange man whispering in one's wife's ear."

          It's just creepy.

          "At least, that may be true if you're a heterosexual man. I'm not one of those so what the fuck do I know."

          I'm thinking 'heterosexual men' is still too wide a category for what you're actually thinking of here, but go on.

          "I just want a butler. I want a loyal manservant I can confidently and implicitly trust with even the most confidential matters."

          That may be both too much and too little to ask for. Butlers are very complicated entities, but complications are the natural enemies of trust.

          I had a friend who grew up in a house full of servants. When I first heard that, my eyes went wide; it seemed so very cool. But it turned out she hated it, because she never had any privacy. The servants were, in a sense, the masters - everyone living in the house, it seems, lived in fear of doing something that the servants would find amusing enough to repeat...

          --
          If laughter is the best medicine, who are the best doctors?
          • (Score: 0) by Anonymous Coward on Friday September 08 2017, @01:09PM

            by Anonymous Coward on Friday September 08 2017, @01:09PM (#565065)

            LOOK AT ME!!!!!!!!!!!!!!!!!!!!!!!1111111111

    • (Score: 5, Informative) by Hyperturtle on Thursday September 07 2017, @02:51PM

      by Hyperturtle (2824) on Thursday September 07 2017, @02:51PM (#564588)

      It means that an attacker can play commands from his boombox at loud volume and no one can hear them, because they are ultrasonic, yet even a locked car will hear them through the windows. Home control devices, smart TVs, phones, Alexas, iDevices, etc. are all vulnerable, whether administratively secured by the user or completely locked down by a vendor with a microphone that cannot be disabled without a knife.

      "necessary security precautions" are common tasks end users can do, like changing default passwords that have nothing to do with this, because it's not about admin rights necessarily, but nefarious commands that are already permitted coming in "silently" via unknown sources.

      A Twitch game stream, or a YouTube video, etc., can easily be created to play out these commands and do significant harm to devices within audible range of the speakers, without the user even being aware because their focus is on the video they are watching.

      This really isn't too different from marketers using the same functionality on smart TVs to determine what users are listening to and what their hardware is, since modern phones listen for this as a feature.

      The real news here is that non-licensed and non-business partners can make use of it, making a feature into a 'known issue' that is bad because the wrong people are profiting from it. But that's not news; that was predictable.

    • (Score: 2) by mcgrew on Thursday September 07 2017, @04:24PM (6 children)

      by mcgrew (701) <publish@mcgrewbooks.com> on Thursday September 07 2017, @04:24PM (#564637) Homepage Journal

      I read a different article. It said the attacker would have to be in the same room with you.

      This hack would be trivial to defeat, just limit the microphone's high end frequency response to the range of human hearing. Problem solved, it would only take a single coil or capacitor.

      --
      mcgrewbooks.com mcgrew.info nooze.org
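
      As a sketch of that suggestion (component values below are purely illustrative, not from any real device): a single-pole RC low-pass rolled off just above the audible band.

          import math

          R = 1_000      # ohms
          C = 8.2e-9     # farads (8.2 nF); both values picked for illustration
          f_cutoff = 1 / (2 * math.pi * R * C)
          print(f"-3 dB point: {f_cutoff / 1000:.1f} kHz")   # ~19.4 kHz, just inside human hearing

      A single RC pole rolls off at only 6 dB per octave, though, so the ultrasonic content is attenuated rather than removed, and the filtering has to sit ahead of any stage that can go nonlinear (see the mixing discussion further down).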
      • (Score: 1) by Tara Li on Thursday September 07 2017, @05:13PM (5 children)

        by Tara Li (6248) on Thursday September 07 2017, @05:13PM (#564655)

        Or it could even be done at the cloud level, since the devices are doing no speech recognition of their own - they just ship the data off to the cloud, and get a data stream in return to be played. I expect the devices *could* have been implemented in the 80386 days, honestly.

        • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @05:50PM (1 child)

          by Anonymous Coward on Thursday September 07 2017, @05:50PM (#564673)

          Which makes me wonder why they didn't get this security fix for free by sampling at a rate that wouldn't include those frequencies, i.e. capture a 40 kHz stream so everything above 20 kHz has to be filtered out to avoid aliasing artifacts.... Are they sending 48 or 96 kHz just so this hack works?

          • (Score: 3, Funny) by bob_super on Thursday September 07 2017, @06:54PM

            by bob_super (1357) on Thursday September 07 2017, @06:54PM (#564713)

            Wouldn't want to miss some of the ultrasonic audio processing which tells them whether you're banging or just murdering someone.
            You don't want to accidentally get bleach when you need acid.

        • (Score: 3, Informative) by VLM on Thursday September 07 2017, @06:54PM (2 children)

          by VLM (445) on Thursday September 07 2017, @06:54PM (#564714)

          The game that's being played is non-linear mixing, so the cloud won't help. The problem is the mic and preamp before the cloud hears it.

          So you feed less than 10 volts of 42 KHz and 44 KHz ultrasound thru a top quality audio mixing board and you get ... 10 volts of 42 KHz and 44 KHz at the output. Very linear. Not a peep at 2 KHz even though 10 volts is over spec a bit.

          Anything non-linear, like a preamp running right at the ragged edge, will result in some level of mixing products being generated, so 10 volts of 42 KHz and 10 volts of 44 KHz in the preamp of an Alexa, given that Alexa isn't a studio quality ultra high linearity mic and soundboard, will result in a horrendous distorted mix of 42 KHz, 44 KHz, and 2 KHz (and also 86 KHz, and harmonics...)

          Kinda like an audio amp driven into distortion often (not always) blows the tweeters not the woofers.
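
          A minimal sketch of that mixing effect (numbers are illustrative; numpy assumed): two ultrasonic tones through a perfectly linear stage produce nothing audible, but add a small square-law term and a 44 - 42 = 2 kHz difference product appears right in the voice band.

              import numpy as np

              fs = 400_000
              t = np.arange(0, 0.02, 1 / fs)
              x = np.sin(2 * np.pi * 42_000 * t) + np.sin(2 * np.pi * 44_000 * t)

              linear = x                     # ideal mixing board: no new frequencies
              nonlinear = x + 0.1 * x ** 2   # preamp near its limits: square-law term

              freqs = np.fft.rfftfreq(len(t), 1 / fs)
              band = (freqs > 1_500) & (freqs < 2_500)   # look around 2 kHz
              for name, sig in (("linear", linear), ("nonlinear", nonlinear)):
                  peak = np.abs(np.fft.rfft(sig))[band].max()
                  print(f"{name:9s}: peak near 2 kHz = {peak:.1f}")
              # Only the nonlinear version shows a 2 kHz line (plus sum products and
              # harmonics up around 84-88 kHz), exactly where a voice assistant listens.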

          • (Score: 1) by Tara Li on Thursday September 07 2017, @07:09PM (1 child)

            by Tara Li (6248) on Thursday September 07 2017, @07:09PM (#564722)

            Soooo... They're getting some kind of harmonic that just happens to be in the voice range?

            • (Score: 2) by VLM on Friday September 08 2017, @01:27AM

              by VLM (445) on Friday September 08 2017, @01:27AM (#564871)

              exactly yes. nonlinearity in the analog stuff makes a nice mixer... Most electronics are very linear until they aren't (at high levels or whatever)

  • (Score: 2, Insightful) by Anonymous Coward on Thursday September 07 2017, @02:20PM (5 children)

    by Anonymous Coward on Thursday September 07 2017, @02:20PM (#564572)

    Why is audio outside some ranges accepted? Throw all the input through some bandpass filter (not really rocket science) before parsing it and this should not be an issue any more.
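
    A minimal sketch of that idea in the digital domain (band edges and filter order are illustrative assumptions; scipy assumed): design a speech-band band-pass and check how hard it attenuates an ultrasonic tone compared with a 1 kHz one.

        import numpy as np
        from scipy.signal import butter, sosfreqz

        fs = 96_000
        sos = butter(6, [300, 8_000], btype="bandpass", fs=fs, output="sos")

        for f in (1_000, 20_000, 30_000):             # voice, edge of hearing, ultrasonic
            w, h = sosfreqz(sos, worN=[f], fs=fs)
            print(f"{f:6d} Hz: {20 * np.log10(abs(h[0])):7.1f} dB")
        # The 1 kHz tone passes essentially untouched, while anything ultrasonic is
        # attenuated far below what a recognizer would treat as speech.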

    • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @03:58PM (2 children)

      by Anonymous Coward on Thursday September 07 2017, @03:58PM (#564618)

      That's one question, but the other question is why voice recognition is even enabled without the user specifically activating it.

      I used to have a heart rate monitor that would use these weird clicks and buzzes to send information to my computer, but those were very much within the range I could hear, even if they were somewhat unpleasant to listen to. This just smacks of the kind of laziness and carelessness that comes from allowing companies to require blanket waivers to cover their incompetence and sloth.

      • (Score: 1, Informative) by Anonymous Coward on Thursday September 07 2017, @04:57PM

        by Anonymous Coward on Thursday September 07 2017, @04:57PM (#564645)

        why voice recognition is even enabled without the user specifically activating it.

        Um ... some of these require that it be enabled. But all of these are specifically designed to work with voice commands. That's their gig.

      • (Score: 1, Informative) by Anonymous Coward on Thursday September 07 2017, @05:08PM

        by Anonymous Coward on Thursday September 07 2017, @05:08PM (#564652)

        Why are we even bothering with voice recognition when we have no well tested and proven technology to recognize whose voice it is?

        I'm just skimming because I'm bored and this subject is such a bore, so maybe I missed the oblig XKCD. Alexa, order two tons of creamed corn. Alexa, confirm purchase. [xkcd.com]

    • (Score: 2) by ese002 on Thursday September 07 2017, @05:49PM (1 child)

      by ese002 (5306) on Thursday September 07 2017, @05:49PM (#564672)

      Why is audio outside some ranges accepted? Throw all the input through some bandpass filter (not really rocket science) before parsing it and this should not be an issue any more.

      Naturally, because this sort of attack was not considered, and deliberately reducing functionality in the face of not-fully-known usage is seldom a good use of resources.

      However, people have been talking for quite a while about how voice controlled systems should be trained to recognize authorized users and not respond to other voices. If that had actually been done, this "discovery" would have been a no-op, assuming the authorized user isn't a gerbil.

      • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @11:26PM

        by Anonymous Coward on Thursday September 07 2017, @11:26PM (#564822)

        Yeah. Combine your comment with the AC's comment, currently directly above yours, concerning XKCD (I remembered that one too) and you quickly see just how low-tech an exploit could be mounted against this crappy half-done "design".

        Add in the Little Bobby Tables comment.
        Now, add in the bandpass filters already mentioned.

        The level of "engineering" in this concept is severely lacking.
        Did these "engineers" never before encounter the concept of "what if"?
        How about "design review"?

        -- OriginalOwner_ [soylentnews.org]

  • (Score: 5, Insightful) by ledow on Thursday September 07 2017, @03:05PM (10 children)

    by ledow (5567) on Thursday September 07 2017, @03:05PM (#564595) Homepage

    Such systems were incredibly stupid from day one anyway.

    Because - NO MATTER WHAT PEOPLE CLAIM - your voice is not a credential. It's a "username" at best, i.e. indicating who you are intending to authenticate as. At no point does any voice authentication happen (or can happen reliably), and at no point are the commands you send to those devices requesting knowledge of who the user is. Literally it's a free-for-all, open command prompt, to anyone, anywhere within audible range.

    What's to stop someone issuing commands to make your device browse nefarious websites that will get you flagged for illegal acts? Nothing. Absolutely nothing beyond saying "Siri, please read out www.whatever.com".

    The idiots who buy these things, put them in their living room, have them listening 24/7 for ANYTHING THAT SOUNDS LIKE A COMMAND and acting upon it, are morons. Especially those people who tie them into home control, etc. devices, purchase accounts, and so on.

    You might as well just leave a computer logged in and automatically accepting any Bluetooth request from any device, and let your neighbour type on it from their living room.

    Even if they put an ultrasonic filter on, these things are STILL STUPID. A recording of your voice or even ANY voice, is enough to perform commands. Like the XBox adverts that powered off everyone's Xbox's and all the other problems - voice control is stupid, pointless and downright dangerous out of literal toy applications.

    • (Score: 2) by DannyB on Thursday September 07 2017, @03:44PM (5 children)

      by DannyB (5839) Subscriber Badge on Thursday September 07 2017, @03:44PM (#564611) Journal

      voice control is stupid, pointless and downright dangerous out of literal toy applications.

      Once upon a time I thought of voice control as fantastically convenient. That was in an era when the PowerMac had crude voice recognition, scotch taped to AppleScript actions in the voice commands folder. The AppleScript could then interoperate with X10 software controlling X10 devices. (Yes, this was the mid 1990's.) It seemed so amazingly cool. Except the voice recognition didn't work very well. You had to hold the microphone near your mouth. It was not tolerant of any kind of background noise. Etc.

      Voice control seemed great on Star Trek: The Next Generation. (ST:TNG) They never seemed to address the issue that someone could simulate someone else's voice and give commands. (Destruct sequence 1, code 1-1 A) In an original 1960's Star Trek series episode (ST:TOS), "A Taste of Armageddon", on the Enterprise, Spock realizes that a command he is receiving from Kirk could not be true, yet his equipment could detect that it wasn't produced by a "voice synthesizer", whatever that means.

      I suppose on ST:TNG, the computer would realize the actual whereabouts of the person whose voice it is recognizing. Or it wouldn't care what the voice sounds like, but could identify which exact person in a room is the speaker giving the command, and identify them as a person authorized to do so.

      You might as well just leave a computer logged in and automatically accepting any Bluetooth request from any device, and let your neighbour type on it from their living room.

      In the meantime, I definitely won't use bluetooth on that logged in computer on my back porch.

      Better yet, why not just put the Alexa or Google Home (or both!) on the back porch. For your convenience.

      --
      When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
      • (Score: 2) by tangomargarine on Thursday September 07 2017, @05:09PM (3 children)

        by tangomargarine (667) on Thursday September 07 2017, @05:09PM (#564653)

        Voice control seemed great on Star Trek: The Next Generation. (ST:TNG) They never seemed to address the issue that someone could simulate someone else's voice and give commands. (Destruct sequence 1, code 1-1 A)

        Wasn't there an episode where Data went crazy and took over the ship using Picard's command codes? Then he locked them out with like a 30-character password and everybody was flummoxed.

        here we go [wikipedia.org]

        I mean, voice recognition and the command codes was a two-factor system; it's just that the codes themselves are super short and saying them out loud to authenticate doesn't beat the eavesdropper test.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 3, Informative) by DannyB on Thursday September 07 2017, @05:30PM (2 children)

          by DannyB (5839) Subscriber Badge on Thursday September 07 2017, @05:30PM (#564663) Journal

          The three factors for authentication:
          1. Something you know. (eg, a code, a password, a PIN, an algorithm)
          2. Something you have. (eg, a metal key, a credit card, a code-generating key-fob, a mobile phone)
          3. Something you are. (eg, your fingerprint, your voice, your retina scan, your blood, semen, DNA)
          (If you know of anything more than these three -- please publish at once and become famous!)

          Two factor authentication uses any two from the above list.

          The failure in the ST:TNG episode "Brothers" is that the system probably should have verified factor 3, something you are. It should be positive that the actual authentic person is giving the command (authentication), and then verify they are authorized to give that command (authorization).

          If Cmdr Data could simply use Picard's voice as the "something you are" factor, and recite the command codes "something you know", then there are two failures here.
          1. A voice that sounds like Picard, is not a very good "something you are" test.
          2. Anything that is a "something you know" test should never be stated aloud in an episode. Because now the TV audience knows.

          Aside . . . remember back to the 1970's, a TV series called "Space 1999"? Remember Barbara Bain's monotone emotionless "acting"? I suspect THAT is where they got the idea for Cmdr Data!

          --
          When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
          • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @11:32PM

            by Anonymous Coward on Thursday September 07 2017, @11:32PM (#564826)

            Hock a loogie onto the machine and let it sequence your DNA.

            -- OriginalOwner_ [soylentnews.org]

          • (Score: 2) by Pslytely Psycho on Friday September 08 2017, @12:06PM

            by Pslytely Psycho (1218) on Friday September 08 2017, @12:06PM (#565030)

            "3. Something you are. (eg, your fingerprint, your voice, your retina scan, your blood, semen, DNA"

            Plot-lines of so very many movies and TV shows, Salt, SNG, Minority Report, Ultraviolet, Two Broke Girls (Beth Behrs did a lot of semen testing), Alien Resurrection's breath analyzer or GATTACA to cover most of them at once.
            Yes, I watch a lot of bad movies and your post brought examples bubbling to the surface. Especially when you mentioned Barbara Bain, man, by comparison she made Data and Spock look like emotional wrecks! I think she may have been an actual robot....

            I raised my children on MST3K. I am a horrible person with neither culture or taste.

            --
            Alex Jones lawyer inspires new TV series: CSI Moron Division.
      • (Score: 2) by bob_super on Thursday September 07 2017, @07:00PM

        by bob_super (1357) on Thursday September 07 2017, @07:00PM (#564720)
    • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @04:03PM (2 children)

      by Anonymous Coward on Thursday September 07 2017, @04:03PM (#564624)

      Everyone knows this works well in practice. See Exhibit A [youtube.com].

      • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @05:26PM (1 child)

        by Anonymous Coward on Thursday September 07 2017, @05:26PM (#564660)

        I was thinking of the Voice Print Identification scene in 2001: A Space Odyssey, but I can't find the scene on Youtube.

        (I assume Voice Print Identification™ is a proper noun.)

        Hmm... since I'm pissed off at the owner of Youtube, Google, and Google's owner, Alphabet, anyway, I was thinking it might be nice if I could just run mencoder, snip out the part of 2001 I want to highlight/share, put it on some hybrid Freenet/torrent type service (because I'm not going to seed this stupid clip until the end of time, and I don't know why anybody would seed it in the manner one seeds torrents--Freenet's model strikes me as more appropriate), and paste a link that'll look an awful lot like a magnet link here.

        I think I'll be off now to familiarize myself more with Freenet, since I haven't done anything much more than running a node at home. We need a good, informal, reliable, distributed way to share content without need of the services of megacorps that are involved in the tawdry business of slandering and libeling entire demographics and professions while making bank off flouting our privacy and freedoms.

        (Maybe Freenet itself is sufficient except for my ignorance??? Guess I'll find out.)

        • (Score: 2) by DannyB on Thursday September 07 2017, @05:37PM

          by DannyB (5839) Subscriber Badge on Thursday September 07 2017, @05:37PM (#564665) Journal

          I was thinking of the Voice Print Identification scene in 2001: A Space Odyssey

          When Dr. Heywood Floyd first arrives at the orbiting wheel space station?

          I'm thinking of the voice test in the ST:TOS episode "The Conscience of the King" [wikia.com]. An actor suspected of being a war criminal is asked to do a voice comparison test. Captain Kirk says, try to disguise your voice, it won't matter. There will be no doubt. Etc.

          --
          When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
    • (Score: 3, Interesting) by nobu_the_bard on Thursday September 07 2017, @04:22PM

      by nobu_the_bard (6373) on Thursday September 07 2017, @04:22PM (#564635)

      Heh. It doesn't even matter if people are abusing it intentionally...

      The other day Google overheard something we were talking about when a friend left their phone on the table and walked away. The phone apparently misinterpreted something we said as something like "Google, read my latest email" and read a very personal email out loud to us. We were very surprised and confused (nobody had even noticed the phone was there or that it was accepting voice commands); it was pretty embarrassing for everyone...

      *Note: I only think it was Google. I didn't hear the first part and the friend quickly stuffed the phone into his pocket and we all silently decided to pretend it didn't happen.

  • (Score: 2) by DannyB on Thursday September 07 2017, @03:33PM

    by DannyB (5839) Subscriber Badge on Thursday September 07 2017, @03:33PM (#564608) Journal

    Imagine that there is a box. Think outside the box.

    Consider that some number of Alexa devices interact with home automation equipment. "Alexa, turn on the living room lights."

    Idea: "Alexa, turn off all the lights in the world."

    --
    When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
  • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @03:44PM (4 children)

    by Anonymous Coward on Thursday September 07 2017, @03:44PM (#564610)

    Maybe these kind of attacks could be prevented with a Deafening Observant Garbler, which detects the ultrasound and prevents voice recognition by generating loud noises.

    • (Score: 1, Informative) by Anonymous Coward on Thursday September 07 2017, @04:00PM (3 children)

      by Anonymous Coward on Thursday September 07 2017, @04:00PM (#564619)

      Or just filter everything out that isn't in the range people can hear. There's no legitimate reason for these devices to be able to hear ultrasonic frequencies anyways.

      • (Score: 1, Insightful) by Anonymous Coward on Thursday September 07 2017, @04:10PM (2 children)

        by Anonymous Coward on Thursday September 07 2017, @04:10PM (#564627)

        What makes you think this is not intentional? Apply Hanlon's Razor if you like, but in these times I prefer to go with a much sharper instrument: tinfoil.

        • (Score: 1, Informative) by Anonymous Coward on Thursday September 07 2017, @05:02PM

          by Anonymous Coward on Thursday September 07 2017, @05:02PM (#564648)

          What makes you think this is not intentional?

          You mean it may be by design? [arstechnica.com]

        • (Score: 3, Funny) by Osamabobama on Thursday September 07 2017, @09:20PM

          by Osamabobama (5842) on Thursday September 07 2017, @09:20PM (#564764)

          I prefer to go with a much sharper instrument: tinfoil

          I think for ultrasonic sound, you would get better attenuation with felt or a thin layer of foam.

          --
          Appended to the end of comments you post. Max: 120 chars.
  • (Score: 3, Informative) by DannyB on Thursday September 07 2017, @04:16PM

    by DannyB (5839) Subscriber Badge on Thursday September 07 2017, @04:16PM (#564631) Journal

    On Hacker News, there is a link to this PDF [endchan.xyz] which describes the attack in more detail.

    --
    When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
  • (Score: 2) by xorsyst on Friday September 08 2017, @08:41AM

    by xorsyst (1372) on Friday September 08 2017, @08:41AM (#564985)

    Suffice to say, it works even if the attacker has no device access and the owner has taken the necessary security precautions.

    Surely just disabling implicit voice commands is a sufficient security precaution? On my android phone it only listens if I press the microphone button. "Ok google" doesn't trigger it.

  • (Score: 2) by KritonK on Tuesday September 12 2017, @10:06AM

    by KritonK (465) on Tuesday September 12 2017, @10:06AM (#566658)

    Why bother with ultrasound, if all one needs to open your door is to go to your house and say "Alexa/Siri, open the door"?

    Oh, and citizen, if you ever VISIT PROSCRIBEDSITE.GOV, you'll be in deep trouble—hey, is that site on your browser what I think it is?
