
SoylentNews is people

posted by martyb on Thursday May 31 2018, @09:55PM   Printer-friendly
from the OK-Google,-open-the-pod-bay-doors dept.

Google Assistant fired a gun: We need to talk

For better or worse, Google Assistant can do it all. From mundane tasks like turning on your lights and setting reminders to convincingly mimicking human speech patterns, the AI helper is so capable it's scary. Its latest (unofficial) ability, though, is a bit more sinister. Artist Alexander Reben recently taught Assistant to fire a gun. Fortunately, the victim was an apple, not a living being. The 30-second video, simply titled "Google Shoots," shows Reben saying "OK Google, activate gun." Barely a second later, a buzzer goes off, the gun fires, and Assistant responds "Sure, turning on the gun." On the surface, the footage is underwhelming -- nothing visually arresting is really happening. But peel back the layers even a little, and it's obvious this project is meant to provoke a conversation on the boundaries of what AI should be allowed to do.

As Reben told Engadget, "the discourse around such a(n) apparatus is more important than its physical presence." For this project he chose to use Google Assistant, but said it could have been an Amazon Echo "or some other input device as well." At the same time, the device triggered "could have been a back massaging chair or an ice cream maker."

But Reben chose to arm Assistant with a gun. And given the concerns raised by Google's Duplex AI since I/O earlier this month, as well as the seemingly never-ending mass shootings in America, his decision is astute.

"OK Google, No more talking." / "OK Google, No more Mr. Nice Guy." / "OK Google, This is America." / "OK Google, [Trigger word]."
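Mechanically, the rig is little more than a recognized phrase wired to a switch closure: Assistant "turns on" a device, and the device happens to be a solenoid on a trigger. A minimal sketch of that phrase-to-relay pattern follows; every name in it is invented for illustration, and a real build would more likely wire Assistant to an off-the-shelf smart plug than to custom code like this.

```python
# Illustrative sketch only: map a recognized voice phrase to a relay toggle.
# All names here are hypothetical, not Google Assistant APIs.

COMMANDS = {}

def on(phrase):
    """Register a handler for a recognized phrase."""
    def register(fn):
        COMMANDS[phrase.lower()] = fn
        return fn
    return register

class Relay:
    """A switchable load; the 'gun' is interchangeable with any device."""
    def __init__(self, name):
        self.name = name
        self.closed = False

    def close(self):  # energize the coil: the attached device turns on
        self.closed = True
        return f"Sure, turning on the {self.name}."

trigger_relay = Relay("gun")  # could as easily be a lamp or an ice cream maker

@on("activate gun")
def activate(_utterance):
    return trigger_relay.close()

def handle(utterance):
    """Strip the wake phrase and dispatch to a registered command, if any."""
    key = utterance.lower().removeprefix("ok google, ").strip()
    fn = COMMANDS.get(key)
    return fn(utterance) if fn else "Sorry, I don't understand."
```

The sketch makes the same point Reben does: nothing in the dispatch layer knows or cares what the relay is connected to.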


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by Freeman on Thursday May 31 2018, @10:03PM (10 children)

    by Freeman (732) on Thursday May 31 2018, @10:03PM (#686947) Journal

    Ok, Google, Make me a sandwich.

    --
    Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 3, Touché) by Anonymous Coward on Thursday May 31 2018, @10:14PM (3 children)

      by Anonymous Coward on Thursday May 31 2018, @10:14PM (#686955)

      Ok, Google, Make me a sandwich.

      Dear Freeman you may now identify as a sandwich. -- Google

      • (Score: 2) by Freeman on Thursday May 31 2018, @11:03PM (2 children)

        by Freeman (732) on Thursday May 31 2018, @11:03PM (#686974) Journal

        The AI needs much improving.

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 4, Funny) by Hartree on Thursday May 31 2018, @10:26PM (2 children)

      by Hartree (195) on Thursday May 31 2018, @10:26PM (#686961)

      "Ok, Google, Make me a sandwich."

      Uh... Google, just why are you chasing me with a carving knife and a loaf of bread?

      • (Score: 2) by Freeman on Thursday May 31 2018, @11:10PM (1 child)

        by Freeman (732) on Thursday May 31 2018, @11:10PM (#686977) Journal

        This AI needs to be 3 laws compliant. This is what we get when we aren't 3 laws compliant.

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 4, Funny) by PartTimeZombie on Thursday May 31 2018, @11:33PM (1 child)

      by PartTimeZombie (4827) on Thursday May 31 2018, @11:33PM (#686983)
      • (Score: 0) by Anonymous Coward on Friday June 01 2018, @12:59PM

        by Anonymous Coward on Friday June 01 2018, @12:59PM (#687213)

        Shouldn't he ask for the password first?

    • (Score: -1, Troll) by Anonymous Coward on Friday June 01 2018, @04:55AM

      by Anonymous Coward on Friday June 01 2018, @04:55AM (#687098)

      Ok, Google, Make me a jew.

      *Google chases you with a knife and cuts off half your penis and sucks off what is left*

  • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @10:03PM

    by Anonymous Coward on Thursday May 31 2018, @10:03PM (#686948)

    He will swoop in and tell us how embracing Skynet is the best idea since Capitalism!

  • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @10:03PM

    by Anonymous Coward on Thursday May 31 2018, @10:03PM (#686949)

    Doesn't this rig belong in a box with Schrödinger's cat?

  • (Score: 2, Informative) by Anonymous Coward on Thursday May 31 2018, @10:04PM (8 children)

    by Anonymous Coward on Thursday May 31 2018, @10:04PM (#686950)

    So if I rig a gun to a Clapper, do I deserve press coverage as well? Is the "triggering" mechanism really what's important here?

    • (Score: 1, Informative) by Anonymous Coward on Thursday May 31 2018, @10:05PM (3 children)

      by Anonymous Coward on Thursday May 31 2018, @10:05PM (#686951)

      Nope, new inventions these days have to be, "xxx on the Internet" or "yyy in the Cloud".

      • (Score: 3, Touché) by c0lo on Thursday May 31 2018, @10:57PM (1 child)

        by c0lo (156) Subscriber Badge on Thursday May 31 2018, @10:57PM (#686971) Journal

        Nope, new inventions these days have to be, "xxx on the Internet"

        Naw, mate. "xxx on the internet" is as old as the internet.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 2) by MostCynical on Thursday May 31 2018, @11:09PM

          by MostCynical (2589) on Thursday May 31 2018, @11:09PM (#686976) Journal

          But .xxx [wikipedia.org] has only been around since 2011...

          --
          "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
      • (Score: 2) by MichaelDavidCrawford on Friday June 01 2018, @01:49AM

        USB Microphones don't sell.

        --
        Yes I Have No Bananas. [gofundme.com]
    • (Score: 2) by sjames on Thursday May 31 2018, @10:19PM

      by sjames (2882) on Thursday May 31 2018, @10:19PM (#686957) Journal

      For maximum hilarity, it should be rigged to fire twice.

    • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @10:34PM

      by Anonymous Coward on Thursday May 31 2018, @10:34PM (#686963)

      No, just Google Assistants like this one [youtube.com], and the triggering mechanism is known [gizmodo.com].

    • (Score: 2) by urza9814 on Friday June 01 2018, @02:35AM

      by urza9814 (3954) on Friday June 01 2018, @02:35AM (#687038) Journal

      Eh, that's all this demo is, but I think you're missing the bigger picture here.

      You've been able to verbally command a computer to turn on a light or trigger whatever else for decades. The purpose of these new AI assistants is that they bring a lot more of their own decision making capabilities (or at least they're attempting to). If you say "OK Google, fire the gun" and it shoots someone, that's no different from you pulling the trigger. But what if you tell it "OK Google, if any face you don't recognize comes through this door, fire the gun" -- with your intention being to "protect from intruders", which is perfectly legal in many jurisdictions...but then your house catches on fire, the neighbors call the fire department, and your Google gun kills a firefighter. These things are sold as being "intelligent", but they're far from being able to handle the kinds of decisions that people are throwing at them.

      I don't think it's something that really can be prevented, but I do think it's worth discussing in terms of how these things are marketed. The way so many devices are sold these days, they market a dream of what the software might one day be capable of (and they're not always wrong even) but they give people a false impression of its capabilities today. If you tell people it can recognize faces and objects and it can make intelligent decisions, and you don't give the proper context about what the limitations of those abilities are, then it's not entirely unreasonable that someone might think some crazy stunt -- like plugging it in to a gun for a DIY defense turret -- is a reasonable idea.

      I'm sure most of us here have seen some horrors created by newbie developers...now imagine every person alive gets the ability to program nearly any device they own by having the device attempt to parse natural language into code....

      And a big part of our legal system is focused on intent. Can you prove intent if an action was carried out through a poorly programmed smart device? You can kinda prove intent if someone writes a crappy computer program, because they gave step by step instructions...but natural language has a lot more ambiguity and implied meaning. So that could cause some issues too...
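The firefighter scenario above can be made concrete. Here is a deliberately naive sketch (all names invented) of the rule exactly as stated, "unknown face means fire": the system executes the letter of the rule with none of the context a human would assume.

```python
# Illustrative only: a literal-minded trigger rule with no situational context.
# The owner meant "intruders", but the rule as stated never says that.

KNOWN_FACES = {"alice", "bob"}

def decide(face_seen, context=None):
    """Naive rule as stated: unknown face -> fire. Context is ignored."""
    if face_seen not in KNOWN_FACES:
        return "fire"
    return "hold"

decide("charlie")                        # an actual intruder
decide("firefighter_in_breathing_mask")  # the fatal edge case: also "fire"
decide("alice")                          # recognized resident: "hold"
```

The gap between the owner's intent and the executed rule is exactly the ambiguity the parent describes: natural language carries implied exceptions that never make it into the condition.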

    • (Score: 2) by Weasley on Friday June 01 2018, @04:36PM

      by Weasley (6421) on Friday June 01 2018, @04:36PM (#687308)

      I've rigged my gun up to a small metal lever. I call it...The Trigger™. Just waiting for the path to be beaten to my door.

  • (Score: 2) by Snotnose on Thursday May 31 2018, @10:10PM (2 children)

    by Snotnose (1623) on Thursday May 31 2018, @10:10PM (#686952)

    Does Google aim the gun, or just pull the trigger? 50 years ago I could have come up with a machine to do the latter. Doing the former, while pretty neat, is also scary. Just think: a "machine gun" that shoots only 240 rounds/minute, the caveat being that each shot is aimed at a new target before the trigger is pulled, instead of pulling the trigger and spraying.

    --
    When the dust settled America realized it was saved by a porn star.
    • (Score: 1, Informative) by Anonymous Coward on Thursday May 31 2018, @10:53PM

      by Anonymous Coward on Thursday May 31 2018, @10:53PM (#686969)

      https://en.wikipedia.org/wiki/Phalanx_CIWS [wikipedia.org]
      https://www.raytheon.com/capabilities/products/phalanx/ [raytheon.com]

      It does 4,500 rounds per minute. Each outgoing round is individually tracked by radar. The aim is adjusted based on this, compensating for anything that might otherwise cause misses.
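The track-and-correct loop described above is ordinary closed-loop control. A toy one-dimensional sketch (invented names, not Phalanx internals): each observed miss shifts the next aim point by a fraction of the error, so a constant unmodeled bias gets walked out over a few rounds.

```python
# Toy 1-D model of radar-corrected fire: observe each round's miss distance
# and nudge the aim point toward the target before the next shot.

def corrected_aim(target, bias, rounds=5, gain=0.5):
    """Fire `rounds` shots; after each, shift aim by `gain` times the miss."""
    aim = target
    misses = []
    for _ in range(rounds):
        impact = aim + bias       # unmodeled disturbance (wind, wear, ...)
        miss = impact - target    # the radar observes the actual impact point
        misses.append(abs(miss))
        aim -= gain * miss        # correct the next shot toward the target
    return misses

# Successive misses shrink as the loop learns the constant bias:
errors = corrected_aim(target=0.0, bias=2.0)
```

With a constant bias and a gain below 1, each miss is a fixed fraction of the previous one, which is why the errors decay geometrically.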

    • (Score: 2) by JNCF on Thursday May 31 2018, @10:54PM

      by JNCF (4317) on Thursday May 31 2018, @10:54PM (#686970) Journal

      Not aiming, just pulling a trigger. Also, it's obviously an airsoft gun (or something similar). Given that the whole thing is essentially an art piece, the choice to not even use a real gun makes it pretty disappointing. Of course, TFA says nothing about that, cuz acting like it's a real gun gets people to click and share. Bullshit journalism is bullshit. There are much better toy gun turret videos, where the AI actually does aim. First example that popped up in google. [youtu.be]

  • (Score: 3, Interesting) by Nuke on Thursday May 31 2018, @10:37PM (2 children)

    by Nuke (3162) on Thursday May 31 2018, @10:37PM (#686964)

    Is his choice of an apple symbolic in some way? The message would have been clearer and made more of a stir if he had used an iPhone as the victim.

  • (Score: 2) by Nuke on Thursday May 31 2018, @10:41PM (1 child)

    by Nuke (3162) on Thursday May 31 2018, @10:41PM (#686967)

    OK Google, is that a gun in your carry-case or are you just glad to see me?

    • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @10:45PM

      by Anonymous Coward on Thursday May 31 2018, @10:45PM (#686968)

      OK Nuke, is that a Kim Jong-un in your pocket or are you North Korea?

  • (Score: 4, Interesting) by realDonaldTrump on Thursday May 31 2018, @11:08PM (4 children)

    by realDonaldTrump (6614) on Thursday May 31 2018, @11:08PM (#686975) Homepage Journal

    But there were a couple of stories, not long ago, about Google. Let me tell you, for a long time Google didn't want to do work for our military. When President Clinton (Bill), Bush Jr. & Obama were in charge. And the folks at Google said, Google cyber can make our military very powerful. But the guy running our Country -- our President -- is INCOMPETENT. And our cyber, possibly, could be put to very foolish uses. Against our allies, against very nice countries, even against our own people.

    So Google wouldn't take any new military contracts. They bought the #1 military robot company, Boston Dynamics. Ethanol-fueled can tell you all about that. And they worked on the contract that Boston Dynamics already had. The robot dog for the United States Marine Core. Let me tell you, they did a terrible job. They made it VERY LOUD. And that's no good. Because the way to win in war is to be QUIET. Except when you're shooting & bombing. And possibly the bosses at Google thought I'd do a terrible job like my predecessors. So they sold Boston Dynamics to my friends in Japan.

    But now the folks at Google see that I'm doing an incredible job. We have tremendous talent, and our talent and our strength is being respected again. Prosperity is booming, optimism is surging, and America is winning. We truly are Making America Great Again. But most importantly our Country is respected again all over the world. We're not making apologies, we're not making excuses. We're respected again as a Country. No more apologies! Our confidence is back -- I'm the confidence President. And the folks at Google said, "we love President Trump. We love America. And we love money." A lot of love there, believe me. So they signed up with my DOD, my Defense Department. For Project Maven, so important. It's about giving our drones amazing eyesight. Like our beautiful bald eagles, except it's cyber. So our drone pilots can relax. And let the cyber find the targets. Find the sick or bad dudes on my kill list. Amazing cyber, it's going to be a great thing for our Country.

    And the first related story was about how thousands of the Google workers -- the ones who hate our military, who hate our Country, who don't want us to win -- got very energized. And they signed a letter saying, please cancel. Don't do Project Maven. And don't do any projects for the Pentagon, for the Defense Department.

    But -- and this is the 2nd related story -- the big bosses at Google didn't agree. They said, "we love money, we love America, and most of all we love President Trump." God bless them. And about a dozen workers -- could be less -- quit. Thousands signed the letter -- we all love to sign our names, right? -- about a dozen quit. I think we can handle that, right? Even if it's treason -- can we call it treason? Why not?

    And this story, I love to read the articles. But frankly, I didn't read the article. Because it's one guy, it's a lone gunman. It's unofficial, it's not a guy at Google, or of Google. Just a guy who loves to shoot. And he's not even shooting people yet. He shot an Apple with cyber. Every day, so many people get shot, this was an Apple. While, as I said, Google is officially working on cyber eyes for our drones. To drop bombs -- I dropped the biggest bomb since WWII -- on A LOT of bad dudes. And probably some Apples.

    • (Score: 2, Funny) by Anonymous Coward on Thursday May 31 2018, @11:59PM (2 children)

      by Anonymous Coward on Thursday May 31 2018, @11:59PM (#686993)

      Every time I read one of your posts, I am both deeply horrified and...

      Well, actually I'm just deeply horrified.

      This is like season 9 of Dallas, right? I'm going to wake up any minute now, go to work, and then look forward to watching the presidential debate between Bernie Sanders and John Kasich.

      • (Score: 1, Funny) by Anonymous Coward on Friday June 01 2018, @05:25AM (1 child)

        by Anonymous Coward on Friday June 01 2018, @05:25AM (#687111)

        I want to know who/what this is. I mean...
        It almost reads like a (almost) successful attempt at the turning test.

        • (Score: 1, Funny) by Anonymous Coward on Friday June 01 2018, @01:02PM

          by Anonymous Coward on Friday June 01 2018, @01:02PM (#687217)

          It almost reads like a (almost) successful attempt at the turning test.

          “Almost” because of taking the wrong turn?

    • (Score: 0) by Anonymous Coward on Friday June 01 2018, @12:50AM

      by Anonymous Coward on Friday June 01 2018, @12:50AM (#687004)

      realDonaldTrump (6614) [soylentnews.org] wrote [soylentnews.org]:

      But the guy running our Country -- our President -- is INCOMPETENT.

      [...] I'm the confidence President.

      [...] I think we can handle that, right? Even if it's treason -- can we call it treason? Why not?

      Oh, and it isn't "United States Marine Core", it's "United States Marine Corp."

  • (Score: 1, Funny) by Anonymous Coward on Thursday May 31 2018, @11:41PM

    by Anonymous Coward on Thursday May 31 2018, @11:41PM (#686987)

    omg omg omg omg Google GOOgle omg omg omg omg omg omg GoogleGoogle omg omg omg omg omg omg omg GoogGoogle omg omg omg omg omg omg A GUN A GUN IT'S A GUN OH IT'S A GUN omg omg omg omg omg omg

    Your comment violated the "badger" mushroom filter. Try fewer snakes and/or less repetition.

  • (Score: 2) by requerdanos on Friday June 01 2018, @12:48AM (1 child)

    by requerdanos (5997) Subscriber Badge on Friday June 01 2018, @12:48AM (#687003) Journal

    Google Assistant Used to Fire Gun Relay

    There. FTFY.

    Admittedly not nearly as clickbaity.

    • (Score: 2) by bob_super on Friday June 01 2018, @01:08AM

      by bob_super (1357) on Friday June 01 2018, @01:08AM (#687010)

      Google Assistant Used to Fire Gun Relay People.
      Coming Reaal Soon.

  • (Score: 2) by DutchUncle on Friday June 01 2018, @02:19PM

    by DutchUncle (5370) on Friday June 01 2018, @02:19PM (#687246)

    Isaac Asimov's robot stories posited that - through government regulation, or engineers' good sense - *all* robots were based on the Three Laws, the first of which was "A robot may not injure a human being, or, through inaction, allow a human being to come to harm." Many of the stories then dealt with how wrong things could go despite - or because of - the details of each situation in which one tries to follow that apparently simple law. The most basic workaround, discussed between two characters in (I think) "The Naked Sun", involves lack of knowledge and/or lack of context. One robot could be instructed to mix up a poisonous liquid "for use in the garden"; a different robot, unaware of the contents, could be instructed to take the liquid and use it in cooking; a third could serve the poisoned food to humans. Since the Google Assistant has no idea what it is "activating", and what it can do, all of the responsibility is on the human in this story.
