

posted by martyb on Thursday May 31 2018, @07:42AM   Printer-friendly
from the Don't-be-Evil? dept.

Google promises ethical principles to guide development of military AI

Google is drawing up a set of guidelines that will steer its involvement in developing AI tools for the military, according to a report from The New York Times. What exactly these guidelines will stipulate isn't clear, but Google says they will include a ban on the use of artificial intelligence in weaponry. The principles are expected to be announced in full in the coming weeks. They are a response to the controversy over the company's decision to develop AI tools for the Pentagon that analyze drone surveillance footage.

[...] But the question facing these employees (and Google itself) is: where do you draw the line? Does using machine learning to analyze surveillance footage for the military count as "weaponized AI"? Probably not. But what if that analysis informs future decisions about drone strikes? Does it matter then? How would Google even know if this had happened?

Also at VentureBeat and Engadget.

Previously: Google vs Maven
Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War"
About a Dozen Google Employees Have Resigned Over Project Maven


Original Submission

  • (Score: 2) by Hartree (195) on Thursday May 31 2018, @09:53PM (#686944)

    Can't blame anything on those googlers. Their company has a policy that says they won't do evi... Oh wait. They removed that. We meant we won't do bad things for the military. Just don't notice the cloud computing contracts we're bidding on for DOD and the intel agencies. It's just computing. We can't know what they're doing with it. It's all... Cute kitten pics. Yeah, that's it. And we certainly wouldn't use AI for combat. We just make AI to play extremely realistic combat simulatio... I mean games! We certainly can't control what others might do with our completely innocent creations, like hooking them up to a remote control firing station mounted on an autonomous vehicle.

    This is why, in a previous article on Soylent, I referred to it as peace-washing, or preaching celibacy in a brothel.

    I understand why people might have reservations about doing such work. I personally don't (I was in the military for a dozen years). It's a moral question that people have to decide for themselves, and I don't minimize it. But this smacks to me of trying to have it both ways: working for what is effectively a military contractor while still being able to rationalize it away because of a "policy".
