
posted by martyb on Thursday May 31 2018, @07:42AM
from the Don't-be-Evil? dept.

Google promises ethical principles to guide development of military AI

Google is drawing up a set of guidelines that will steer its involvement in developing AI tools for the military, according to a report from The New York Times. What exactly these guidelines will stipulate isn't clear, but Google says they will include a ban on the use of artificial intelligence in weaponry. The principles are expected to be announced in full in the coming weeks. They are a response to the controversy over the company's decision to develop AI tools for the Pentagon that analyze drone surveillance footage.

[...] But the question facing these employees (and Google itself) is: where do you draw the line? Does using machine learning to analyze surveillance footage for the military count as "weaponized AI"? Probably not. But what if that analysis informs future decisions about drone strikes? Does it matter then? How would Google even know if this had happened?

Also at VentureBeat and Engadget.

Previously: Google vs Maven
Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War"
About a Dozen Google Employees Have Resigned Over Project Maven


Original Submission

 
  • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @11:20AM (#686672) (1 child)

    All software is AI (according to today's marketing/definitions). The military invented software; the first computers (both electrical and mechanical) were developed by and for the military. Even the algorithms they ran (cannon heating and projectile accuracy) were developed for the military. Drones have been in use for over 100 years.

    All these things are tools.

    Tools are amoral inanimate objects, like rocks, until someone picks one up and throws it.

  • (Score: 1) by anubi (2828) on Friday June 01 2018, @12:02PM (#687195) Journal

    If a computer is involved, there is likely a "hold harmless" clause in there somewhere. And how do you invalidate one guy's noose without invalidating everyone's noose?

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]