
SoylentNews is people

posted by chromas on Thursday April 05 2018, @01:37PM
from the what-is-GOOG-good-for? dept.

We had submissions from two Soylentils concerning recent employee reaction to Google's participation in the Pentagon's "Project Maven" program:

Google Workers Urge C.E.O. to Pull Out of Pentagon A.I. Project

Submitted via IRC for fyngyrz

Thousands of Google employees, including dozens of senior engineers, have signed a letter protesting the company's involvement in a Pentagon program that uses artificial intelligence to interpret video imagery and could be used to improve the targeting of drone strikes.

The letter [pdf], which is circulating inside Google and has garnered more than 3,100 signatures, reflects a culture clash between Silicon Valley and the federal government that is likely to intensify as cutting-edge artificial intelligence is increasingly employed for military purposes.

Source: https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html

Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War"

Thousands of Google employees have signed a letter protesting the development of "Project Maven", which would use machine learning algorithms to analyze footage from U.S. military drones:

Last month, it was announced that Google was offering its resources to the US Department of Defense for Project Maven, a research initiative to develop computer vision algorithms that can analyze drone footage. In response, more than 3,100 Google employees have signed a letter urging Google CEO Sundar Pichai to reevaluate the company's involvement because "Google should not be in the business of war," as reported by The New York Times.

Work on Project Maven began last April, and while details on what Google is actually providing to the DOD are not clear, it is understood that it's a Pentagon research initiative for improved analysis of drone footage. In a press statement, a Google spokesperson confirmed that the company was giving the DOD access to its open-source TensorFlow software, used in machine learning applications that are capable of understanding the contents of photos.

Previously: Google vs Maven


Original Submission #1 Original Submission #2

This discussion has been archived. No new comments can be posted.
  • (Score: 3, Informative) by The Mighty Buzzard on Thursday April 05 2018, @03:27PM (4 children)

    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday April 05 2018, @03:27PM (#662968) Homepage Journal

    It's only evil if you are killing Good Guys.

    You say it sarcastically but it's true. War is not evil in and of itself. Neither is killing. Unfortunate but sometimes the most moral choice you can make.

    --
    My rights don't end where your fear begins.
  • (Score: 3, Insightful) by JoeMerchant on Thursday April 05 2018, @06:58PM (3 children)

    by JoeMerchant (3937) on Thursday April 05 2018, @06:58PM (#663052)

    War is not evil in and of itself. Neither is killing. Unfortunate but sometimes the most moral choice you can make.

    All true as well. The real problem is that there is no black and white; the good guys are bad and the bad guys are good... and which grey is darker is usually a matter of perspective.

    --
    🌻🌻 [google.com]
    • (Score: 3, Insightful) by frojack on Thursday April 05 2018, @07:41PM (2 children)

      by frojack (1554) on Thursday April 05 2018, @07:41PM (#663074) Journal

      I don't see lack of black and white as the main risk here.

      Once you've decided you are going to go to war with ISIS, that decision is already made.
      From then on the idea is to kill only the ISIS and not the people they were going to shovel into mass graves.
      If the AI can do that, great.

      The problem I see is that Military solutions don't stay in the Military, and pretty soon your local police force is using them.
      The Border Patrol has drones. Ok, maybe they need them. Problem is they loan them out [theatlantic.com] to other agencies.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 5, Interesting) by JoeMerchant on Thursday April 05 2018, @08:11PM

        by JoeMerchant (3937) on Thursday April 05 2018, @08:11PM (#663084)

        I told myself something similar while working for the UAV company: "we're just doing recon over the next hill in the Middle East, so grunts don't have to go look in person." Then we started supporting SWAT raids domestically, but that was O.K. because they were just busting bad guys, or Meth labs, or Marijuana grow houses...

        --
        🌻🌻 [google.com]
      • (Score: 3, Interesting) by urza9814 on Friday April 06 2018, @12:10AM

        by urza9814 (3954) on Friday April 06 2018, @12:10AM (#663195) Journal

        Once you've decided you are going to go to war with ISIS, that decision is already made.
        From then on the idea is to kill only the ISIS and not the people they were going to shovel into mass graves.
        If the AI can do that, great.

        Even that is not nearly so black and white. For one example: AI programmed only to kill members of an opposing force is going to be committing a ton of war crimes when it correctly identifies medics as working for the other side and starts blowing them away. And who gets held liable when something like that happens? Based on our current crop of AI systems, the answer is probably going to be "nobody". When a human does something stupid, we claim they should have known better. When an AI does something stupid because it was specifically programmed to, we often claim it's all just a terrible and unavoidable accident. And THAT is the biggest problem IMO. AI is a tool which corporations use to shift responsibility from people onto things...because you can't really throw a computer in prison.