
posted by janrinok on Saturday March 10 2018, @08:16AM   Printer-friendly
from the im-sorry-dave-im-afraid-i-cant-do-that dept.

Google is selling the Pentagon a machine learning / AI training solution so its drones and sensors can pick the good stuff out of all the crap being recorded daily by its massive surveillance apparatus. Most companies would be thrilled to land such a customer. Not Google's employees. Apparently their solutions should only be used for "good", or not being evil or something, and the Pentagon is clearly "evil" in their eyes.

Google has partnered with the United States Department of Defense to help the agency develop artificial intelligence for analyzing drone footage, a move that set off a firestorm among employees of the technology giant when they learned of Google's involvement.

Google's pilot project with the Defense Department's Project Maven, an effort to identify objects in drone footage, has not been previously reported, but it was discussed widely within the company last week when information about the project was shared on an internal mailing list, according to sources who asked not to be named because they were not authorized to speak publicly about the project.

Google's Eric Schmidt summed up the tech industry's concerns about collaborating with the Pentagon at a talk last fall. "There's a general concern in the tech community of somehow the military-industrial complex using their stuff to kill people incorrectly," he said. While Google says its involvement in Project Maven is not related to combat uses, the issue has still sparked concern among employees, sources said.

Project Maven, a fast-moving Pentagon project also known as the Algorithmic Warfare Cross-Functional Team (AWCFT), was established in April 2017. Maven's stated mission is to "accelerate DoD's integration of big data and machine learning." In total, the Defense Department spent $7.4 billion on artificial intelligence-related areas in 2017, the Wall Street Journal reported.

Are Google's employees, with their political agendas, becoming a problem for Google and its eventual bottom line? Are they getting in the way of doing actual work? Where, if anywhere, is that line?

https://gizmodo.com/google-is-helping-the-pentagon-build-ai-for-drones-1823464533


Original Submission

  • (Score: 2) by NotSanguine on Saturday March 10 2018, @09:48PM (2 children)

    FYI, even in the military which is all about chain of command, you can still be court-martialed even if you were following orders. You are responsible for your own actions, regardless of what your superiors may tell you to do. So don't give me this crap that I'm supposed to be a mere cog in the machine and never question "the board" (who may well be more interested in selling out the company for their own financial gain than in the lasting good of anybody working there).

    That's not even close to what AC said. Just as in the military (where this has happened repeatedly) or in appointed government positions (where it happens frequently too), if you cannot, in good conscience, support the actions of your superiors, then you *resign*. And hopefully make a lot of noise doing it too.

    That's exactly what AC was advocating, and it's exactly what Google employees should do. If they don't support their employer partnering with the US DOD for this project, they should resign and explain as loudly as possible *why* they resigned.

    No one is forcing anyone at Google to do *anything* that violates their conscience. There's no discrimination or coercion involved here.

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 0) by Anonymous Coward on Sunday March 11 2018, @07:11PM (1 child)
The problem is more basic, IMHO. Good students who have gotten good grades all their lives and worked hard have their own cocoon, and they become uncomfortable outside of it. Big Tech loves these people, as they are docile and hard-working, and they in turn maintain that cocoon by hiring the same kind of people. Unfortunately, these people have very few other places to go. The rest of the real world is quite difficult, and a lot of other skills are required there.

Google's cocoon is built around a no-evil superiority complex, and this deal interferes with it from the inside.

    • (Score: 2) by NotSanguine on Sunday March 11 2018, @08:13PM

An interesting point. And perhaps that's true of some (many?) Google employees. But there are other companies offering jobs where poor social skills, long hours, and cocoon-like environments are the norm.

      But even if there weren't, I don't have a lot of sympathy for those who would work on things that affront their conscience.*

      Ethics and principles are valuable things. And if one doesn't value such things, that says a lot about a person's character.

      I won't condemn folks who go that route, but I'm not likely to trust them either.

      *I'm not making a value judgement as to what should or shouldn't affront any particular person's conscience.

      --
      No, no, you're not thinking; you're just being logical. --Niels Bohr