
posted by martyb on Thursday May 31 2018, @07:42AM
from the Don't-be-Evil? dept.

Google promises ethical principles to guide development of military AI

Google is drawing up a set of guidelines that will steer its involvement in developing AI tools for the military, according to a report from The New York Times. What exactly these guidelines will stipulate isn't clear, but Google says they will include a ban on the use of artificial intelligence in weaponry. The principles are expected to be announced in full in the coming weeks. They are a response to the controversy over the company's decision to develop AI tools for the Pentagon that analyze drone surveillance footage.

[...] But the question facing these employees (and Google itself) is: where do you draw the line? Does using machine learning to analyze surveillance footage for the military count as "weaponized AI"? Probably not. But what if that analysis informs future decisions about drone strikes? Does it matter then? How would Google even know if this had happened?

Also at VentureBeat and Engadget.

Previously: Google vs Maven
Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War"
About a Dozen Google Employees Have Resigned Over Project Maven


Original Submission

Related Stories

Google vs Maven 60 comments

Google is selling the Pentagon a Machine Learning / AI training solution so its drones and sensors can pick out the good stuff from all the crap being recorded daily by their massive surveillance apparatus. Most companies would be pleased to sell something to a customer. Not so Google's employees. Apparently their solutions should only be used for "good" (not being evil, or something), and the Pentagon is clearly "evil" in their eyes.

Google has partnered with the United States Department of Defense to help the agency develop artificial intelligence for analyzing drone footage, a move that set off a firestorm among employees of the technology giant when they learned of Google's involvement.

Google's pilot project with the Defense Department's Project Maven, an effort to identify objects in drone footage, has not been previously reported, but it was discussed widely within the company last week when information about the project was shared on an internal mailing list, according to sources who asked not to be named because they were not authorized to speak publicly about the project.

Google's Eric Schmidt summed up the tech industry's concerns about collaborating with the Pentagon at a talk last fall. "There's a general concern in the tech community of somehow the military-industrial complex using their stuff to kill people incorrectly," he said. While Google says its involvement in Project Maven is not related to combat uses, the issue has still sparked concern among employees, sources said.

Project Maven, a fast-moving Pentagon project also known as the Algorithmic Warfare Cross-Functional Team (AWCFT), was established in April 2017. Maven's stated mission is to "accelerate DoD's integration of big data and machine learning." In total, the Defense Department spent $7.4 billion on artificial intelligence-related areas in 2017, the Wall Street Journal reported.

Are the employees at Google becoming a problem for Google and its eventual bottom line with their political agendas? Are they getting in the way of doing actual work? Where, if anywhere, is the line?

https://gizmodo.com/google-is-helping-the-pentagon-build-ai-for-drones-1823464533


Original Submission

Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War" 65 comments

We had submissions from two Soylentils concerning recent employee reaction to Google's participation in the Pentagon's "Project Maven" program:

Google Workers Urge C.E.O. to Pull Out of Pentagon A.I. Project

Submitted via IRC for fyngyrz

Thousands of Google employees, including dozens of senior engineers, have signed a letter protesting the company's involvement in a Pentagon program that uses artificial intelligence to interpret video imagery and could be used to improve the targeting of drone strikes.

The letter [pdf], which is circulating inside Google and has garnered more than 3,100 signatures, reflects a culture clash between Silicon Valley and the federal government that is likely to intensify as cutting-edge artificial intelligence is increasingly employed for military purposes.

Source: https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html

Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War"

Thousands of Google employees have signed a letter protesting the development of "Project Maven", which would use machine learning algorithms to analyze footage from U.S. military drones:

About a Dozen Google Employees Have Resigned Over Project Maven 70 comments

Google Employees Resign in Protest Against Pentagon Contract

It's been nearly three months since many Google employees—and the public—learned about the company's decision to provide artificial intelligence to a controversial military pilot program known as Project Maven, which aims to speed up analysis of drone footage by automatically classifying images of objects and people. Now, about a dozen Google employees are resigning in protest over the company's continued involvement in Maven.

[...] The employees who are resigning in protest, several of whom discussed their decision to leave with Gizmodo, say that executives have become less transparent with their workforce about controversial business decisions and seem less interested in listening to workers' objections than they once did. In the case of Maven, Google is helping the Defense Department implement machine learning to classify images gathered by drones. But some employees believe humans, not algorithms, should be responsible for this sensitive and potentially lethal work—and that Google shouldn't be involved in military work at all.

Previously: Google vs Maven
Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War"


Original Submission

Microsoft Misrepresented HoloLens 2 Field of View, Faces Backlash for Military Contract 39 comments

Microsoft Significantly Misrepresented HoloLens 2's Field of View at Reveal

To significant anticipation, Microsoft revealed HoloLens 2 earlier this week at MWC 2019. By all accounts it looks like a beautiful and functional piece of technology and a big step forward for Microsoft's AR initiative. All of which makes it unfortunate that the company didn't strive to be clearer when illustrating one of the three key areas in which the headset is said to be improved over its predecessor. [...] For field of view—how much of your view is covered by the headset's display—[Alex] Kipman said that HoloLens 2 delivers "more than double" the field of view of the original HoloLens.

Within the AR and VR markets, the de facto descriptor used when talking about a headset's field of view is an angle specified to be the horizontal, vertical, or diagonal extent of the device's display from the perspective of the viewer. When I hear that one headset has "more than double" the field of view of another, it says to me that one of those angles has increased by a factor of ~2. It isn't perfect by any means, but it's how the industry has come to define field of view.

It turns out that's not what Kipman meant when he said "more than double." I reached out to Microsoft for clarity and found that what he was actually referring to was not a field of view angle, but rather the field of view area; that wasn't explained in the presentation at all, just (seemingly intentionally) vague statements of "more than twice the field of view."
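The distinction matters arithmetically: under a flat-projection approximation, display area scales roughly with the product of the horizontal and vertical angles, so doubling the area widens each angle by only about √2, not 2. A back-of-envelope sketch (the 34° diagonal figure commonly cited for the original HoloLens is used here as an illustrative assumption, not an official spec):

```python
import math

def angle_scale_from_area_scale(area_scale):
    # Area ~ horizontal angle x vertical angle (flat-projection approximation),
    # so scaling area by k scales each angular dimension by sqrt(k).
    return math.sqrt(area_scale)

diagonal_fov_v1 = 34.0  # degrees; commonly cited for HoloLens 1, assumed here
scale = angle_scale_from_area_scale(2.0)  # "more than double the field of view"

# Doubling the *area* gives ~48 degrees diagonal, far short of a "doubled" 68.
print(round(diagonal_fov_v1 * scale, 1))
```

This is why readers who heard "more than double" and expected an angle to double came away disappointed: the area claim buys you a factor of about 1.41 per axis.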

[...] But then Kipman moved onto a part of the presentation which visually showed the difference between the field of view of HoloLens 1 and HoloLens 2, and that's when things really became misleading.

Microsoft chief defends controversial military HoloLens contract

Microsoft employees objecting to a US Army HoloLens contract aren't likely to get many concessions from their company's leadership. CEO Satya Nadella has defended the deal in a CNN interview, arguing that Microsoft made a "principled decision" not to deny technology to "institutions that we have elected in democracies to protect the freedoms we enjoy." The exec also asserted that Microsoft was "very transparent" when securing the contract and would "continue to have that dialogue" with staff.

Also at UploadVR, Ars Technica, and The Hill.

See also: Stick to Your Guns, Microsoft

Previously: U.S. Army Awards Microsoft a $480 Million HoloLens Contract
Microsoft Announces $3,500 HoloLens 2 With Wider Field of View and Other Improvements

Related: Google Drafting Ethics Policy for its Involvement in Military Projects
Google Will Not Continue Project Maven After Contract Expires in 2019


Original Submission

Is Ethical A.I. Even Possible? 35 comments

Is Ethical A.I. Even Possible?

When a news article revealed that Clarifai was working with the Pentagon and some employees questioned the ethics of building artificial intelligence that analyzed video captured by drones, the company said the project would save the lives of civilians and soldiers.

"Clarifai's mission is to accelerate the progress of humanity with continually improving A.I.," read a blog post from Matt Zeiler, the company's founder and chief executive, and a prominent A.I. researcher. Later, in a news media interview, Mr. Zeiler announced a new management position that would ensure all company projects were ethically sound.

As activists, researchers, and journalists voice concerns over the rise of artificial intelligence, warning against biased, deceptive and malicious applications, the companies building this technology are responding. From tech giants like Google and Microsoft to scrappy A.I. start-ups, many are creating corporate principles meant to ensure their systems are designed and deployed in an ethical way. Some set up ethics officers or review boards to oversee these principles.

But tensions continue to rise as some question whether these promises will ultimately be kept. Companies can change course. Idealism can bow to financial pressure. Some activists — and even some companies — are beginning to argue that the only way to ensure ethical practices is through government regulation.

"We don't want to see a commercial race to the bottom," Brad Smith, Microsoft's president and chief legal officer, said at the New Work Summit in Half Moon Bay, Calif., hosted last week by The New York Times. "Law is needed."

Possible != Probable. And the "needed law" could come in the form of a ban and/or surveillance of coding and hardware-building activities.



Original Submission

Pentagon Brass Bafflingly Accuses Google of Providing "Direct Benefit" to China's Military 37 comments

Submitted via IRC for soysheep9857

There are many reasons to be critical of Google. But on Thursday, General Joseph Dunford, chairman of the Joint Chiefs of Staff, stopped just short of accusing the tech giant of treason.

Dunford's incendiary comments came during a budgetary hearing by the Senate Armed Services Committee this afternoon. During his time for questioning, freshman Senator Josh Hawley, a Republican, turned to the subject of Google's decision to back away from projects with the Pentagon. Hawley asked the panel if he understood the situation correctly and that the men were saying, "that Google, an American company, supposedly, is refusing to work with the Department of Defense, but is doing work with China, in China, in a way that at least indirectly benefits the Chinese government."

Acting Defense Secretary Patrick Shanahan tempered that assertion, explaining that he hasn't heard anyone use the word "refuse," but that Google has shown "a lack of willingness to support DOD programs."

But General Dunford was more open to going on the attack. When given the chance to elaborate on his concerns, he told Senator Hawley:

You know, senator, I'm nodding my head on exactly the point that you made: that the work that Google is doing in China is indirectly benefitting the Chinese military. And I've been very public on this issue as well; in fact, the way I described it to our industry partners is, 'look we're the good guys in the values that we represent and the system that we represent is the one that will allow and has allowed you to thrive,' and that's the way I've characterized it. I was just nodding that what the secretary was articulating is the general sense of all of us as leaders. We watch with great concern when industry partners work in China knowing there is that indirect benefit, and frankly 'indirect' may be not a full characterization of the way it really is. It's more of a direct benefit to the Chinese military.

Source: https://gizmodo.com/pentagon-brass-bafflingly-accuses-google-of-providing-d-1833302885

Related: Google Employees on Pentagon AI Algorithms: "Google Should Not be in the Business of War"
About a Dozen Google Employees Have Resigned Over Project Maven
Google Drafting Ethics Policy for its Involvement in Military Projects
Google Will Not Continue Project Maven After Contract Expires in 2019
Microsoft Misrepresented HoloLens 2 Field of View, Faces Backlash for Military Contract


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: -1, Flamebait) by Anonymous Coward on Thursday May 31 2018, @08:00AM (3 children)

    by Anonymous Coward on Thursday May 31 2018, @08:00AM (#686636)

    When you are in the USA, you support your country. Maybe, sometimes, a few allies: NATO, Five Eyes, Japan, South Korea, possibly even Israel or Taiwan... but that's pushing it. Really, you support the USA.

    Google should do self-driving for military vehicles, mapping support for the NRO, data collection for the NSA, covert communication for the CIA, etc.

    BTW, from personal experience, it is damn satisfying to work for a company that serves the USA. My company makes bombs. Fuck yeah, `MURICA!!!! We even make a gun turret that automatically aims and fires without human involvement, letting loose a spray of 20mm shells that are tracked by radar to help fine-tune the targeting. We also make a beam weapon that is, if you don't crank the power up, less exciting than when you do. We pwn everything. We make shoulder-fired missiles that can do a top-down attack. Maybe we even make landmines. You know, there exist landmines that create a mesh network and then hop around to fill in any gaps created by mine-clearing operations. Little fuckers are scary. :-)

    • (Score: 1, Touché) by Anonymous Coward on Thursday May 31 2018, @08:50AM

      by Anonymous Coward on Thursday May 31 2018, @08:50AM (#686644)

      data collection for the NSA

      Indeed, what's more patriotic than conducting unconstitutional mass surveillance on the populace?

    • (Score: 2) by c0lo on Thursday May 31 2018, @09:55AM

      by c0lo (156) Subscriber Badge on Thursday May 31 2018, @09:55AM (#686650) Journal

      When you are in the USA, you support your country.

      Until your country equips its police with the very deadly stuff the Army is using.

      Then, wrong place at the wrong time, your luck may make you unable to support anything at all in the 'one with the environment' state you may find yourself.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 1, Interesting) by Anonymous Coward on Thursday May 31 2018, @02:17PM

      by Anonymous Coward on Thursday May 31 2018, @02:17PM (#686739)

      You may feel all fuzzy about this kind of stuff, but I think it is disgusting, and I wouldn't want to work with it. And demand for my skills is high, so I won't starve due to my choice. Many googlers are in the same position.

  • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @08:10AM (3 children)

    by Anonymous Coward on Thursday May 31 2018, @08:10AM (#686637)

    Don't be evil fire until you see the whites of their eyes!

    American cowards, first they send the poorest members of their society to fight, the expendables, and now not even that, the machines, the extendables. If only they had the vision to send the deplorables. What? Five deferments for "bone spurs"? Cheney got a better story than that with the anal cysts. Or was that Limberger? And Ted Nugent, "Cat scratch Poopy Pants!" America is doomed.

    • (Score: 2) by takyon on Thursday May 31 2018, @08:38AM (1 child)

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Thursday May 31 2018, @08:38AM (#686641) Journal

      Drones and mechs, delivering swift autonomous death to our poorer enemies! Get with the program already!

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 4, Insightful) by c0lo on Thursday May 31 2018, @08:51AM

        by c0lo (156) Subscriber Badge on Thursday May 31 2018, @08:51AM (#686645) Journal

        Drones and mechs, delivering swift autonomous death to our poorer enemies!

        That's fun and dandy, I suppose.
        Until the Army starts selling its surplus to law enforcement (all legal, of course [nytimes.com]), at which point US citizens will start looking at the wrong end of the tech.

        But yeah, the MIC must never stop showing profits, ethics be damned!

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @09:37PM

      by Anonymous Coward on Thursday May 31 2018, @09:37PM (#686940)

      Do you know how much it costs to train a human to fight? Then we have to support any medical bills through the VA, and after that we give them pensions if they stick it out long enough! This is financially KILLING us!!! Automated weaponry will reduce the financial burden of the war machine along with eliminating those pesky whistleblowers as well. LONG LIVE THE DEATH MACHINE!

  • (Score: 1) by anubi on Thursday May 31 2018, @10:26AM (2 children)

    by anubi (2828) on Thursday May 31 2018, @10:26AM (#686654) Journal

    Where do we draw the line between Adaptive Filtering technology and Artificial Intelligence?

    It all starts in beginning engineering... known as the "predictor-corrector" algorithm, which "learns" from a past try to predict the next set of control parameters.

    This is not new technology... I believe every QAM modem uses it to learn line characteristics, especially things like echo.
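The predict-then-correct loop anubi describes is essentially the LMS (least-mean-squares) adaptive filter, the workhorse behind modem echo cancellers. A minimal sketch, assuming a hypothetical 3-tap echo path; this is illustrative, not modem-grade code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown "line characteristic" (echo path) the filter must learn.
# These taps are a made-up example, not measured data.
echo_path = np.array([0.5, -0.3, 0.1])

n_taps = 3
w = np.zeros(n_taps)   # filter weights, adapted as it "learns"
mu = 0.05              # step size (learning rate)

x = rng.standard_normal(5000)           # transmitted signal
d = np.convolve(x, echo_path)[:len(x)]  # observed echo to cancel

for n in range(n_taps, len(x)):
    x_win = x[n - n_taps + 1:n + 1][::-1]  # most recent samples first
    y = w @ x_win                          # predict the echo from past tries
    e = d[n] - y                           # prediction error
    w += mu * e * x_win                    # correct the weights (LMS update)

print(np.round(w, 3))  # converges toward echo_path
```

The same predictor-corrector skeleton, scaled up by many orders of magnitude in parameters and data, is what the "AI" in question is built on; the line between adaptive filtering and machine learning is one of degree more than kind.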

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 0) by Anonymous Coward on Thursday May 31 2018, @11:20AM (1 child)

      by Anonymous Coward on Thursday May 31 2018, @11:20AM (#686672)

      All software is AI (according to today's marketing/definitions). The military invented software; the first computers (both electrical and mechanical) were developed by and for the military... even the algorithms they ran (cannon heating and projectile accuracy) were developed for the military. Drones have been in use for over 100 years.

      All these things are tools.

      Tools are amoral inanimate objects, like rocks, until someone picks one up and throws it.

      • (Score: 1) by anubi on Friday June 01 2018, @12:02PM

        by anubi (2828) on Friday June 01 2018, @12:02PM (#687195) Journal

        If a computer is involved, there is likely a "hold harmless" clause in there somewhere, and how do you invalidate one guy's noose without invalidating everyone's noose?

        --
        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
  • (Score: 3, Insightful) by The Mighty Buzzard on Thursday May 31 2018, @10:34AM (6 children)

    Correct me if I'm wrong here but wouldn't analyzing surveillance footage reduce civilian casualties in war by letting the government do a better job of only hitting actual combatants? I guess that's not something the regressives care about these days if it interferes with their kneejerk hating of their country.

    --
    My rights don't end where your fear begins.
    • (Score: 2) by c0lo on Thursday May 31 2018, @10:59AM (2 children)

      by c0lo (156) Subscriber Badge on Thursday May 31 2018, @10:59AM (#686665) Journal

      I guess that's not something the regressives care about these days if it interferes with their kneejerk hating of their country.

      1. If I'm not hating my country, am I allowed to bitch about yours without being called regressive?

      2. Do you see your comment as an example of progressivism? Or at least not a regressive one?

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 0, Troll) by The Mighty Buzzard on Thursday May 31 2018, @01:55PM (1 child)

        1) Sure. We can just call you a commie instead.

        2) I don't concern myself with whether my stances are progressive or conservative. I only care if they're the stances that need to be taken. Labels do not denote correctness on individual issues. Picking one label and adjusting all your positions to it is the best way I know to be absolutely certain you're wrong on nearly every issue.

        --
        My rights don't end where your fear begins.
        • (Score: 3, Touché) by c0lo on Thursday May 31 2018, @02:49PM

          by c0lo (156) Subscriber Badge on Thursday May 31 2018, @02:49PM (#686756) Journal

          1) Sure. We can just call you a commie instead.

          Fair enough, you rotten to the core decadent capitalist.
          Remember, the capitalism is on the brink of the precipice. Communism is a step forward

          I only care if they're the stances that need to be taken.

          Uh, oh. Because if they are not taken, then... what?
          You reckon Google won't dare to create war-AI tech without the public display of your moral support (and anonymized financial support via taxes)?

          (large grin)

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 2) by aristarchus on Thursday May 31 2018, @07:27PM (2 children)

      by aristarchus (2645) on Thursday May 31 2018, @07:27PM (#686883) Journal

      a better job of only hitting actual combatants?

      Fine, except there is no legally declared state of war in the world presently, and thus there are no "actual" combatants. There are only victims of extra-judicial killings. Being involved in facilitating that is against the CoC of humans.

      • (Score: 1, Interesting) by Anonymous Coward on Thursday May 31 2018, @09:47PM (1 child)

        by Anonymous Coward on Thursday May 31 2018, @09:47PM (#686942)

        He's a carrion eating bird for fucks sake, he WANTS more human deaths, he THRIVES on horror.

  • (Score: 3, Insightful) by MostCynical on Thursday May 31 2018, @10:40AM (2 children)

    by MostCynical (2589) on Thursday May 31 2018, @10:40AM (#686657) Journal

    With apps and data, security is either there from the beginning, or missing. No amount of patching ever gives you 100%.

    Same with ethics.

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
    • (Score: 2) by c0lo on Thursday May 31 2018, @11:04AM (1 child)

      by c0lo (156) Subscriber Badge on Thursday May 31 2018, @11:04AM (#686666) Journal

      With apps and data, security is either there from the beginning, or missing. No amount of patching ever gives you 100%.

      Same with ethics.

      Ummm... let me try to de-reference the similitude suggested at the end of your comment, see if I got it right:
      "With ethics, security is either there from the beginning, or missing."

      (grin)

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by MostCynical on Thursday May 31 2018, @11:20AM

        by MostCynical (2589) on Thursday May 31 2018, @11:20AM (#686671) Journal

        Google might have locked any ethics in a very secure cabinet, but instead they quietly had them put down, stuffed, and mounted in a break room as a reminder to google employees who might be feeling twinges of conscience.

        --
        "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
  • (Score: 1) by Improbus on Thursday May 31 2018, @05:34PM (3 children)

    by Improbus (6425) on Thursday May 31 2018, @05:34PM (#686839)

    Why bother to write one if you don't mean it in the first place? Just so you can say you have one?

    • (Score: 3, Insightful) by aristarchus on Thursday May 31 2018, @07:32PM (2 children)

      by aristarchus (2645) on Thursday May 31 2018, @07:32PM (#686885) Journal

      Why bother to write one if you don't mean it in the first place? Just so you can say you have one?

      It's called "compliance". From one Bank's Code:

      This Financial Code of Ethics has been adopted to comply with Section 406 of the Sarbanes-Oxley Act of 2002.

      You see, without an adopted Code of Ethics, the fines will be stiffer when you get caught. So better to say you have one.

      • (Score: 2) by Hartree on Thursday May 31 2018, @10:11PM (1 child)

        by Hartree (195) on Thursday May 31 2018, @10:11PM (#686953)

        This code has no such effect in this case. Section 406 deals with complying with US law. It doesn't say anything about military contracting or weaponized AI development for the US government, both of which are legal (whether or not someone thinks it should be).

        From Section 406:
        "This Code has been adopted for the purpose of promoting:

        honest and ethical conduct, including the ethical handling of actual or apparent conflicts of interest between personal and professional relationships;

        full, fair, accurate, timely and understandable disclosure in reports and documents that a Fund files with, or submits to, the Securities and Exchange Commission (“SEC”) and in other public communications made by a Fund;

        compliance with applicable laws and governmental rules and regulations;

        the prompt internal reporting of violations of the Code to an appropriate person or persons identified in the Code; and

        accountability for adherence to the Code. "

        Has nada about this issue.

        • (Score: 1, Offtopic) by aristarchus on Thursday May 31 2018, @11:41PM

          by aristarchus (2645) on Thursday May 31 2018, @11:41PM (#686986) Journal

          Are you sure, or are you just saying this, so that you have something to say? What was the question, again?

  • (Score: 2) by Hartree on Thursday May 31 2018, @09:53PM

    by Hartree (195) on Thursday May 31 2018, @09:53PM (#686944)

    Can't blame anything on those googlers. Their company has a policy that says they won't do evi... Oh wait. They removed that. We meant we won't do bad things for the military. Just don't notice the cloud computing contracts we're bidding on for DOD and the intel agencies. It's just computing. We can't know what they're doing with it. It's all... Cute kitten pics. Yeah, that's it. And we certainly wouldn't use AI for combat. We just make AI to play extremely realistic combat simulatio... I mean games! We certainly can't control what others might do with our completely innocent creations, like hooking them up to a remote control firing station mounted on an autonomous vehicle.

    This is why in a previous article on Soylent I referred to it as peace washing or preaching celibacy in a brothel.

    I understand why people might have reservations about doing such work. I personally don't (I was in the military for a dozen years). It's a moral question that people have to decide for themselves, and I don't minimize it. But this smacks to me of trying to have it both ways: working for what is effectively a military contractor while still being able to rationalize it away because of a "policy".
