
posted by azrael on Thursday November 13 2014, @08:26AM
from the reverse-polarity dept.

A United Nations commission is meeting in Geneva, Switzerland today to begin discussions on placing controls on the development of weapons systems that can target and kill without the intervention of humans, the New York Times reports. The discussions come a year after a UN Human Rights Council report called for a ban (pdf) on “Lethal autonomous robotics” and as some scientists express concerns that artificially intelligent weapons could potentially make the wrong decisions about who to kill.

SpaceX and Tesla founder Elon Musk recently called artificial intelligence potentially more dangerous than nuclear weapons.

Peter Asaro, the cofounder of the International Committee for Robot Arms Control (ICRAC), told the Times, “Our concern is with how the targets are determined, and more importantly, who determines them—are these human-designated targets? Or are these systems automatically deciding what is a target?”

Intelligent weapons systems are intended to reduce the risk to both innocent bystanders and friendly troops, focusing their lethality on carefully—albeit artificially—chosen targets. The technology in development now could allow unmanned aircraft and missile systems to avoid and evade detection, identify a specific target from among a clutter of others, and destroy it without communicating with the humans who launched them.
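The distinction Asaro draws is essentially about where the authorization step sits in the engagement loop. As a rough illustration only (not a description of any real system; the function names, classifier, and threshold below are invented), the difference comes down to whether a human confirmation gate stands between identifying a target and engaging it:

    # Purely illustrative sketch -- not based on any actual weapons system.
    # It contrasts a human-in-the-loop engagement (human-designated targets)
    # with a fully autonomous one (the system itself decides what is a target).

    def engage_human_in_the_loop(candidate, operator_confirms):
        """A human operator must approve this specific target before engagement."""
        return "engage" if operator_confirms(candidate) else "hold"

    def engage_autonomous(candidate, onboard_classifier, threshold=0.95):
        """No human review between detection and engagement; an onboard
        classifier's score alone decides. 'onboard_classifier' and
        'threshold' are hypothetical placeholders."""
        return "engage" if onboard_classifier(candidate) >= threshold else "hold"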

 
  • (Score: 4, Interesting) by takyon on Thursday November 13 2014, @09:42AM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday November 13 2014, @09:42AM (#115490) Journal

    Forget autonomous drones.

    Do existing remote-controlled lethal drones with human-designated targets comply with international law?

    Do drone attacks comply with international law? [politifact.com]

    Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism [ohchr.org]

    Resolution adopted by the Human Rights Council. 25/22. Ensuring use of remotely piloted aircraft or armed drones in counterterrorism and military operations in accordance with international law, including international human rights and humanitarian law [un.org]

    1. Urges all States to ensure that any measures employed to counter terrorism, including the use of remotely piloted aircraft or armed drones, comply with their obligations under international law, including the Charter of the United Nations, international human rights law and international humanitarian law, in particular the principles of precaution, distinction and proportionality;

    United States of America Practice Relating to Rule 14. Proportionality in Attack [icrc.org]

    The US Air Force Commander’s Handbook (1980) states that “a weapon is not unlawful simply because its use may cause incidental or collateral casualties to civilians, as long as those casualties are not foreseeably excessive in light of the expected military advantage”.

    The US Instructor’s Guide (1985) states: "In attacking a military target, the amount of suffering or destruction must be held to the minimum necessary to accomplish the mission. Any excessive destruction or suffering not required to accomplish the objective is illegal as a violation of the law of war."

    The US Naval Handbook (2007) states: "The principle of proportionality is directly linked to the principle of distinction. While distinction is concerned with focusing the scope and means of attack so as to cause the least amount of damage to protected persons and property, proportionality is concerned with weighing the military advantage one expects to gain against the unavoidable and incidental loss to civilians and civilian property that will result from the attack. The principle of proportionality requires the commander to conduct a balancing test to determine if the incidental injury, including death to civilians and damage to civilian objects, is excessive in relation to the concrete and direct military advantage expected to be gained. Note that the principle of proportionality under the law of armed conflict is different than the term proportionality as used in self-defense. It is not unlawful to cause incidental injury to civilians, or collateral damage to civilian objects, during an attack upon a legitimate military objective. The principle of proportionality requires that the anticipated incidental injury or collateral damage must not, however, be excessive in light of the military advantage expected to be gained. A military objective within a city, town, or village may, however, be bombarded if required for the submission of the enemy with the minimum expenditure of time, life, and physical resources. The anticipated incidental injury to civilians, or collateral damage to civilian objects, must not be excessive in light of the military advantage anticipated by the attack."
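    To make the structure of that "balancing test" concrete: it is a qualitative judgment by a commander, not a calculation, but the shape of the rule can be sketched in code. Everything below is invented for illustration (the scales, the threshold, the names); nothing in the handbooks quoted above supplies any such numbers.

        # Illustrative only: the proportionality test is a qualitative legal
        # judgment, not a formula. This sketch merely mirrors the structure of
        # the rule: anticipated incidental civilian harm must not be excessive
        # relative to the concrete and direct military advantage expected.

        def is_proportionate(anticipated_advantage, anticipated_civilian_harm,
                             excess_factor=1.0):
            """'excess_factor' is an invented parameter; real doctrine gives no
            such constant -- what counts as 'excessive' is left to the
            commander's judgment."""
            return anticipated_civilian_harm <= excess_factor * anticipated_advantage

        # A strike with modest expected advantage but heavy expected incidental
        # harm would fail the (sketched) test:
        print(is_proportionate(anticipated_advantage=2.0,
                               anticipated_civilian_harm=7.5))  # False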

    Drones pose no risk (other than PTSD [nytimes.com]) to the life of the drone operator, yet drone strikes, however precisely targeted they are claimed to be, inevitably result in civilian casualties [theguardian.com]. Drones can also act in ways that would be too risky for human-piloted aircraft: they can linger or hover over dangerous areas without risking a pilot's life, and they may be quieter and stealthier because they carry no cockpit. Lethal drone strikes are therefore convenient, but not necessarily proportional, because the same platforms could be engineered to subdue targets with nonlethal capabilities at no risk to a nation's soldiers and pilots.

    What kinds of nonlethal capabilities? Rubber bullets or chemical weapons are a possibility [texasobserver.org], but probably not very effective at stopping targets from fleeing. Sticky foam [defensetech.org] could make a comeback given some R&D. A flying Active Denial System [wikipedia.org] (microwave weapon) probably wouldn't work due to size and power requirements. Sonic weaponry [wikipedia.org]? Better to burst a target's eardrums than kill them.

    What do you do with an incapacitated target? You work with the host country. We've conducted strikes in Pakistan, Yemen, and recently "Iraq" again. They can provide boots on the ground to pick up the targets. If that can't be done, perhaps it's time to end U.S. involvement in these conflict zones.

    Good luck convincing the Obama administration or a future neoconservative administration [nytimes.com] to ditch the $50k+ Hellfire missiles and convenient lethal drone strike capabilities that don't necessarily make America safe [nytimes.com].

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 1) by BK on Thursday November 13 2014, @04:46PM

    by BK (4868) on Thursday November 13 2014, @04:46PM (#115591)

    Forget autonomous drones.

    Do existing remote-controlled lethal drones with human-designated targets comply with international law?

    ...yet not necessarily proportional, given that they may be engineered...

    You seem to be suggesting that the use of remote-controlled weapon systems is against "international law" because it might be possible for someone, somewhere, someday, to design and use one that would be better [youtube.com]/more accurate/more precise/less lethal/painted with rainbows. I mean logically, that's great I guess... but taken to its natural conclusion, you are suggesting that virtually everything, or at least every weapons system, is against international law because it could be better in some way.

    I don't think that international law works like that.

    --
    ...but you HAVE heard of me.
  • (Score: 1) by khallow on Thursday November 13 2014, @04:50PM

    by khallow (3766) Subscriber Badge on Thursday November 13 2014, @04:50PM (#115593) Journal
    That's a pretty thin argument. Nonlethal weapons and those "boots on the ground" are both notoriously unreliable. I think the real problem here is the lack of accountability due to the secrecy of the operations. For all I know, those drone strikes protect someone's heroin operation.