posted by takyon on Tuesday July 28 2015, @01:00AM
from the the-skynet-is-falling dept.

Over 1,000 high-profile artificial intelligence experts and leading researchers have signed an open letter warning of a "military artificial intelligence arms race" and calling for a ban on "offensive autonomous weapons".

The letter, presented at the International Joint Conference on Artificial Intelligence in Buenos Aires, Argentina, was signed by Tesla's Elon Musk, Apple co-founder Steve Wozniak, Google DeepMind chief executive Demis Hassabis and professor Stephen Hawking along with 1,000 AI and robotics researchers.

The letter states: "AI technology has reached a point where the deployment of [autonomous weapons] is – practically if not legally – feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."

So, spell it out for me, Einstein, are we looking at a Terminator future or a Matrix future?

While the latest open letter is concerned specifically with allowing lethal machines to kill without human intervention, several big names in the tech world have offered words of caution on the subject of machine intelligence in recent times. Earlier this year Microsoft's Bill Gates said he was "concerned about super intelligence," while last May physicist Stephen Hawking voiced questions over whether artificial intelligence could be controlled in the long term. Several weeks ago a video surfaced of a drone that appeared to have been equipped to carry and fire a handgun.

takyon: Counterpoint - Musk, Hawking, Woz: Ban KILLER ROBOTS before WE ALL DIE


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Tuesday July 28 2015, @10:41AM (#214784)

    The danger of autonomous weapons is not, at least at present, that they will achieve sentience and "go Terminator" on us - if that is possible (which I doubt), a lot of technological water needs to pass under the bridge first.
    A much more immediate risk is that they will make it seem safer to go to war: "One day there will be no human soldiers - robots will fight robots in a war and we can all derive great and guilt-free entertainment from it as the ideal sport!" The problem with this cozy thought is that in any prolonged modern war, the command centers and logistics facilities supporting the war effort (factories which produce weapons and ammunition, power plants, and so on) are primary targets, because destroying them removes your enemy's ability to resupply and reinforce his armed forces - and for the foreseeable future, these will involve some human beings. Then there is the question of what happens to a country whose robot army, and the infrastructure it depends on, is destroyed. It could surrender, but probably won't, choosing instead to field human soldiers, who will now be fighting against robots. So much for robotic war lacking human casualties. See Philip K. Dick's short story "Autofac" for an alternative view of the dangers of robot-on-robot warfare where the factories have been automated.
    But the most immediate problem with autonomous weapons systems is that they provide unscrupulous military and political leaders with near-perfect plausible deniability: "The weapon system malfunctioned due to a firmware bug and mistook that passenger jet for a bogey - sorry your prime minister happened to be on it." Or: "Unknown hackers penetrated our security and took control of an armed military drone, firing an anti-tank missile at a road vehicle. The victims have not yet been identified, but the vehicle was owned by a senator who wished to cut military expenditure." You get the idea.

  • (Score: 2) by maxwell demon (1608) on Tuesday July 28 2015, @09:24PM (#215066) Journal

    For the latter scenario, no AI is needed. Indeed, a "dumb drone" is probably easier to hack and take over than an intelligent one.

    --
    The Tao of math: The numbers you can count are not the real numbers.