posted by janrinok on Wednesday April 25 2018, @12:36PM   Printer-friendly
from the learn-to-love-the-bomb dept.

A new RAND Corporation paper finds that artificial intelligence has the potential to upend the foundations of nuclear deterrence by the year 2040.

While AI-controlled doomsday machines are considered unlikely, the hazards of artificial intelligence for nuclear security lie instead in its potential to encourage humans to take potentially apocalyptic risks, according to the paper.

During the Cold War, the condition of mutual assured destruction maintained an uneasy peace between the superpowers by ensuring that any attack would be met by a devastating retaliation. Mutual assured destruction thereby encouraged strategic stability by reducing the incentives for either country to take actions that might escalate into a nuclear war.

The new RAND publication says that in coming decades, artificial intelligence has the potential to erode the condition of mutual assured destruction and undermine strategic stability. Improved sensor technologies could introduce the possibility that retaliatory forces such as submarine and mobile missiles could be targeted and destroyed. Nations may be tempted to pursue first-strike capabilities as a means of gaining bargaining leverage over their rivals even if they have no intention of carrying out an attack, researchers say. This undermines strategic stability because even if the state possessing these capabilities has no intention of using them, the adversary cannot be sure of that.

"The connection between nuclear war and artificial intelligence is not new, in fact the two have an intertwined history," said Edward Geist, co-author on the paper and associate policy researcher at the RAND Corporation, a nonprofit, nonpartisan research organization. "Much of the early development of AI was done in support of military efforts or with military objectives in mind."

[...] Under fortuitous circumstances, artificial intelligence also could enhance strategic stability by improving accuracy in intelligence collection and analysis, according to the paper. While AI might increase the vulnerability of second-strike forces, improved analytics for monitoring and interpreting adversary actions could reduce miscalculation or misinterpretation that could lead to unintended escalation.

  • (Score: 3, Interesting) by VLM on Wednesday April 25 2018, @03:28PM (3 children)

    by VLM (445) on Wednesday April 25 2018, @03:28PM (#671653)

    Oh and in my nuclear war game, I forgot to express the odds:

    Without AI, the odds of winning a nuclear war that you start are 2:6, or one third, so the odds are two thirds that you'll lose. The odds of either player starting a war are therefore roughly zero, because whoever starts a war has a 2/3 chance of losing it.

    With AI, on the turn AI is discovered, the odds of winning remain 1/3 for both sides against a random die, so the opposing non-AI player starts a war that turn and wins 1/3 of the time; in other words, the player who discovers AI wins 2/3 of the time.

    With AI on later turns, we'll assume AI was discovered early enough that the AI player holds at least a single hidden "1" die, meaning his odds of winning are 5/6. So if the non-AI player is a crazy suicidal pacifist and doesn't immediately start a war, the pacifist has 5/6 odds of dying in a nuclear war, whereas if they immediately launch they have only 4/6 odds of dying in a nuclear war.

    This leads to the peculiar situation where the behavior least likely to kill yourself and your nation, by a ratio of 4/6 vs 5/6, is to start a nuclear war the instant it's discovered, or strongly believed, that the other side has discovered AI.

    There are other implications. A player that decides to invest in AI raises the odds of nuclear war from 0% to 100%, but improves their odds of winning that war: from 1/3 before discovery to 2/3 on the turn of AI discovery, when the other side launches; and post-discovery, if they're playing against a pacifist, their odds of winning increase from 1/3 to 5/6. Inventing AI means 100% odds of nuclear war, but your individual odds of winning that war increase from 1/3 to either 2/3 against a rational opfor or 5/6 against a crazy pacifist. So there's a strong motivator to research AI by rolling that D20.

    Essentially the game simplifies down to this: you're rolling a D20, and when someone rolls a natural 20, discovering AI, they have a 2/3 chance of winning the game while their opfor rolls a D3. And there are enough turns (the remainder of time?) that the odds of someone eventually rolling a natural 20 are 100%.
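    The odds above can be sanity-checked with a short script. To be clear, the dice mechanics below are my reading of the game as described in this comment, not a published ruleset:

```python
from fractions import Fraction

# Toy model of the nuclear-war dice game sketched above (mechanics
# are an assumption inferred from the comment).
P_WIN_NO_AI = Fraction(2, 6)  # attacker without AI wins on 2 of 6 faces
P_WIN_AI = Fraction(5, 6)     # attacker holding a hidden "1" die wins on 5 of 6

# The pacifist comparison: wait, and the AI side eventually attacks,
# so you die with probability 5/6; launch first at pre-AI odds and you
# die with probability 1 - 2/6 = 4/6.
p_die_waiting = P_WIN_AI
p_die_launching = 1 - P_WIN_NO_AI
assert p_die_launching < p_die_waiting  # launching first is the "safer" move

# Discovery race: each player rolls a D20 per turn, discovering AI on a
# natural 20. Expected number of turns until at least one side discovers:
p_discovery_per_turn = 1 - Fraction(19, 20) ** 2
expected_turns = 1 / p_discovery_per_turn
print(float(expected_turns))  # roughly 10.3 turns on average
```

    With two independent D20 rolls per turn, discovery is a geometric event with per-turn probability 39/400, so someone finds AI after about ten turns on average; slowing the rolls (secret research under a treaty) stretches that timeline but never changes who wins once it happens.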

    Of course, the gains from discovering AI secretly are so high that even if you signed a treaty with the other player, you'd be an idiot not to roll dice in secret, even if it takes 100 times longer than rolling in public. This is not a problem that can be solved by treaties; all a treaty can do is kick the can down the road. With a treaty and secret private rolling that takes 100 times longer, the 100% odds of nuclear war just take on average 100 times longer to arrive, but the side that discovers first still wins the war 2/3 of the time.

    One semi-realistic, treaty-ish way to survive is changing AI discovery from a sudden step function to tit-for-tat (which has nothing to do with Mardi Gras in New Orleans, although it should); a deep analysis of that will take longer than consuming this cup of freshly brewed black tea took me. Intuitively, it seems a nice smooth linear function, where both sides slowly slope up from no AI at all to gradually achieving perfect Harry Potter magic-accurate AI, would not result in warfare... or would it?

    I suppose if you knew the opfor had a 5/6 chance of winning, you could create treaty obligations far outside the bounds of the game, such that whenever a side saw its odds rise above 3:6 it had to declare a federal holiday and give all the missile crews the day off, so that half of the time, per treaty obligations, your missiles are down. Ironically, outside the game that would probably cause WWIII, because that would be a great time to invade Europe going either west or east, probably leading eventually to nuclear escalation at a later time. So WWIII would be declared when one opfor has two "1" dice in a row, as proven by AI. Or, again, you're better off starting early, so two turns before the opfor gets double "1" dice is when you roll the tanks into Europe... hmm, gotta run the odds on that one.

  • (Score: 2) by DannyB on Wednesday April 25 2018, @06:15PM (2 children)

    by DannyB (5839) Subscriber Badge on Wednesday April 25 2018, @06:15PM (#671756) Journal

    whoever starts a war has 2/3 chance of losing it.

    Assumes sanity of person who is able to start a nuclear war.

    Assumes a condition where two differently crazy world leaders might be able to escalate a situation until it becomes a nuclear war, while the two leaders don't even understand what is happening as it gets out of control.

    One day people of planet Earth might put such people into power.

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 3, Insightful) by Azuma Hazuki on Wednesday April 25 2018, @08:38PM (1 child)

      by Azuma Hazuki (5086) on Wednesday April 25 2018, @08:38PM (#671845) Journal

      Hate to break this to you, but...Iran, North Korea, Russia, the United States...it's already happened.

      --
      I am "that girl" your mother warned you about...
      • (Score: 2) by DannyB on Wednesday April 25 2018, @08:42PM

        by DannyB (5839) Subscriber Badge on Wednesday April 25 2018, @08:42PM (#671847) Journal

        Yes. I am uncontrollably involuntarily sarcastic.

        --
        To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.