
posted by janrinok on Wednesday April 25 2018, @12:36PM
from the learn-to-love-the-bomb dept.

A new RAND Corporation paper finds that artificial intelligence has the potential to upend the foundations of nuclear deterrence by the year 2040.

While AI-controlled doomsday machines are considered unlikely, the hazards of artificial intelligence for nuclear security lie instead in its potential to encourage humans to take apocalyptic risks, according to the paper.

During the Cold War, the condition of mutual assured destruction maintained an uneasy peace between the superpowers by ensuring that any attack would be met by a devastating retaliation. Mutual assured destruction thereby encouraged strategic stability by reducing the incentives for either country to take actions that might escalate into a nuclear war.

The new RAND publication says that in coming decades, artificial intelligence has the potential to erode the condition of mutual assured destruction and undermine strategic stability. Improved sensor technologies could introduce the possibility that retaliatory forces, such as submarine-based and mobile missiles, could be targeted and destroyed. Nations may be tempted to pursue first-strike capabilities as a means of gaining bargaining leverage over their rivals even if they have no intention of carrying out an attack, researchers say. This undermines strategic stability because even if the state possessing these capabilities has no intention of using them, the adversary cannot be sure of that.

"The connection between nuclear war and artificial intelligence is not new, in fact the two have an intertwined history," said Edward Geist, co-author on the paper and associate policy researcher at the RAND Corporation, a nonprofit, nonpartisan research organization. "Much of the early development of AI was done in support of military efforts or with military objectives in mind."

[...] Under fortuitous circumstances, artificial intelligence also could enhance strategic stability by improving accuracy in intelligence collection and analysis, according to the paper. While AI might increase the vulnerability of second-strike forces, improved analytics for monitoring and interpreting adversary actions could reduce miscalculation or misinterpretation that could lead to unintended escalation.


Original Submission

 
  • (Score: 5, Interesting) by Grishnakh (2831) on Wednesday April 25 2018, @07:02PM (#671787)

    No, it's not doom. Sure, you might have some nasty resource wars, and you'll probably have giant famines with millions or even billions dead, but think about this: what would happen if 6 billion people died tomorrow? We'd still have over 1.5 billion people. That's not extinction; it's just a massive shift in civilization. Humans have gone through collapses like that before and didn't go extinct.

    I'm not trying to minimize the consequences of climate change; I'm just pointing out that it's extremely unlikely to result in human extinction (unless you're predicting it'll lead to massive nuclear war). It might wind up looking like some horribly dystopic sci-fi where much of the population is dead and the survivors are living in walled-off areas to protect themselves from zombies or whatever, but that still is not extinction; humanity can bounce back from that. Even Star Trek's official history had humans going through a horrible WWIII which presumably wiped out a lot of the population before Zefram Cochrane invented the continuum distortion drive and met the Vulcans. Climate change by itself can't kill us all off.

    As for water supplies, I'm not sure where you're getting the idea that the earth will turn into a desert. There'll still be plenty of water (most of the planet is covered in it), and even freshwater isn't going anywhere as long as there's evaporation, clouds, rainfall, etc. Where the water is located could certainly change, though, rendering much of our existing hydro infrastructure useless, and this could have catastrophic results for many places dependent on it. But that's not going to wipe out every human on the planet. A genetically engineered virus created by a rogue scientist, however, really could. Easily built fusion bombs probably won't, but they could potentially wipe out so many people that civilization collapses and the survivors are unable to rebuild (I don't think global climate change will have such a dramatic effect so quickly that this would happen; it'll be slower and people will adapt).
