
posted by janrinok on Tuesday April 24 2018, @07:49PM   Printer-friendly
from the not-sure-that-threats-will-work dept.

President Rouhani warns that White House failure to uphold Iran nuclear deal would prompt firm reaction from Tehran.

Iranian President Hassan Rouhani has called on US President Donald Trump to uphold the 2015 nuclear deal between Iran and six world powers, or "face severe consequences". 

In a televised speech, Rouhani said the "Iranian government will react firmly" if the White House fails to "live up to their commitments" under the agreement. 

The warning comes weeks in advance of a May 12 deadline for Trump to renew the deal.

The US president has previously said he would scrap the Joint Comprehensive Plan of Action (JCPOA), which he has called the "worst deal in history", unless "a better option" is presented to him. 

[...] The framework for the landmark deal was reached in Lausanne, Switzerland, in April 2015; the final agreement, concluded with China, Russia, France, Great Britain, Germany and the US, offered Iran more than $110bn a year in sanctions relief and a return to the global economy in exchange for halting its drive for a nuclear weapon.


Original Submission

 
  • (Score: 2) by JNCF on Wednesday April 25 2018, @12:11AM (4 children)

    by JNCF (4317) on Wednesday April 25 2018, @12:11AM (#671429) Journal

That said, some government is *NECESSARY*, but there's no effective way of constraining the top level of government in its quest for additional power.

    How big (in terms of geography or number of citizens vicariously controlled, take your pick) would you like the top level of government to be? Put another way, how many top level governments would you like to see coexisting in the world?

    Starting Score:    1  point
    Karma-Bonus Modifier   +1  

    Total Score:   2  
  • (Score: 2) by HiThere on Wednesday April 25 2018, @05:22PM (3 children)

    by HiThere (866) Subscriber Badge on Wednesday April 25 2018, @05:22PM (#671707) Journal

    That question depends on too many unspecified factors.

OTOH, if there are going to be nuclear weapons, etc., around, the only potentially safe (hah! Check the history of China) number is one. Unfortunately, history shows that about one king in five will be some kind of wacko. In the Chinese case, the emperor's megalomania was such that he considered all history before his reign a personal affront, and he spent so much effort destroying it that we know little of Chinese history before his time.

So basically my only answer is wait for the Singularity, hope that it comes in time, and also hope that we live through that transition of power. I give us a 50% chance of surviving the transition to SuperHuman AI in control. But if we survive that, there won't be any more such problems. OTOH, if multiple human governments controlling nuclear weapons remain the condition, I estimate we have a 30% chance of surviving the century...and I hope I'm not being optimistic. And then we have the problem of surviving the next century.

*IF* SuperHumanAI doesn't show up, then our only semi-long-term hope for survival is multiple self-sufficient mobile space colonies. But those are pretty slow in showing up, and require good social engineering, a nearly closed ecosystem, and probably fusion power, though possibly fission power could be made to work for a while. I consider SuperHumanAI to be much more likely to get here first. (My estimate of a 50% chance of surviving the transition to power of the SuperHumanAI is based on the problem of its goals when created, and on the fact that it will extrapolate those goals far beyond what the designers conceived. [Please note: This already happens with nearly every complex program, so that's not unexpected. Unintelligent chess programs play games that their designers could never contemplate.])

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
    • (Score: 2) by JNCF on Wednesday April 25 2018, @06:50PM (2 children)

      by JNCF (4317) on Wednesday April 25 2018, @06:50PM (#671781) Journal

      I mostly agree with your analysis. I think I might see the potential pitfalls of centralization as being even more distasteful than you do -- even in a best case scenario where everything is run by AI that doesn't kill us all -- and I might be more willing to roll the dice on the existential destruction of life on Earth to avoid those pitfalls. I also think that while superhuman AI does allow complete centralization in a way that hasn't been achievable before, we could also have a future where multiple superhuman AIs exist as competing actors that coexist through MAD game theory.

      • (Score: 2) by HiThere on Wednesday April 25 2018, @11:30PM (1 child)

        by HiThere (866) Subscriber Badge on Wednesday April 25 2018, @11:30PM (#671940) Journal

You're making lots of assumptions that may or may not be valid about the way a SuperHumanAI would handle things. I don't think you can really presume that it's going to be coercive, centralizing, or much of anything else. This partially depends on what it's developed out of, and entirely depends on what its goals are. But, as I said, even knowing the goals we wouldn't necessarily be able to predict its choices. Except in very simple cases...and even then, edge cases show up where you don't expect them.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
        • (Score: 2) by JNCF on Thursday April 26 2018, @03:22AM

          by JNCF (4317) on Thursday April 26 2018, @03:22AM (#672007) Journal

          In the line I think you're referring to I said "could," not "would." I think I'm pretty open minded about what could happen with a superhuman AI, though there are some scenarios I judge to be more likely than others.