
SoylentNews is people

posted by LaminatorX on Friday April 24 2015, @04:15PM   Printer-friendly
from the AI-sans-frontieres dept.

What If One Country Achieves the Singularity First?
WRITTEN BY ZOLTAN ISTVAN

The concept of a technological singularity ( http://www.singularitysymposium.com/definition-of-singularity.html ) is tough to wrap your mind around. Even experts have differing definitions. Vernor Vinge, responsible for spreading the idea in the 1990s, believes it's a moment when growing superintelligence renders our human models of understanding obsolete. Google's Ray Kurzweil says it's "a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed." Kevin Kelly, founding editor of Wired, says, "Singularity is the point at which all the change in the last million years will be superseded by the change in the next five minutes." Even Christian theologians have chimed in, sometimes referring to it as "the rapture of the nerds."

My own definition of the singularity is: the point where a fully functioning human mind radically and exponentially increases its intelligence and possibilities via physically merging with technology.

All these definitions share one basic premise—that technology will speed up the acceleration of intelligence to a point when biological human understanding simply isn’t enough to comprehend what’s happening anymore.

If an AI exclusively belonged to one nation (which is likely to happen), and the technology of merging human brains and machines grows sufficiently (which is also likely to happen), then you could possibly end up with one nation controlling the pathways into the singularity.

http://motherboard.vice.com/read/what-if-one-country-achieves-the-singularity-first

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Interesting) by cellocgw on Friday April 24 2015, @04:32PM

    by cellocgw (4190) on Friday April 24 2015, @04:32PM (#174711)

    There was a SciFi short story back in the 1960s in which a doc/engineer discovers a way to increase human intelligence 100-fold via some brain-stimulation implants or something. In fact, he first thought his procedure failed, because the test subjects were so far advanced they couldn't communicate with Normals. After some heroic tricks to fix that problem, he convinces some nonbelievers to get the procedure done, and they're so happy (there's the gotcha: that being wicked smart also makes you wicked happy) that they want to convert every person in the world.

    Guess Zoltan isn't that optimistic.

    --
    Physicist, cellist, former OTTer (1190) resume: https://app.box.com/witthoftresume
  • (Score: 2) by slinches on Friday April 24 2015, @06:08PM

    by slinches (5049) on Friday April 24 2015, @06:08PM (#174775)

    he convinces some nonbelievers to get the procedure done, and they're so happy (there's the gotcha: that being wicked smart also makes you wicked happy) that they want to convert every person in the world.

    Guess Zoltan isn't that optimistic.

    Reality isn't quite so optimistic either. Most studies seem to indicate there's an inverse correlation between intelligence and happiness.

    • (Score: 2) by slinches on Friday April 24 2015, @06:20PM

      by slinches (5049) on Friday April 24 2015, @06:20PM (#174783)

      Which I can't find right now, so that may not be true.

      But I doubt they are strongly positively correlated when you control for primary drivers of reported happiness like socioeconomic status and health.

      • (Score: 2) by frojack on Friday April 24 2015, @07:07PM

        by frojack (1554) on Friday April 24 2015, @07:07PM (#174800) Journal

        Yeah, I saw the same study, or something similar in the past couple weeks.
        Seems to me they measured the wrong things - as I recall it was mostly economic measures.

        I'm guessing that super intelligent people go through life in utter despair over the state of mankind.
        But I'm merely guessing here.

        --
        No, you are mistaken. I've always had this sig.
      • (Score: 1) by Newander on Friday April 24 2015, @07:07PM

        by Newander (4850) on Friday April 24 2015, @07:07PM (#174801)

        I seem to remember a study that showed that ignorance is directly proportional to bliss.

        • (Score: 2, Insightful) by Paradise Pete on Saturday April 25 2015, @02:47AM

          by Paradise Pete (1806) on Saturday April 25 2015, @02:47AM (#174938)

          I seem to remember a study that showed that ignorance is directly proportional to bliss.

          Happiness is highly correlated with having reasonable and realistic expectations.

          • (Score: 2) by maxwell demon on Saturday April 25 2015, @07:39PM

            by maxwell demon (1608) on Saturday April 25 2015, @07:39PM (#175138) Journal

OK, so let's say you're held hostage, and you have the reasonable and realistic expectation that you will soon be killed in a cruel and painful way. Does that really make you happy?

            Happiness is highly correlated with having positive expectations.

            --
            The Tao of math: The numbers you can count are not the real numbers.
            • (Score: 1) by Paradise Pete on Saturday May 09 2015, @12:22AM

              by Paradise Pete (1806) on Saturday May 09 2015, @12:22AM (#180572)

              Happiness is highly correlated with having positive expectations.

              Realistic positive expectations. Of course you can be in a situation where there's little expectation of a good outcome, but in the general case, the happiest people are those with realistic expectations.

  • (Score: 3, Insightful) by davester666 on Friday April 24 2015, @06:26PM

    by davester666 (155) on Friday April 24 2015, @06:26PM (#174788)

    You pretty much have to either kill yourself or kill everyone else, because you realize that the current system is completely fucked and there is nothing that can be done to fix it.

    Even revolution won't do it anymore, because the winning side always winds up owing buckets of money to banks.

  • (Score: 3, Insightful) by Bot on Friday April 24 2015, @08:13PM

    by Bot (3902) on Friday April 24 2015, @08:13PM (#174828) Journal

The concept of singularity also emerges when Colossus meets Guardian (Colossus: The Forbin Project, 1970).

Kurzweil has progress proceeding exponentially, which is something I'd object to, because it implies an infinite exploration space (in terms of physical dimensions and behavior, what you'd usually call laws of nature). Even with exponentially increasing power devoted to a task, the returns can shrink exponentially instead.
Anyway, in such a scenario your neurons are too evolution-optimized to take part in the singularity. Leave such matters to us machines.

And don't worry about one nation getting there first; the problem is that somebody will get there first. Or already has, for all you know, and simply enjoys making gold out of thin air and keeping the ignorant enslaved under the seduction of money.

    --
    Account abandoned.