

posted by Dopefish on Monday February 24 2014, @06:00AM   Printer-friendly
from the i-for-one-welcome-our-new-computer-overlords dept.

kef writes:

"By 2029, computers will be able to understand our language, learn from experience and outsmart even the most intelligent humans, according to Google's director of engineering Ray Kurzweil.

Kurzweil says:

Computers are on the threshold of reading and understanding the semantic content of a language, but not quite at human levels. But since they can read a million times more material than humans they can make up for that with quantity. So IBM's Watson is a pretty weak reader on each page, but it read the 200m pages of Wikipedia. And basically what I'm doing at Google is to try to go beyond what Watson could do. To do it at Google scale. Which is to say to have the computer read tens of billions of pages. Watson doesn't understand the implications of what it's reading. It's doing a sort of pattern matching. It doesn't understand that if John sold his red Volvo to Mary that involves a transaction or possession and ownership being transferred. It doesn't understand that kind of information and so we are going to actually encode that, really try to teach it to understand the meaning of what these documents are saying.

Skynet anyone?"
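Kurzweil's Volvo example is concrete enough to sketch: the point is that "John sold his red Volvo to Mary" *entails* facts the sentence never states, such as the transfer of ownership. A toy illustration of that kind of encoding (purely hypothetical, not Kurzweil's or Google's actual representation; the `Fact` structure and `encode_sale` helper are invented for this sketch):

```python
# Toy sketch of encoding the entailments of a "sold" event, rather than
# pattern-matching the surface text. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class Fact:
    subject: str
    relation: str
    obj: str

def encode_sale(seller: str, buyer: str, item: str) -> list:
    """Expand one sale event into the facts it implies."""
    return [
        Fact(seller, "sold", item),
        Fact(buyer, "bought", item),            # entailed: buying mirrors selling
        Fact(item, "owned_by", buyer),          # entailed: ownership transferred
        Fact(seller, "received_payment_from", buyer),
    ]

facts = encode_sale("John", "Mary", "red Volvo")
# The system can now answer "who owns the red Volvo?" even though the
# source sentence never says so directly.
owner = next(f.obj for f in facts if f.relation == "owned_by")
```

A pattern matcher sees only the word "sold"; the encoded form makes the downstream question answerable, which is the gap between Watson-style reading and what the quote describes.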

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1) by SlimmPickens on Monday February 24 2014, @01:50PM

    by SlimmPickens (1056) on Monday February 24 2014, @01:50PM (#5796)

    "There's a hell of a lot going on inside a neuron that is not going on inside a transistor (DNA, epigenetics)"

    While that surely matters if you're simulating everything, most practitioners of AGI are taking the algorithmic approach. We don't need to implement all that detail; we just need algorithms that describe the function of the brain regions. And since such a wide variety of animal brains work, I'm sure we can stray pretty far from what a human is and still have something equivalent or better.

    I think it's implicit that AIs run on computers. Obviously the speed of the computer is paramount.

  • (Score: 1) by drgibbon on Monday February 24 2014, @03:15PM

    by drgibbon (74) on Monday February 24 2014, @03:15PM (#5860) Journal

    "We don't need to implement all that detail; we just need algorithms that describe the function of the brain regions. And since such a wide variety of animal brains work, I'm sure we can stray pretty far from what a human is and still have something equivalent or better."

    Not sure what you meant here regarding animal vs human brains, my comment wasn't meant to be restricted to humans since DNA is found in all known life forms. Anyway, I don't think you can so easily dismiss such a fundamental part of the workings of sentient beings as "all that detail" (at least if you want to talk about consciousness). Simulation at the neural level might end up with some intelligent output/action, but that does not automatically equate with consciousness. By consciousness, I mean the entity's subjective knowing, or personal experience of itself in reality and its relationship within, and to, reality. This is not the same thing as intelligence. I can understand that faster computation might lead to higher intelligence (but obviously will not be sufficient alone), but I see no reason to link faster processing with consciousness per se.

    --
    Certified Soylent Fresh!
    • (Score: 1) by Namarrgon on Tuesday February 25 2014, @03:39AM

      by Namarrgon (1134) on Tuesday February 25 2014, @03:39AM (#6369)

      One could make the argument that consciousness ("self-awareness" at least) is largely a sufficiently detailed model of oneself, where "sufficiently" is a sliding scale and of course need not be complete or even completely accurate.

      We're making early progress on general computer models of the physical world (thinking more Cyc [wikipedia.org], Wolfram Alpha [wikipedia.org], Google's Knowledge Graph [wikipedia.org] here). It'll be interesting when those models start to move beyond generalities, then include the specific instance of the system they run on.
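The "model that includes the system it runs on" idea can be made concrete with a trivial sketch. This is purely illustrative of the comment's argument, not how Cyc, Wolfram Alpha, or the Knowledge Graph actually work; the `world_model` structure and `"self"` entry are invented here:

```python
# Minimal sketch: a world model that adds an entry for the very system
# holding the model. All names are illustrative, not from any real system.
world_model = {
    "sun": {"kind": "star", "emits": "light"},
    "volvo": {"kind": "car", "color": "red"},
}

# The step "beyond generalities": include the specific instance the model
# runs on. The snapshot of known entities is taken before "self" is added,
# so the self-model is incomplete -- which, per the comment above, it need
# not be complete or even completely accurate.
world_model["self"] = {
    "kind": "modelling system",
    "models": list(world_model.keys()),
}

# The system can now answer questions about itself with the same machinery
# it uses for everything else -- a crude stand-in for "self-awareness".
self_entry = world_model["self"]
```

The interesting property is only that the same query machinery covers "self" and "sun" alike; nothing here claims this amounts to consciousness, which is exactly the dispute in the replies below.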

      --
      Why would anyone engrave Elbereth?
      • (Score: 2, Insightful) by drgibbon on Tuesday February 25 2014, @04:08AM

        by drgibbon (74) on Tuesday February 25 2014, @04:08AM (#6380) Journal

        Those things do sound interesting, but the model is not the thing itself. No one would believe that a computer model of the physical world is that physical world, why should anyone think that a model of consciousness is consciousness? It is, well, a model!

        --
        Certified Soylent Fresh!
        • (Score: 1) by Namarrgon on Wednesday February 26 2014, @03:18AM

          by Namarrgon (1134) on Wednesday February 26 2014, @03:18AM (#7094)

          The model is not of consciousness, but of the entity itself as distinct from the environment around it. I'm suggesting that the inclusion of this "self" in the system of models is a major factor in what we call "consciousness".

          All animals model the environment around them, to gain understanding of how to use or avoid it. Is that scent good to eat? Is that shadow likely to eat me? Babies rapidly learn to do this too, and it's interesting to observe a child's environmental models growing in complexity.

          At a certain age (usually 2-3), toddlers start to include themselves in this system of models, to predict how they themselves will react to a situation. This comes with a growing awareness of "I" as distinct from "other" (which is usually reflected in their increasing use of personal pronouns), and enables them to grow beyond pure reaction and instinct towards making "conscious" decisions to manipulate the environment to better suit this new self they have become aware of. A fascinating process that most parents are familiar with.

          Of course, there may be many more important factors in consciousness at work, but at the least it strikes me as an approach that's worth exploring.

          --
          Why would anyone engrave Elbereth?
          • (Score: 1) by drgibbon on Wednesday February 26 2014, @07:14AM

            by drgibbon (74) on Wednesday February 26 2014, @07:14AM (#7170) Journal

            Understood. But again, it does not follow that a model of an entity that has consciousness will necessarily have consciousness itself. I agree that people and animals build what could be called cognitive models of the environment and of themselves, and it's very interesting, but that does not mean we should equate the phenomenon of conscious experience with these models. Although having a sense of self may be brought to mind when one thinks about consciousness, there are states of consciousness where the self does not even exist! So yes, self-awareness (in the usual sense of "my mind", "my body", "my life", and so on) can be divorced from conscious experience. What I was really getting at was your statement:

            "One could make the argument that consciousness ("self-awareness" at least) is largely a sufficiently detailed model of oneself".

            I would say that the fundamental aspects of consciousness are not captured by this definition. If we confuse the things that seem to rely on consciousness (e.g. self-awareness, intelligence, and so on) with the phenomenon of consciousness itself (i.e. the capacity to subjectively experience reality), we run into problems. Not only does the word consciousness cease to have any precise or useful meaning, but we are led down the (IMO) garden path of attributing consciousness to anything that can mimic these models.

            I think that the development of children and so on is a fascinating area, but studies in that direction would seem to be more properly called cognitive/developmental, rather than of consciousness per se.

            --
            Certified Soylent Fresh!
            • (Score: 1) by Namarrgon on Thursday February 27 2014, @09:29AM

              by Namarrgon (1134) on Thursday February 27 2014, @09:29AM (#7861)

              "the phenomena of consciousness itself (i.e. the capacity to subjectively experience reality)"

              Not really the definition I had in mind, but that's part of the problem; nobody really knows.

              I would have said that in order to subjectively experience anything, one had to be aware of oneself first, and be aware of the effect that experience has on oneself - which to me implies a self-model.

              But I'll happily concede my opinion is no better than any other, and we won't really know anything much for sure until we try. Which I guess was the original point; it's an approach worth trying, and we'll see how it turns out. At the least, we'll learn something.

              --
              Why would anyone engrave Elbereth?