SoylentNews is people

posted by Dopefish on Monday February 24 2014, @06:00AM   Printer-friendly
from the i-for-one-welcome-our-new-computer-overlords dept.

kef writes:

"By 2029, computers will be able to understand our language, learn from experience and outsmart even the most intelligent humans, according to Google's director of engineering Ray Kurzweil.

Kurzweil says:

Computers are on the threshold of reading and understanding the semantic content of a language, but not quite at human levels. But since they can read a million times more material than humans they can make up for that with quantity. So IBM's Watson is a pretty weak reader on each page, but it read the 200m pages of Wikipedia. And basically what I'm doing at Google is to try to go beyond what Watson could do. To do it at Google scale. Which is to say to have the computer read tens of billions of pages. Watson doesn't understand the implications of what it's reading. It's doing a sort of pattern matching. It doesn't understand that if John sold his red Volvo to Mary that involves a transaction or possession and ownership being transferred. It doesn't understand that kind of information and so we are going to actually encode that, really try to teach it to understand the meaning of what these documents are saying.

Skynet anyone?"

This discussion has been archived. No new comments can be posted.
  • (Score: 1) by Namarrgon (1134) on Wednesday February 26 2014, @03:18AM (#7094)

    The model is not of consciousness, but of the entity itself as distinct from the environment around it. I'm suggesting that this inclusion of a "self" in the system of models is a major factor in what we call "consciousness".

    All animals model the environment around them, to gain understanding of how to use or avoid it. Is that scent good to eat? Is that shadow likely to eat me? Babies rapidly learn to do this too, and it's interesting to observe a child's environmental models growing in complexity.

    At a certain age (usually 2-3), toddlers start to include themselves in this system of models, to predict how they themselves will react to a situation. This comes with a growing awareness of "I" as distinct from "other" (which is usually reflected in their increasing use of personal pronouns), and enables them to grow beyond pure reaction and instinct towards making "conscious" decisions to manipulate the environment to better suit this new self they have become aware of. A fascinating process that most parents are familiar with.

    Of course, there may be many more important factors in consciousness at work, but at the least it strikes me as an approach that's worth exploring.
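    To make the suggestion concrete, here is a purely hypothetical toy sketch (every name in it is invented for illustration, not taken from any real system): an agent whose world model is just a collection of per-entity predictors, with the agent itself registered as one more entity, so it can predict its own reaction to a stimulus before acting on it.

```python
# Toy illustration only: a world model as a dict of per-entity
# reaction predictors, with the agent's "self" as one more entry.

class Agent:
    def __init__(self):
        # Models of external entities: name -> predicted reaction to a stimulus.
        self.models = {
            "shadow": lambda stimulus: "flee" if stimulus == "looms" else "ignore",
        }
        # The self-model is just another entry in the same system of models.
        self.models["self"] = self.predict_own_reaction

    def predict_own_reaction(self, stimulus):
        # Crude "instinct": approach food scents, avoid everything else.
        return "approach" if stimulus == "scent" else "avoid"

    def decide(self, stimulus):
        # The "conscious" step: consult the self-model, but let the
        # environment model override pure instinct when it predicts danger.
        predicted = self.models["self"](stimulus)
        threat = self.models["shadow"](stimulus)
        if threat == "flee":
            return "flee"
        return predicted

agent = Agent()
print(agent.decide("scent"))  # instinct wins: approach
print(agent.decide("looms"))  # environment model overrides: flee
```

    Nothing about this sketch produces consciousness, of course; it only shows what "including oneself in the system of models" might mean mechanically.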

    Why would anyone engrave Elbereth?
  • (Score: 1) by drgibbon (74) on Wednesday February 26 2014, @07:14AM (#7170) Journal

    Understood. But again, it does not follow that a model of an entity that has consciousness will necessarily have consciousness itself. I agree that people and animals build what could be called cognitive models of the environment and of themselves, and it's very interesting, but it does not mean that we should equate the phenomena of conscious experience with these models. Although having a sense of self may be brought to mind when one thinks about consciousness, there are states of consciousness where the self does not even exist! So yes, self-awareness (in the usual sense of "my mind", "my body", "my life", and so on) can be divorced from conscious experience. What I was really getting at there was your statement:

    "One could make the argument that consciousness ("self-awareness" at least) is largely a sufficiently detailed model of oneself".

    I would say that the fundamental aspects of consciousness are not captured by this definition. If we confuse the things that seem to rely on consciousness (e.g. self-awareness, intelligence, and so on) with the phenomena of consciousness itself (i.e. the capacity to subjectively experience reality), we run into problems. Not only does the word consciousness cease to have any precise or useful meaning, but we are led down the (IMO) garden path of attributing consciousness to anything that can mimic these models.

    I think that the development of children and so on is a fascinating area, but studies in that direction would seem to be more properly called cognitive/developmental, rather than of consciousness per se.

    Certified Soylent Fresh!
    • (Score: 1) by Namarrgon (1134) on Thursday February 27 2014, @09:29AM (#7861)

      "the phenomena of consciousness itself (i.e. the capacity to subjectively experience reality)"
      Not really the definition I had in mind - but that's part of the problem; nobody really knows.

      I would have said that in order to subjectively experience anything, one had to be aware of oneself first, and be aware of the effect that experience has on oneself - which to me implies a self-model.

      But I'll happily concede my opinion is no better than any other, and we won't really know anything much for sure until we try. Which I guess was the original point; it's an approach worth trying, and we'll see how it turns out. At the least, we'll learn something.

      Why would anyone engrave Elbereth?