kef writes:
"By 2029, computers will be able to understand our language, learn from experience and outsmart even the most intelligent humans, according to Google's director of engineering Ray Kurzweil.
Kurzweil says:
Computers are on the threshold of reading and understanding the semantic content of a language, but not quite at human levels. But since they can read a million times more material than humans they can make up for that with quantity. So IBM's Watson is a pretty weak reader on each page, but it read the 200m pages of Wikipedia.

And basically what I'm doing at Google is to try to go beyond what Watson could do. To do it at Google scale. Which is to say to have the computer read tens of billions of pages.

Watson doesn't understand the implications of what it's reading. It's doing a sort of pattern matching. It doesn't understand that if John sold his red Volvo to Mary that involves a transaction or possession and ownership being transferred. It doesn't understand that kind of information and so we are going to actually encode that, really try to teach it to understand the meaning of what these documents are saying.
Skynet anyone?"
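For the curious, here's roughly what "encoding" that Volvo sentence could look like. This is a toy sketch in Python; the class and field names are mine, not anything Watson or Google actually use:

    # Toy "semantic frame" for: "John sold his red Volvo to Mary."
    # Purely illustrative -- not how any real system represents this.
    from dataclasses import dataclass

    @dataclass
    class TransferOfOwnership:
        seller: str
        buyer: str
        item: str

        def implications(self):
            # Kurzweil's point: the sentence entails facts that
            # never appear in it verbatim.
            return [
                f"{self.buyer} now owns {self.item}",
                f"{self.seller} no longer owns {self.item}",
                f"a payment went from {self.buyer} to {self.seller}",
            ]

    sale = TransferOfOwnership(seller="John", buyer="Mary",
                               item="the red Volvo")
    for fact in sale.implications():
        print(fact)

The syntax isn't the point; the point is that the entailed facts ("Mary now owns the Volvo") appear nowhere in the sentence itself, which is exactly what pattern matching misses and what a structured encoding captures.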
(Score: 2, Insightful) by radu on Monday February 24 2014, @10:37AM
> After all, I don't see why a tool would need to be conscious to "understand that if John sold his red Volvo to Mary that involves a transaction or possession and ownership being transferred."
Maybe you don't see why, but Google surely does; in fact, that's exactly the kind of information Google wants.
(Score: 0) by Anonymous Coward on Tuesday February 25 2014, @03:49AM