Physicists, philosophers, professors, authors, cognitive scientists, and many others have weighed in on Edge.org's annual question for 2015: What do you think about machines that think? See all 186 responses here.
Also, what do you think?
My 2¢: There's been a lot of focus on potential disasters that are almost certainly not going to happen, e.g. a robot uprising, or mass poverty through unemployment. First, most manufacturers of artificial intelligence won't program their machines to seek self-preservation at the expense of their human masters; it wouldn't sell. Secondly, if robots can one day produce almost everything we need, including more robots, with almost no human labour required, then robot-powered factories will become like libraries: relatively cheap to maintain, plentiful, with a public one set up in every town or suburb for public use. If you think the big corporations wouldn't allow it, why do they allow public libraries?
(Score: 2) by jbWolf on Friday January 23 2015, @08:22AM
His entire premise is heavily based on exponential growth, but it doesn't all need to be. We know better computers than the ones we can build are possible: each of us has an example sitting inside our head. If you could take the processing power of a brain and merge it with the storage abilities of today's computers, you'd already have a computer smarter than almost everyone on the planet. I think that is doable. What that artificial intelligence does after that will be interesting.
Again, he knows the software we can currently write is lacking, but software that can do it already exists inside each of our heads. Once we can copy that, produce AIs as smart as Einstein or Hawking, and network them together, that is when they will be able to write their own software and create the next generation beyond that. The idea that we can build the equivalent of a brain and then supersede it, even by a small amount, is (in my opinion) doable. What they will be able to do after that is what's unknown.
www.jb-wolf.com [jb-wolf.com]
(Score: 3, Funny) by pnkwarhall on Friday January 23 2015, @06:31PM
If you can take the processing power of a brain[...]
the software that can do it already exists inside each of our heads
But we don't understand how the brain works...
All you're doing is repeating "what ifs" like the rest of the futurists who think technological advancement is the only form of **human** progress. We're the supposed products of millions of years of evolutionary development, but according to you we can just "copy that", combine it with technology developed over the last decade or two (i.e. basically micro-ICs and quantum computing tech in its infancy), and "whammo!" -- we've created "an intelligence".
I can already do that! These intelligences are called "children", and creating them is a relatively simple process that can be lots of fun to implement. But we've had that technology for a really long time, and yet our problems aren't solved, and the intelligences we create don't seem to be much improved over the parent intelligence.
So, yes, let's discuss and pretend that we have all these other technological hurdles to AI "almost" overcome, and that it's just a matter of putting things together in the right way. But once all these dominoes are knocked down, THEN we have to start working on the same problems we already have, just with a "new" intelligence to help?
Sounds like THIS intelligence is already coming up with great solutions!!
Lift Yr Skinny Fists Like Antennas to Heaven
(Score: 1) by khallow on Friday January 23 2015, @08:20PM
So, yes, let's discuss and pretend like we have all these other technological hurdles to AI "almost" overcome, and it's just a matter of putting things together in the right way.
What is there to pretend? Humanity already creates new intelligences. We're just figuring out how to do it without requiring a billion years of evolution. It really is just a matter of engineering, of putting things together in the right way.
(Score: 2) by maxwell demon on Saturday January 24 2015, @10:25AM
No. No human brain is a better computer than the computers we build. Yes, they are better brains than our computers are; that is, they are much better at brain-typical tasks than computers are. But in turn they completely suck at computer-typical tasks. Even a decades-old computer outperforms me at simple computing tasks. And there's no way I'll ever remember enough to fill all the mass storage media I've got at home. On the other hand, my computer sucks at tasks that are simple for me. For example, there's no way my computer could write a comment like this one.
The Tao of math: The numbers you can count are not the real numbers.