The idea of a thinking machine is an amazing one. It would be like humans creating artificial life, only more impressive because we would be creating consciousness. Or would we? It's tempting to think that a machine that could think would think like us. But a bit of reflection shows that's not an inevitable conclusion.
To begin with, we'd better be clear about what we mean by "think". A comparison with human thinking might be intuitive, but what about animal thinking? Does a chimpanzee think? Does a crow? Does an octopus?
The philosopher Thomas Nagel said that there was "something that it is like" to have conscious experiences. There's something that it is like to see the colour red, or to go water skiing. We are more than just our brain states.
Could there ever be "something that it's like" to be a thinking machine? In an imagined conversation with the first intelligent machine, a human might ask "Are you conscious?", to which it might reply, "How would I know?".
http://theconversation.com/what-does-it-mean-to-think-and-could-a-machine-ever-do-it-51316
[Related Video]: They're Made Out of Meat
(Score: 2) by acid andy on Friday January 08 2016, @05:22PM
I think I see what you're getting at. I prefer Nagel's original words personally, because the word "feeling" is very open to being misinterpreted. People will start to associate "feeling" with simple physical processes like nerve impulses and the brain chemistry that accompanies emotions. Analysis of those things usually excludes the direct, subjective experience of the feeling that consciousness seems to entail. You can only directly experience your own pleasure or pain or experience of colour (these fundamental conscious experiences are often called "qualia", by the way). An analysis of all the electrical impulses and chemical processes going on in another individual's brain during such an experience would still seem to leave out something of that experience. That's what Nagel's getting at.
I personally think it might be theoretically possible (though it probably isn't) to objectively model how someone experiences the colour red or the sensation of heat, if we could somehow crack the code of the entire internal representation of it in their neurons and synapses. For me, that still leaves consciousness itself unexplained, because consciousness is the person's first-person point of view.