Neural networks were all the rage for a while, but progress eventually slowed and interest cooled. Then, as computing power increased, the field experienced a renaissance, and deep learning became the next big thing.
Throughout this ebb and flow of interest, there has been an underlying, annoying fact: neural networks as currently implemented are not that great. Especially when you compare them with the brain of... well, pretty much any creature. Researchers have been trying to make neural networks that have all the advantages of the brain (and none of the disadvantages) for as long as the field has existed. And it may be that they've gone about it wrong. Now, some new work is suggesting that the only way to get the advantages of the brain is to accept the disadvantages as well.
(Score: 4, Insightful) by JoeMerchant on Sunday March 05 2017, @02:48PM (5 children)
It's an evolved system, optimized across hundreds of millions of generations with trillions of experimental trials running simultaneously throughout that period. Do you really think you can just "do it over" in silicon and metal and improve on the original by orders of magnitude? Within the next 5-10 years? In a couple of puny labs staffed by a few dozen people?
If "better brains" can be made of silicon and metal, odds are that evolution would have some examples of them already - there's plenty of silicon and metal available for organisms to use. Birds do use metals and magnetics for navigation. Several animals have harnessed large-scale electrical fields to their advantage. I'm not aware of any semiconductor- or transistor-like biological structures, probably because they cost so much to make and do so little for the maker.
Silicon/metal computational devices do what they do, and they do some "brain like" things much better than biological brains, but they are an emergent property of modern industrialized society, with enormous environmental requirements to create, operate and maintain them. Vacuum tubes (transistor-like amplifiers) are just over 100 years old, the transistor is just over 70, and LSI photolithography is only 46 years old - put another way: integrated circuits with more than 100 logic gates per chip came out the same year as James Bond Live and Let Die: 1971.
In the current (industrialized society) environment, computational hardware has evolved and proliferated at an astounding rate. (Commonly used) CPU clock speeds rocketed up to the 2-5 GHz range by 2006 and have since leveled off. Process feature sizes plummeted from millimeters to 14 nm and are just starting to level out. Cost of manufacture has dropped from 20 hours of minimum-wage work (circa 1975) to buy a digital watch with ~30 monochrome display segments and fewer than 1,000 transistors operating around 60 kHz, controlled by three buttons, to 20 hours of minimum-wage work (circa 2015) to buy a digital cell phone with >2 million tri-color display pixels and 3 billion transistors operating around 2 GHz, controlled by voice, touch gestures, multi-megapixel photo input, a nine-axis IMU, GPS, and a remote network connection. In other words, in 40 years the available computational power in a "cheap portable device" has increased by a factor of roughly 100 billion.
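For what it's worth, that back-of-the-envelope factor checks out. A minimal Python sketch, using the figures quoted above and treating transistor count times clock rate as a crude proxy for computational power:

```python
# Rough computational-power ratio between a ~1975 digital watch
# and a ~2015 smartphone, using the figures quoted above.
watch_transistors = 1_000          # ~1000 transistors
watch_clock_hz = 60_000            # ~60 kHz
phone_transistors = 3_000_000_000  # ~3 billion transistors
phone_clock_hz = 2_000_000_000     # ~2 GHz

factor = (phone_transistors * phone_clock_hz) / (watch_transistors * watch_clock_hz)
print(f"{factor:.0e}")  # → 1e+11, i.e. roughly 100 billion
```

The proxy ignores architecture entirely (pipelining, parallelism, memory bandwidth), so it's only an order-of-magnitude sanity check, not a real benchmark.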
And "true AI" has been 5 to 10 years away all that time. Asimov published his "three laws of Robotics" and "positronic brains" in 1942, 5 years before the transistor. With literally billions of these high-powered computational devices networked globally, we have created a new ecosystem in which software entities are quickly evolving - mostly through direct creation by people, but increasingly people are creating automatons of various degrees of autonomy, and as these automatons interact and evolve, new things will emerge, some of which we may not immediately understand: how they work, what they are doing, and what they might do in the future. There are no effective "laws of robotics" - we do attempt to control network propagation of some self-replicating entities, with varying degrees of success. Things are evolving which are starting to pass the Turing test, but they're different from biological brains, different from human consciousness. Some day, perhaps soon, they may become indistinguishable from human consciousness, but they will always be different. Unless/until they start replicating the same structure and limitations inherent in a biological brain, they will always be different, and if they are attempting to mimic the function of a biological brain, they will be less efficient.
Meanwhile, "artificial brains" are evolving into something entirely different, more adapted to the silicon/metal environment that we have created for them. Replicating the functions of biological brains is just a tiny fraction of what they can, and will, do - as long as there is an environment for them to do it in. We have been more than doubling their "available living space" every year since 1977 - and now they're beating us at poker. What's next?
Ukraine is still not part of Russia. Glory to Ukraine 🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
(Score: 0) by Anonymous Coward on Sunday March 05 2017, @07:54PM (1 child)
False. Nature wouldn't evolve such life if anything made it unsuitable for the environment - for example, the abundance of water that could harm the system, the rigidity of the system, or the inability of enzymes to produce it. The most successful organisms, bacteria, do not have brains at all.
Humans can create things that would never be produced by evolution due to their fragility. How about a hybrid created from an oversized disembodied brain and a computer, with no body and no capability for moving or feeding itself? It could be a "better brain" even if its fitness and reproductive capability are zero.
A lab can't do it in 10 years? Most of the hard work of miniaturizing computer components and nanoscale manufacturing processes has been done for them. They can use existing processes to create biomimetic plastic synapses or brain-computer interfaces.
(Score: 2) by JoeMerchant on Sunday March 05 2017, @08:27PM
I'll grant that the existing wondrous world of computing infrastructure can support new things not seen in nature, but if a micro-miniature silicon metal neural net that can drop-in cooperate with existing "wetware" were practical, there's a strong chance evolution would have tried it by now.
We're making "artificial eyes" and other brain-machine interface devices today, hideously complex and expensive things that are only a fraction as good as their natural counterparts, but since we can't make the natural counterparts, having a 0.2% efficient vision system is better than no vision system at all.
(Score: 2) by acid andy on Monday March 06 2017, @12:40AM (2 children)
Apologies for the pedantry but Live and Let Die came out in 1973!
Master of the science of the art of the science of art.
(Score: 2) by JoeMerchant on Monday March 06 2017, @01:30AM (1 child)
How right you are - Diamonds Are Forever was 1971, and I was even visualizing the satellite unfurling its reflector dish as I typed the comment - it's just that damn Paul McCartney theme song that won't leave me alone...
(Score: 2) by acid andy on Monday March 06 2017, @07:39PM
Y'know you did... Y'know you did... Y'know you did......