A new way of creating a neural network using specially formulated memristors has been described by a team of researchers from Stony Brook University and the University of California Santa Barbara. The process has the potential to place an entire neural network on a single chip:
The system produced by the authors here involved only a 12-by-12 grid of memristors, so it's pretty limited in capacity. But Robert Legenstein, from Austria's Graz University of Technology, writes in an accompanying perspective that "If this design can be scaled up to large network sizes, it will affect the future of computing."
That's because there are still many problems where a neural network can easily outperform traditional computing hardware—and do so at a fraction of the energy cost. Even on a 30 nm process, it would be possible to place 25 million cells in a square centimeter, with 10,000 synapses on each cell. And all of that would dissipate about a watt.
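The density claim can be sanity-checked with quick arithmetic (a sketch; the cell and per-cell synapse counts are the figures quoted above):

```python
# Speculated density from the article: 25 million cells per square
# centimeter, each cell carrying 10,000 synapses, at roughly 1 W total.
cells_per_cm2 = 25_000_000
synapses_per_cell = 10_000

synapses_per_cm2 = cells_per_cm2 * synapses_per_cell
print(f"{synapses_per_cm2:.1e} synapses/cm^2 at ~1 W")  # 2.5e+11
```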
Training and operation of an integrated neuromorphic network based on metal-oxide memristors [abstract]
(Score: 2) by threedigits on Friday May 08 2015, @08:00AM
From the article, this prototype has 30 synapses. That means we would need 250 of them to simulate a C. elegans neural network (7,500 synapses), about 3 million to simulate an ant (10^8 synapses), and roughly 3 * 10^12 (3 trillion) for a human brain (10^14 synapses).
The speculated chip (not the one they actually built) would include 25 million cells, each with 10,000 synapses, for a total of 2.5 * 10^11 synapses, which is enough to simulate a small army of 2,500 ants (if I have those numbers right). For a human brain you would still need about 400 of them, so it would take a small room and a few hundred watts of power. But in 40-50 years our kids will be using one to keep the time in their watches.
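The commenter's scaling arithmetic can be reproduced in a few lines (a sketch; the synapse counts are the rough figures quoted in the comment, not measured values):

```python
# How many prototypes or speculated chips each nervous system would need,
# using the rough synapse counts quoted in the comment above.
prototype_synapses = 30                      # the 12x12 crossbar demo
chip_synapses = 25_000_000 * 10_000          # speculated chip: 2.5e11

for name, synapses in [("C. elegans", 7_500),
                       ("ant", 10**8),
                       ("human brain", 10**14)]:
    prototypes = synapses / prototype_synapses
    chips = synapses / chip_synapses
    print(f"{name}: {prototypes:.3g} prototypes, {chips:.3g} speculated chips")
```

The human-brain row works out to 400 of the speculated chips (10^14 / 2.5 * 10^11), so at roughly a watt per chip the power budget is a few hundred watts.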