Photonic circuits are a promising technology for neural networks because they make it possible to build energy-efficient computing units. For years, the Politecnico di Milano has been developing programmable photonic processors integrated on silicon microchips only a few mm² in size for data transmission and processing, and these devices are now being used to build photonic neural networks:
"An artificial neuron, like a biological neuron, must perform very simple mathematical operations, such as addition and multiplication, but in a neural network consisting of many densely interconnected neurons, the energy cost of these operations grows exponentially and quickly becomes prohibitive. Our chip incorporates a photonic accelerator that allows calculations to be carried out very quickly and efficiently, using a programmable grid of silicon interferometers. The calculation time is equal to the transit time of light in a chip a few millimeters in size, so we are talking about less than a billionth of a second (0.1 nanoseconds)," says Francesco Morichetti, Head of the Photonic Devices Lab of the Politecnico di Milano.
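The "simple mathematical operations" the quote refers to are weighted sums: one multiplication per input plus a running addition (a multiply-accumulate). A minimal illustrative sketch, with all names my own rather than anything from the paper:

```python
# Minimal sketch of the multiply-accumulate (MAC) operation an
# artificial neuron performs: a weighted sum of inputs plus a bias,
# followed by a nonlinearity. Names are illustrative only.
def neuron(inputs, weights, bias):
    # One multiplication per input, one running addition -- the
    # operations a photonic interferometer mesh carries out in the
    # transit time of light through the chip.
    total = bias
    for x, w in zip(inputs, weights):
        total += x * w
    # Simple ReLU nonlinearity
    return max(0.0, total)

print(neuron([1.0, 0.5, -2.0], [0.25, 0.5, 0.125], 0.0625))  # 0.3125
```

In a network of densely interconnected neurons, every neuron repeats this loop over every input, which is why the operation count (and the electronic energy cost) climbs so quickly with network size.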
"The advantages of photonic neural networks have long been known, but one of the missing pieces to fully exploit their potential was network training. It is like having a powerful calculator but not knowing how to use it. In this study, we succeeded in implementing training strategies for photonic neurons similar to those used for conventional neural networks. The photonic 'brain' learns quickly and accurately and can achieve precision comparable to that of a conventional neural network, but faster and with considerable energy savings. These are all building blocks for artificial intelligence and quantum applications," adds Andrea Melloni, Director of Polifab, the Politecnico di Milano micro- and nanotechnology center.
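For context, the conventional training strategy the quote compares against is gradient descent via backpropagation. A toy sketch for a single linear neuron follows; the paper's in situ method measures gradients optically inside the chip, which this purely electronic example does not model:

```python
# Hedged sketch of conventional gradient-descent training for one
# linear neuron -- the kind of strategy the photonic training is
# compared to. Hyperparameters and the toy task are invented here.
def train(samples, lr=0.1, epochs=200):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = w * x + b       # forward pass
            err = y - target    # gradient of 0.5 * (y - target)^2 w.r.t. y
            w -= lr * err * x   # backpropagated weight update
            b -= lr * err       # bias update
    return w, b

# Toy task: learn y = 2x + 1 from three samples.
w, b = train([(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)])
print(round(w, 3), round(b, 3))  # converges to roughly 2.0 and 1.0
```

The point of the in situ photonic approach is that these forward and backward passes can be evaluated in the interferometer mesh itself rather than on an external digital computer.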
Originally spotted on The Eponymous Pickle.
Journal Reference: Sunil Pai et al, Experimentally realized in situ backpropagation for deep learning in photonic neural networks, Science (2023). DOI: 10.1126/science.ade8450
Related: New Chip Can Process and Classify Nearly Two Billion Images Per Second
(Score: 2) by maxwell demon on Friday May 05, @06:45PM (1 child)
I guess that depends on the scale you are looking at.
In the short term, to my understanding (which may well be wrong, since I'm not an expert in that field), it is indeed just quite simple operations. Basically, signals are transported (rather slowly, as the transport is not like current in a wire but a very active process, which probably explains the high energy consumption, especially since a certain amount of signal is typically also sent in the "neutral" state), and at the synapses they are translated into chemical signals, which in turn are collected by the next neuron. I guess that can be modelled by relatively simple mathematical formulas.
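The "relatively simple formulas" gestured at here are often rendered as a threshold model in textbooks. A minimal, purely illustrative sketch (not a claim about how real neurons work):

```python
# Sketch of a McCulloch-Pitts-style threshold model: synaptic
# inputs are weighted, summed, and the neuron fires if the sum
# crosses a threshold. Real neurons are far messier; the names
# and numbers here are made up for illustration.
def fires(synaptic_inputs, synaptic_weights, threshold=1.0):
    activation = sum(x * w for x, w in zip(synaptic_inputs, synaptic_weights))
    return activation >= threshold

print(fires([1, 1, 0], [0.6, 0.5, 0.9]))  # True: 0.6 + 0.5 crosses 1.0
print(fires([1, 0, 0], [0.6, 0.5, 0.9]))  # False: 0.6 alone does not
```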
In the long term, the brain grows new synapses or removes existing ones; I strongly doubt that this is a mathematically simple process (I have no idea how the brain decides where to grow a new synapse, though).
The Tao of math: The numbers you can count are not the real numbers.
(Score: 0) by Anonymous Coward on Saturday May 06, @02:20AM
Yeah, I'm pretty sure that bit ain't simple math either...
So similarly, in most cases the neuron is going: nothing new, if A happens, I do B. It's like using artificial neural nets with already-trained weights.
Yes, it seems to behave more intelligently after the "correct" weights are assigned, but the real intelligence is in how the weights and connections got assigned in the first place.
But I guess for AI applications, even running already-trained networks can consume quite a lot of power? So this photonic stuff is supposed to help.
I'm pretty impressed by crow brains - small brains and yet relatively very intelligent.