posted by martyb on Saturday March 24 2018, @10:55PM   Printer-friendly
from the I-call-dibs...-Scarecrow dept.

A new version of the simulation algorithm used by NEST, a widely used spiking neural network simulator, could dramatically reduce the amount of memory required to run a whole human brain simulation, while increasing simulation speed on current supercomputers:

During the simulation, a neuron's action potentials (short electric pulses) first need to be sent to all 100,000 or so small computers, called nodes, each equipped with a number of processors doing the actual calculations. Each node then checks which of all these pulses are relevant for the virtual neurons that exist on this node.
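
To make that bookkeeping concrete, here is a minimal Python sketch of the broadcast-and-filter scheme described above; the class and variable names are illustrative and are not NEST's actual data structures.

    from typing import List, Set

    class Node:
        """One compute node that hosts a slice of the network's neurons."""

        def __init__(self, n_global_neurons: int, sources_with_local_targets: Set[int]):
            # One boolean per neuron in the *whole* network -- the per-neuron
            # bookkeeping the article refers to: does this source neuron
            # connect to anything simulated on this node?
            self.is_relevant = [gid in sources_with_local_targets
                                for gid in range(n_global_neurons)]
            self.local_spike_buffer: List[int] = []

        def receive_all_spikes(self, global_spikes: List[int]) -> None:
            # Every node receives every spike and discards the irrelevant ones.
            for gid in global_spikes:
                if self.is_relevant[gid]:
                    self.local_spike_buffer.append(gid)

    # Tiny usage example: a 10-neuron network in which only neurons 2 and 7
    # have targets on this node.
    node = Node(n_global_neurons=10, sources_with_local_targets={2, 7})
    node.receive_all_spikes([1, 2, 3, 7])
    print(node.local_spike_buffer)   # [2, 7]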

That process requires one bit of information per processor for every neuron in the whole network. For a network of one billion neurons, a large part of the memory in each node is consumed by this single bit of information per neuron. The amount of computer memory required per processor for these extra bits grows with the size of the neuronal network. To go beyond the roughly 1 percent of the brain that can be simulated today and model the entire human brain would require the memory available to each processor to be 100 times larger than in today's supercomputers.
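
A quick back-of-the-envelope calculation shows why that single bit per neuron dominates; the numbers below are illustrative, not taken from the paper.

    # Memory needed per processor just for the one-bit-per-neuron flags.
    n_neurons = 1_000_000_000     # 1 billion neurons, roughly 1% of a human brain
    bits_per_neuron = 1           # one relevance flag per neuron per processor
    flag_bytes = n_neurons * bits_per_neuron // 8

    print(f"{flag_bytes / 1e6:.0f} MB of flags per processor")   # 125 MB
    # Scaling from this ~1% up to the full brain (~86 billion neurons) pushes
    # the per-processor requirement up by roughly the factor of 100 the
    # article mentions.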

In future exascale computers, such as the post-K computer planned in Kobe and JUWELS at Jülich in Germany, the number of processors per compute node will increase, but the memory per processor and the number of compute nodes will stay the same.

Achieving whole-brain simulation on future exascale supercomputers is where the next-generation NEST algorithm comes in. At the beginning of the simulation, the new NEST algorithm will allow the nodes to exchange information about what data on neuronal activity needs to be sent and to where. Once this knowledge is available, the exchange of data between nodes can be organized such that a given node only receives the information it actually requires. That will eliminate the need for the additional bit for each neuron in the network.
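
The following hypothetical sketch illustrates the idea of such a setup-time exchange: receiving nodes announce which source neurons they actually need, and the owning node records whom to notify when one of its neurons fires. The function and argument names are invented for this example and are not NEST internals.

    from collections import defaultdict
    from typing import Dict, List, Set

    def build_send_table(owner_of: Dict[int, int],
                         needed_by_node: Dict[int, Set[int]]) -> Dict[int, Dict[int, List[int]]]:
        """owner_of: source neuron id -> node that simulates it.
        needed_by_node: node id -> source neuron ids its local synapses use.
        Returns send_table[owner][neuron_id] = nodes to notify when it spikes."""
        send_table: Dict[int, Dict[int, List[int]]] = defaultdict(lambda: defaultdict(list))
        for node_id, sources in needed_by_node.items():
            for gid in sources:
                send_table[owner_of[gid]][gid].append(node_id)
        return send_table

    # Usage: node 1 owns neuron 2; only node 0 needs its spikes, so node 1
    # sends them to node 0 alone instead of broadcasting to every node.
    table = build_send_table(owner_of={0: 0, 1: 0, 2: 1},
                             needed_by_node={0: {2}, 1: {0, 1}})
    print(table[1][2])   # [0]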

With memory consumption under control, simulation speed will then become the main focus. For example, a large simulation of 0.52 billion neurons connected by 5.8 trillion synapses running on the supercomputer JUQUEEN in Jülich previously required 28.5 minutes to compute one second of biological time. With the improved algorithm, the time will be reduced to just 5.2 minutes, the researchers calculate.
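
Plugging in the quoted figures gives a sense of the gain (values taken directly from the article):

    old_minutes = 28.5   # wall-clock minutes per second of biological time, old algorithm
    new_minutes = 5.2    # projected with the improved algorithm

    print(f"speedup: ~{old_minutes / new_minutes:.1f}x")            # ~5.5x
    print(f"still {new_minutes * 60:.0f}x slower than real time")   # ~312x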

Also at the Human Brain Project.

Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers (open, DOI: 10.3389/fninf.2018.00002) (DX)

Previously: Largest neuronal network simulation achieved using K computer (2013)


Original Submission

 
  • (Score: 2) by fyngyrz (6567) on Sunday March 25 2018, @02:32AM (#657765) Journal (2 children)

    The stated objective in the short term may be to simulate medical issues

    That's why I titled my post as I did. My interest is tangential.

  • (Score: 0) by Anonymous Coward on Sunday March 25 2018, @03:08PM (#657919) (1 child)

    I was really confused. I have a programmable thermostat that takes two "AA" batteries that I change whenever the clocks need to go forward or back.

    It has a tiny chip in it that lets me set the date and time. The NEST thing Google bought never really struck me as a Neuromancer-styled AI in some Google cloud.

    That whole robot home spying on you having sex so it can set the temperature appropriately based on the VagiVibe app you synced to the smart home system never seemed to need a human-brain-modeled AI to operate; it seemed to violate privacy without any effort or thought at all! And I thought humans already watched that stuff in the cloud?