Every task we perform on our computer — whether number crunching, watching a video, or typing out an article — requires different components of the machine to interact with one another. "Communication is massively crucial for any computation," says former SFI Graduate Fellow Abhishek Yadav, a Ph.D. scholar at the University of New Mexico. But scientists don't fully grasp how much energy computational devices spend on communication.
Over the last decade, SFI Professor David Wolpert has spearheaded research to unravel the principles underlying the thermodynamic costs of computation. Wolpert notes that determining the "thermodynamic bounds on the cost of communication" is an overlooked but critical issue in the field, as it applies not only to computers but also to communication systems across the board. "They are everything that holds up modern society," he says.
Now, a new study in Physical Review Research, co-authored by Yadav and Wolpert, sheds light on the unavoidable heat dissipation that occurs when information is transmitted across a system, challenging an earlier view that, in principle, communication incurs no energetic cost. For the study, the researchers drew on and combined principles from computer science, communication theory, and stochastic thermodynamics, a branch of statistical physics that deals with real-world out-of-equilibrium systems such as smartphones and laptops.
Using a logical abstraction of generic communication channels, the researchers determined the minimum amount of heat a system must dissipate to transmit one unit of information. This abstraction could apply to any communication channel — artificial (e.g., optical cable) or biological (e.g., a neuron firing a signal in the brain). Real-world communication channels always have some noise that can interfere with the information transmission, and the framework developed by Yadav and Wolpert shows that the minimum heat dissipation is at least equal to the amount of useful information — technically called mutual information — that filters through the channel's noise.
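The "useful information that filters through the channel's noise" can be made concrete with a textbook example. Below is a minimal sketch (not the paper's derivation) computing the mutual information of a binary symmetric channel; per the article's result, this quantity, in units of kT ln 2 per bit, lower-bounds the heat the channel must dissipate per use. The function names and the 50/50 input distribution are illustrative assumptions.

```python
# Sketch: mutual information I(X;Y) through a binary symmetric channel (BSC).
# The article states the minimum dissipated heat is at least this quantity
# (in units of kT ln 2 per transmitted bit). Not the paper's exact derivation.
import math

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(flip_prob, p_one=0.5):
    """I(X;Y) for a BSC with crossover probability flip_prob, P(X=1)=p_one."""
    f = flip_prob
    p_y1 = p_one * (1 - f) + (1 - p_one) * f   # P(Y=1)
    return h2(p_y1) - h2(f)                    # I(X;Y) = H(Y) - H(Y|X)

# A noiseless channel carries a full bit per use; pure noise carries nothing.
print(bsc_mutual_information(0.0))  # 1.0 bit
print(bsc_mutual_information(0.5))  # 0.0 bits
print(bsc_mutual_information(0.1))  # somewhere in between
```

The noisier the channel, the less useful information gets through, and, by the study's bound, the less heat must be dissipated per channel use.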
Then, they used another broadly applicable abstraction of how modern-day computers perform computations to derive the minimum thermodynamic costs associated with encoding and decoding. Encoding and decoding steps ensure reliable transmission of messages by mitigating channel noise. Here, the researchers gained a significant insight: improving the accuracy of data transmission through better encoding and decoding algorithms comes at the cost of increased heat dissipation within the system.
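The accuracy-versus-dissipation tradeoff can be illustrated with the simplest error-correcting code. The sketch below (my example, not the paper's) uses an n-fold repetition code over a binary symmetric channel with majority-vote decoding: more repetitions drive the decoding error down, but each one is an extra channel use, and by the study's result each use carries a minimum heat cost.

```python
# Sketch of the accuracy-vs-energy tradeoff: n-fold repetition code over a
# BSC with flip probability f, decoded by majority vote (n odd). Fewer
# decoding errors come at the price of more channel uses -- and, per the
# article, more unavoidable dissipation. Illustrative example only.
from math import comb

def majority_error(n, f):
    """P(majority-vote decoding fails): more than half the n copies flipped."""
    return sum(comb(n, k) * f**k * (1 - f)**(n - k)
               for k in range((n + 1) // 2, n + 1))

f = 0.1
for n in (1, 3, 5, 7):
    print(n, majority_error(n, f))  # error rate falls as n (energy cost) rises
```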
Uncovering the unavoidable energy costs of sending information through communication channels could help build energy-efficient systems. Yadav reckons that the von Neumann architecture used in current computers incurs significant energy costs for communication between the CPU and memory. "The principles that we are outlining can be used to draw inspiration for future computer architecture," he says.
As these energy costs apply to all communication channels, the work presents a potential avenue for researchers to deepen the understanding of various energy-hungry complex systems where communication is crucial, from biological neurons to artificial logical circuits. Despite burning 20% of the body's calorie budget, the brain uses energy far more efficiently than artificial computers do, says Yadav. "So it would be interesting to see how natural computational systems like the brain are coping with the cost associated with communication."
Journal Reference: Abhishek Yadav and David Wolpert, "Minimal thermodynamic cost of communication," Phys. Rev. Research 7, 043324 (published 22 December 2025). DOI: https://doi.org/10.1103/qvc2-32xr
(Score: 5, Funny) by Runaway1956 on Saturday January 03, @02:52AM (2 children)
Who remembers the '70s and '80s, with all the gas-saving devices on the market? You could get an air cooler that would make your car 20% more efficient, a magnetic fuel alignment thing to get another 20% efficiency, new tires that promised 10% more efficiency, and on and on it went. It got to the point that I had to stop at a gas station every 400 miles, to drain my gas tank into the gas station's big tank!
I hope these guys don't do the same with my computers! Imagine, having to ground your cellphone to something metallic every two or three days to prevent the battery exploding!
I don't mind saving energy, but there are limits.
ICE is having a Pretti Good season.
(Score: 3, Interesting) by krishnoid on Saturday January 03, @04:44AM
I heard about some people who said the carburetors could be way more efficient. But if you get more efficiency out of combustion, doesn't that just mean that the gas burns more completely ... generating more heat? Great! Your car now generates more heat, but does that translate to more splodey force?
(Score: 1, Funny) by Anonymous Coward on Saturday January 03, @02:29PM
It's like the cheat code to become rich off of automobile insurance companies. GEICO says their customers saved an average of hundreds of dollars switching to GEICO. Progressive says the same thing, and State Farm too. If we low ball it and assume "hundreds" is at least $200, then you can pull in over $70k a year just by switching your automobile insurance provider every day!
(Score: 3, Interesting) by bzipitidoo on Saturday January 03, @07:13AM (4 children)
A lot of complexity theory assumes that storing and retrieving data is an O(1) operation. I am not convinced that it is. Suppose you have a really big storage device. Basically, make a gigantic cylindrical storage medium the size of the Ringworld. (I think actually it'd have to be a ring of billions of small storage devices all in orbit like the particles that make up the rings of Saturn.) Further assume that info cannot travel faster than light speed. An algorithm that accesses data randomly is going to have to contend with the delays of this storage medium. It'd be similar to swap memory on a hard drive, a really slow hard drive that spins at 0.000002 rpm, none of this 5400 or 7200 rpm stuff. Sequential access could still be done in O(1) time, but not random access. The speed of this enormous storage device could be doubled by putting the computer (and read head) into a retrograde orbit. Fundamentally, the fastest possible random access time for a mega storage device scales as the cube root of its storage capacity. I've been thinking about this issue, and this article sounds like it raises a similar issue.
What do these considerations do to Complexity Theory?
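The cube-root claim above follows from geometry: if bits are packed at a fixed density in 3D space and signals are limited to light speed, worst-case access latency grows with the linear size of the device, i.e., with the cube root of capacity. A back-of-the-envelope sketch, with purely illustrative constants for density:

```python
# Back-of-the-envelope check of the cube-root-scaling claim: bits packed at
# fixed density in a 3D volume, signals limited to light speed. The density
# constant is an arbitrary illustrative assumption.
C_LIGHT = 3.0e8      # m/s
BITS_PER_M3 = 1e24   # assumed storage density

def min_random_access_latency(n_bits):
    """Worst-case one-way light delay to reach a bit in a cube of n_bits."""
    side = (n_bits / BITS_PER_M3) ** (1 / 3)  # cube edge length in meters
    return side / C_LIGHT                      # seconds

# Growing capacity 1000x grows the latency floor only 10x (cube-root scaling):
print(min_random_access_latency(1e33) / min_random_access_latency(1e30))  # ~10
```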
(Score: 5, Informative) by Ingar on Saturday January 03, @10:32AM (3 children)
It is an O(1) operation: even a random seek on that 5400 RPM drive has an upper bound on its execution time, and the maximal total execution time scales linearly with the number of random seeks. The issue Complexity Theory doesn't address is the fact that this particular operation is orders of magnitude slower and more energy intensive than trivial arithmetic operations.
Computation is cheap, data transfer is prohibitively expensive.
This isn't new; it has been well researched, and several mitigations are in place: your CPU has L1, L2, and L3 cache, your hard disk most likely has a DRAM buffer, and your OS caches disk content in main memory. Your GPU has its own memory because fetching data from RAM over PCIe just isn't fast enough.
It's why I buy a whole loaf of bread and store it in my kitchen, instead of going to the bakery every time I want a slice.
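The payoff of those caches can be put in one standard formula: average memory access time (AMAT) is the hit time plus the miss rate times the miss penalty. A minimal sketch, with made-up but order-of-magnitude-plausible latencies:

```python
# AMAT = hit_time + miss_rate * miss_penalty. The latencies below are
# illustrative assumptions, not measurements of any particular CPU.
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    """Average memory access time in nanoseconds."""
    return hit_time_ns + miss_rate * miss_penalty_ns

# 1 ns cache hit, 100 ns DRAM penalty: a 95% hit rate keeps the average low.
print(amat(1.0, 0.05, 100.0))  # 6.0 ns, vs. 100 ns with no cache at all
```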
Love is a three-edged sword: heart, soul, and reality.
(Score: 3, Insightful) by shrewdsheep on Saturday January 03, @02:07PM (1 child)
That being said, you do not account for the fact that the drive may fail and a backup has to be restored. Then the backup itself might have an error and a secondary backup might need to be tapped and so on. Access time is clearly O(N^alpha), N being storage size, where alpha depends on the failure probability.
The upshot is that O/o designations are quite arbitrary and depend strictly on spherical cow assumptions.
(Score: 2, Insightful) by khallow on Saturday January 03, @09:31PM
It's more like Log(N). You can add enough backups to keep the probability of an unrecoverable error below 1 for the entire lifespan of the storage system, should that be something you actually want.
(Score: 2) by bzipitidoo on Monday January 05, @08:00AM
Upper bound? You are saying that because the upper bound is a constant, the operation is O(1). That's wrong. If an Earth-orbit-sized (1-year orbit) data storage device seems fast enough that you can declare all storage operations to be O(1) time, how about a Planet 9-orbit-sized (~10,000-year orbit) storage device? Light takes about 16 minutes to cross Earth's orbit; crossing the much larger orbit of Planet 9 takes it roughly five days.
(Score: 4, Touché) by bart9h on Saturday January 03, @03:15PM (1 child)
Here I was thinking it was other things entirely, like lack of empathy, unchecked corporate greed, waste and planned obsolescence, and complete disregard for the environment of our one and only planet.
(Score: 1) by khallow on Saturday January 03, @09:36PM
Obvious rebuttal: no one has shown that empathy matters here; greed is quite checked, and modern societies have figured out how to harness it to positive ends; waste is a trade-off: you get to choose what you waste, but you don't get to choose not to waste; if planned obsolescence were important to us, we could deal with it by not buying the products in question; and we have plenty of regard for the environment of our present planet. Your refusal to acknowledge reality doesn't change reality.