https://spectrum.ieee.org/reversible-computing
Michael Frank has spent over three decades as an academic researcher working in a very peculiar niche of computer engineering. According to Frank, that peculiar niche’s time has finally come. “I decided earlier this year that it was the right time to try to commercialize this stuff,” Frank says. In July 2024, he left his position as a senior engineering scientist at Sandia National Laboratories to join a startup, U.S. and U.K.-based Vaire Computing.
Frank argues that it’s the right time to bring his life’s work—called reversible computing—out of academia and into the real world because the computing industry is running out of energy. “We keep getting closer and closer to the end of scaling energy efficiency in conventional chips,” Frank says. According to an IEEE semiconductor industry road map report Frank helped edit, by late in this decade the fundamental energy efficiency of conventional digital logic is going to plateau, and “it’s going to require more unconventional approaches like what we’re pursuing,” he says.
As Moore’s Law stumbles and its energy-themed cousin Koomey’s Law slows, a new paradigm might be necessary to meet the increasing computing demands of today’s world. According to Frank’s research at Sandia, in Albuquerque, reversible computing may offer up to a 4,000x energy-efficiency gain compared to traditional approaches.
“Moore’s Law has kind of collapsed, or it’s really slowed down,” says Erik DeBenedictis, founder of Zettaflops, who isn’t affiliated with Vaire. “Reversible computing is one of just a small number of options for reinvigorating Moore’s Law, or getting some additional improvements in energy efficiency.”
Vaire’s first prototype, expected to be fabricated in the first quarter of 2025, is less ambitious: a chip that, for the first time, recovers energy used in an arithmetic circuit. The next chip, projected to hit the market in 2027, will be an energy-saving processor specialized for AI inference. The 4,000x energy-efficiency improvement is on Vaire’s road map but probably 10 or 15 years out.
...
Intuitively, information may seem like an ephemeral, abstract concept. But in 1961, Rolf Landauer at IBM discovered a surprising fact: Erasing a bit of information in a computer necessarily costs energy, which is lost as heat. It occurred to Landauer that if you were to do computation without erasing any information, or “reversibly,” you could, at least theoretically, compute without using any energy at all.
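Landauer's bound is a concrete number: erasing one bit costs at least k·T·ln(2) joules, where k is the Boltzmann constant and T the temperature. A quick sketch in Python (the function name is illustrative) shows how small that floor is at room temperature:

```python
import math

# Landauer's principle: erasing one bit dissipates at least k * T * ln(2)
# joules of heat, where k is the Boltzmann constant and T the temperature.
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy in joules needed to erase one bit at temperature T."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the floor is roughly 2.9e-21 J per bit --
# tiny per bit, but real chips erase bits constantly, and conventional
# logic dissipates orders of magnitude more than this minimum anyway.
print(f"{landauer_limit(300.0):.3e} J per bit erased")
```

A computation that erases no information is not subject to this bound at all, which is the theoretical opening reversible computing exploits.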
Landauer himself considered the idea impractical. If you were to store every input and intermediate computation result, you would quickly fill up memory with unnecessary data. But Landauer’s successor, IBM’s Charles Bennett, discovered a workaround for this issue. Instead of just storing intermediate results in memory, you could reverse the computation, or “decompute,” once that result was no longer needed. This way, only the original inputs and final result need to be stored.
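Bennett's compute–copy–uncompute pattern can be sketched in a few lines. This toy model (names and structure are illustrative, not Vaire's design) uses only reversible XOR-assignments, so the forward steps can be replayed in reverse to return the scratch bit to zero without ever erasing it:

```python
# Toy illustration of Bennett's compute/copy/uncompute trick.
# Memory is a dict of named bits; the only operation is a reversible
# in-place XOR-assignment, which is its own inverse.

def cnot(mem: dict, ctrl: str, target: str) -> None:
    """Reversible XOR-assign: target ^= ctrl. Applying it twice undoes it."""
    mem[target] ^= mem[ctrl]

def compute_parity(mem: dict) -> list:
    """Forward pass: leaves a XOR b in the scratch bit; returns the steps."""
    steps = [("a", "scratch"), ("b", "scratch")]
    for ctrl, tgt in steps:
        cnot(mem, ctrl, tgt)
    return steps

mem = {"a": 1, "b": 0, "scratch": 0, "out": 0}
steps = compute_parity(mem)        # scratch now holds a XOR b
cnot(mem, "scratch", "out")        # copy the answer into the output bit
for ctrl, tgt in reversed(steps):  # decompute: scratch returns to 0
    cnot(mem, ctrl, tgt)

# Only the original inputs and the final result carry information now;
# the intermediate scratch bit was uncomputed, not erased.
print(mem)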
Take a simple example, such as the exclusive-OR, or XOR gate. Normally, the gate is not reversible—there are two inputs and only one output, and knowing the output doesn’t give you complete information about what the inputs were. The same computation can be done reversibly by adding an extra output, a copy of one of the original inputs. Then, using the two outputs, the original inputs can be recovered in a decomputation step.
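The reversible XOR described above (known in the literature as a CNOT, or controlled-NOT, gate) is easy to verify exhaustively. A minimal sketch, with illustrative function names:

```python
# A reversible XOR gate: alongside the XOR result, pass through a copy
# of one input. The two outputs then determine the inputs uniquely,
# so a decomputation step can recover them exactly.

def xor_forward(a: int, b: int) -> tuple:
    """Returns (a, a XOR b); the extra output makes the gate reversible."""
    return a, a ^ b

def xor_decompute(a: int, result: int) -> tuple:
    """Recovers the original inputs, since a XOR (a XOR b) == b."""
    return a, a ^ result

# Check all four input combinations round-trip exactly.
for a in (0, 1):
    for b in (0, 1):
        assert xor_decompute(*xor_forward(a, b)) == (a, b)
print("all four input pairs recovered")
```

Note that the gate is its own inverse: running `xor_forward` twice returns the original pair, which is exactly the decomputation step Bennett described.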