One of the biggest problems with computers, dating to the invention of the first one, has been finding ways to keep them cool so that they don't overheat or shut down.
Instead of combating the heat, two University of Nebraska-Lincoln engineers have embraced it as an alternative energy source that would allow computing at ultra-high temperatures.
Sidy Ndao, assistant professor of mechanical and materials engineering, said his research group's development of a nano-thermal-mechanical device, or thermal diode, came after flipping around the question of how to better cool computers.
"If you think about it, whatever you do with electricity you should (also) be able to do with heat, because they are similar in many ways," Ndao said. "In principle, they are both energy carriers. If you could control heat, you could use it to do computing and avoid the problem of overheating."
They documented their device working in temperatures up to 630 degrees Fahrenheit (332 Celsius).
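A quick sanity check of that temperature conversion, plus a toy illustration of the underlying idea that a working diode is enough to build logic (a minimal Python sketch assuming idealized rectifier behavior; the 1/0 "hot"/"cold" encoding and the diode-resistor OR construction are illustrative, not the researchers' actual thermal device model):

    def fahrenheit_to_celsius(f):
        """Convert Fahrenheit to Celsius."""
        return (f - 32) * 5 / 9

    # ~332, matching the figure quoted in the summary
    print(round(fahrenheit_to_celsius(630)))

    def ideal_diode(x):
        """Pass the signal one way, block it the other (idealized rectifier)."""
        return x if x > 0 else 0

    def diode_or(a, b):
        """Classic diode-resistor OR gate: the output follows the higher input."""
        return max(ideal_diode(a), ideal_diode(b))

    # With 1 = "hot"/high and 0 = "cold"/low, this reproduces the OR truth table:
    for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(a, b, "->", diode_or(a, b))

The point of the sketch is the same one Ndao makes: once you have a reliable one-way valve for an energy carrier (electrical or thermal), simple logic gates follow from wiring valves together.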
(Score: 2) by Spamalope on Thursday April 20 2017, @05:26PM (1 child)
If a niche use case can pay to develop the tech into something inexpensive, strap it onto an exhaust, boiler, or chimney to make use of otherwise wasted heat.
If it can simply be used to make extremely heat-tolerant circuits, it'll have applications for active sensors in hot areas. A self-powered Wi-Fi monitor in a power plant steam pipe might make an improved flow-rate or pressure sensor, or get a sensor somewhere important where current sensors have practical issues. Or it could sit in a nuke plant and not require external power, so it'll keep working in an emergency, giving it a safety advantage.
(Score: 2) by Scruffy Beard 2 on Thursday April 20 2017, @09:43PM
More likely, the high temperatures allow the waste heat to be used for other things.
For some reason, 3 of the articles I checked about hot water cooling used JavaScript to blank/dim the page. No links, then.