A Supercomputer Just Created the Largest Universe Simulation Ever [gizmodo.com]:
Last month, a team of researchers put the then-fastest supercomputer in the world to work on a rather large quandary: the nature of the universe’s atomic and dark matter.
The supercomputer is called Frontier; a team of researchers recently used it to run the largest astrophysical simulation of the universe yet. The simulation covers a volume comparable to surveys taken by large telescope observatories, a scale that had not been possible until now. The calculations undergirding the simulations provide a new foundation for cosmological simulations of the universe’s matter content, from everything we see to the invisible stuff that interacts with ordinary matter only gravitationally.
What exactly did the Frontier supercomputer calculate?
Frontier is an exascale-class supercomputer, capable of running a quintillion (one billion-billion) calculations per second. In other words, a juiced machine worthy of the vast undertaking that is simulating the physics and evolution of both the known and unknown universe.
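To put that number in perspective, here is a back-of-the-envelope comparison. The laptop figure of roughly 100 billion operations per second is an illustrative assumption, not a benchmark:

```python
# Back-of-the-envelope sense of "exascale" (illustrative figures only).
EXA_OPS = 1e18     # one quintillion operations: Frontier's per-second rate
LAPTOP_OPS = 1e11  # assumed ~100 billion ops/sec for a typical laptop

seconds = EXA_OPS / LAPTOP_OPS  # laptop time to match one Frontier-second
days = seconds / 86_400

print(f"One Frontier-second ~= {seconds:,.0f} laptop-seconds (~{days:.0f} days)")
```

Under those assumptions, a single second of Frontier’s output would keep an ordinary laptop busy for roughly four months.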
“If we want to know what the universe is up to, we need to simulate both of these things: gravity as well as all the other physics including hot gas, and the formation of stars, black holes and galaxies,” said Salman Habib, the division director for computational sciences at Argonne National Laboratory, in an Oak Ridge National Laboratory release [ornl.gov]. “The astrophysical ‘kitchen sink’ so to speak.”
The matter we know about—the stuff we can see, from black holes to molecular clouds to planets and moons—only accounts for about 5% of the universe’s content, according to CERN [home.cern]. A more sizable chunk of the universe is inferred only from the gravitational effects it appears to have on visible (or atomic) matter. That invisible chunk is called dark matter, a catch-all term for the particles and objects that could be responsible for about 27% of the universe [home.cern]. The remaining 68% of the universe’s makeup is attributed to dark energy, which drives the accelerating rate of the universe’s expansion.
How does Frontier change our understanding of the universe?
“If we were to simulate a large chunk of the universe surveyed by one of the big telescopes such as the Rubin Observatory in Chile, you’re talking about looking at huge chunks of time — billions of years of expansion,” Habib said. “Until recently, we couldn’t even imagine doing such a large simulation like that except in the gravity-only approximation.”
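For readers curious what a “gravity-only approximation” looks like in practice, here is a minimal sketch of a direct-summation N-body step. To be clear, this is not HACC’s method: production cosmology codes use far more scalable tree and particle-mesh algorithms. The toy version below just illustrates the core idea that each particle responds only to the gravitational pull of the others, with no gas, star, or black-hole physics:

```python
import numpy as np

G = 1.0           # gravitational constant in code units (assumption)
SOFTENING = 1e-2  # softening length to avoid singular forces (assumption)

def accelerations(pos, mass):
    """Pairwise gravitational accelerations via direct O(N^2) summation."""
    diff = pos[None, :, :] - pos[:, None, :]      # (N, N, 3) separations
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2  # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                 # no self-interaction
    return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step, the standard N-body integrator."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)  # half kick
    pos = pos + dt * vel                             # drift
    vel = vel + 0.5 * dt * accelerations(pos, mass)  # half kick
    return pos, vel

# Tiny demo: 100 random particles evolving under their own gravity.
rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, (100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 1.0 / 100)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)

# Pairwise forces cancel, so net momentum should stay near zero.
print("net momentum after 10 steps:", np.abs((vel * mass[:, None]).sum(axis=0)))
```

Direct summation costs O(N²) per step, which is why survey-scale runs with trillions of particles rely on approximate but accurate methods, and on machines like Frontier.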
In the top graphic, the left image shows the evolution of the expanding universe over billions of years in a region containing a cluster of galaxies, and the right image shows the formation and movement of galaxies over time in one section of that image.
“It’s not only the sheer size of the physical domain, which is necessary to make direct comparison to modern survey observations enabled by exascale computing,” said Bronson Messer, director of science for the Oak Ridge Leadership Computing Facility, in a laboratory release [ornl.gov]. “It’s also the added physical realism of including the baryons and all the other dynamic physics that makes this simulation a true tour de force for Frontier.”
Frontier is no longer the fastest supercomputer in the world
Frontier is one of several exascale supercomputers used by the Department of Energy, and comprises [ornl.gov] more than 9,400 CPUs and over 37,000 GPUs. It resides at Oak Ridge National Laboratory, though the recent simulations were run by Argonne researchers.
The Frontier results were possible thanks to the simulation code it ran, the Hardware/Hybrid Accelerated Cosmology Code (HACC). The fifteen-year-old code was updated as part of the DOE’s $1.8 billion, eight-year Exascale Computing Project [exascaleproject.org], which concluded this year.
The simulations’ results were announced last month, when Frontier was still the fastest supercomputer in the world. But shortly after, Frontier was eclipsed by the El Capitan supercomputer as the world’s fastest. El Capitan is verified at 1.742 quintillion calculations per second, with a total peak performance of 2.79 quintillion calculations per second, according to a Lawrence Livermore National Laboratory release [llnl.gov].