Scientists create largest-ever simulation of the universe
Previously, simulating the universe at this scale was not possible due to the immense timescales involved and limitations in computational power.
Scientists used Frontier, the world's fastest supercomputer, housed at Oak Ridge National Laboratory, to create the largest astrophysical simulation of the universe ever made, as part of the ExaSky project. The simulation covers a volume exceeding 31 billion cubic megaparsecs, allowing researchers to model the expansion and evolution of the universe in unprecedented detail.
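For a rough sense of scale, a back-of-the-envelope sketch (assuming the quoted volume corresponds to a single cubic box, which the article does not specify) puts the implied side length at roughly 3,100 megaparsecs, on the order of 10 billion light-years:

```python
# Back-of-the-envelope check: side length of a cubic box with the quoted volume.
# Assumption (not stated in the article): the volume is a single cube of ~31.1e9 Mpc^3.
volume_mpc3 = 31.1e9                       # simulation volume in cubic megaparsecs (approximate)
side_mpc = volume_mpc3 ** (1 / 3)          # cube root gives the side length in Mpc
MLY_PER_MPC = 3.2616                       # million light-years per megaparsec
side_gly = side_mpc * MLY_PER_MPC / 1000   # convert to billions of light-years

print(f"Side length: {side_mpc:,.0f} Mpc = {side_gly:.1f} billion light-years")
# Side length: 3,145 Mpc = 10.3 billion light-years
```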
"There are two elements in the universe: dark matter, which interacts only through gravity, and ordinary matter, or atomic matter," project leader Salman Habib from Argonne National Laboratory explained. By incorporating both elements, the simulation provides a new foundation for studying and modeling ordinary and dark matter.
With Frontier's enormous computing power and updated algorithms, the researchers were able to include more objects and achieve higher physical realism in their model of the expanding universe.
"Until recently, we couldn't even imagine doing a large simulation like this except in approximation with gravity only," Habib stated. The new simulation overcomes previous limitations that required leaving out many variables, setting a new benchmark for cosmological hydrodynamics simulations.
The simulation allows scientists to study not only the effect of gravity but also other physical phenomena, which Habib refers to as the astrophysical "kitchen sink." Simulating both gravity and other physical processes, including the formation of stars, galaxies, and black holes, is essential for a comprehensive understanding of the universe.
The simulation's scale matches the sky surveys undertaken by large observatories, such as the Rubin Observatory in Chile, which capture billions of years of cosmic expansion.
"It's not only the sheer size of the physical domain, which is necessary to make direct comparison to modern survey observations enabled by exascale computing. It's also the added physical realism of including the baryons and all the other dynamic physics that makes this simulation a true tour de force for Frontier" said Bronson Messer, an astrophysicist at Oak Ridge National Laboratory and OLCF director of science.
The supercomputer code used in the simulation is called HACC, short for Hardware/Hybrid Accelerated Cosmology Code. HACC was developed around 15 years ago for petascale machines and has been continuously enhanced. As part of ExaSky, the HACC research team spent the last seven years adding new capabilities to the code and re-optimizing it to run on exascale machines powered by GPU accelerators.
A requirement of the Exascale Computing Project was for codes to run approximately 50 times faster than a reference run on Titan, the fastest supercomputer at the time of the project's launch. Running on the exascale-class Frontier, HACC was nearly 300 times faster than that reference run, a result that underscores how much these simulations depend on sophisticated mathematics and extremely powerful supercomputers.
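For context, a minimal sketch of the comparison implied by those figures (the 50x target and the roughly 300x result are the only numbers given here; the ratio below is simply arithmetic on them):

```python
# Compare the reported HACC speedup on Frontier against the ECP requirement.
# Both figures come from the article; everything else is simple arithmetic.
ecp_required_speedup = 50   # ECP target: ~50x faster than the Titan reference run
frontier_speedup = 300      # reported: nearly 300x faster than the reference run

margin = frontier_speedup / ecp_required_speedup
print(f"HACC exceeded the ECP requirement by a factor of ~{margin:.0f}")
# HACC exceeded the ECP requirement by a factor of ~6
```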
The results of the simulation will help cosmologists understand the evolution and physics of the universe, including insights into dark matter and other key processes. Although it will still take time before detailed analyses based on the simulation are published, the research team has created high expectations by sharing a small slice of the simulation. A video released by the ExaSky team shows the formation of galaxy clusters in a slice of space with a volume of 311,296 cubic megaparsecs, which constitutes only 0.001% of the total volume of the simulation.
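The quoted fraction checks out: assuming the full volume is about 31.1 billion cubic megaparsecs (the article says "exceeding 31 billion"), the released slice is roughly one hundred-thousandth of the run:

```python
# Verify that the released slice is ~0.001% of the full simulation volume.
slice_mpc3 = 311_296    # volume of the slice shown in the video (cubic Mpc)
total_mpc3 = 31.1e9     # assumed full volume, "exceeding 31 billion" cubic Mpc

fraction = slice_mpc3 / total_mpc3
print(f"Slice fraction: {fraction:.6%}")  # -> 0.001001%
```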
Simulations like ExaSky address a fundamental difficulty of observational cosmology: cosmic evolution plays out over timescales no telescope campaign can span. By observing distant objects, scientists see them as they were billions of years ago, which helps in reconstructing the evolution of the universe. Modeling the evolution of the cosmos allows scientists to manipulate variables, accelerate time, and gain a clearer view of how events in the cosmos unfolded.
"If we were to simulate a large chunk of the universe surveyed by one of the big telescopes such as the Rubin Observatory in Chile, you're talking about looking at huge chunks of time—billions of years of expansion," Habib said, according to Science Alert.
Prior to runs on Frontier, parameter scans for HACC were conducted on the Perlmutter supercomputer at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory. HACC was also run at scale on the exascale-class Aurora supercomputer at the Argonne Leadership Computing Facility.