Scientists simulated millions of universes on a supercomputer to study galaxy formation

Shawn Knight

Through the looking glass: To better understand how galaxies actually form, we need to be able to observe their formation and growth over billions of years. In lieu of a time machine, scientists from the University of Arizona are using a supercomputer to simulate millions of universes and galaxies.

Scientists have studied the formation of galaxies using cutting-edge technology for decades, yet we’ve still barely scratched the surface of understanding exactly how these massive bodies grow, evolve and behave.

It’s an issue of scale and time, really: they’re so enormously large and have been around for so long that our observations only provide a snapshot in time.

Peter Behroozi, an assistant professor at the UA Steward Observatory, said simulating a single galaxy requires on the order of 10^48 computing operations. “All computers on Earth combined could not do this in a hundred years. So to just simulate a single galaxy, let alone 12 million, we had to do this differently,” he said.
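To put that 10^48 figure in perspective, here is a rough back-of-the-envelope check in Python. The 10^21 operations-per-second estimate for the combined throughput of all computers on Earth is an illustrative assumption, not a number from the article or the paper.

# Rough feasibility check for the ~10^48 operations quoted above.
OPS_PER_GALAXY = 1e48        # operations to simulate one galaxy (figure quoted in the article)
WORLD_OPS_PER_SEC = 1e21     # ASSUMPTION: combined throughput of all computers on Earth
SECONDS_PER_YEAR = 3.15e7

years_needed = OPS_PER_GALAXY / WORLD_OPS_PER_SEC / SECONDS_PER_YEAR
print(f"Years to simulate one galaxy directly: {years_needed:.2e}")
# Roughly 3e19 years, vastly longer than the age of the universe,
# which is why brute-force simulation of even one galaxy is out of reach.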

As such, each simulated universe was coded to obey different physical theories on how galaxies should form. Over a three-week period, the “Ocelote” supercomputer at the UA High Performance Computing cluster crunched numbers, processing data on more than eight million simulated universes and 12 million galaxies spanning from about 400 million years after the Big Bang to present day.
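The approach is essentially a giant parameter sweep: each mock universe is generated under a different set of galaxy-formation rules, scored against real observations, and the rule sets that best reproduce the observed galaxy population survive. The toy Python sketch below only illustrates that workflow; the parameter names, the halo model and the scoring function are invented for illustration and are not taken from the actual UniverseMachine code.

import numpy as np

rng = np.random.default_rng(42)

def mock_universe(efficiency, quenching_mass, n_halos=10_000):
    """Toy model: assign star formation rates to dark matter halos
    under one hypothetical set of galaxy-formation rules."""
    halo_mass = 10 ** rng.uniform(10, 14, n_halos)   # halo masses in solar masses
    sfr = efficiency * halo_mass / 1e12              # toy star formation rate
    sfr[halo_mass > quenching_mass] *= 0.01          # "quench" the most massive halos
    return halo_mass, sfr

def score(sfr, observed_mean_sfr=1.0):
    """Toy goodness-of-fit: distance between the mock mean SFR and an 'observed' value."""
    return abs(np.mean(sfr) - observed_mean_sfr)

# Sweep over many hypothetical rule sets and keep the best-fitting one.
best = None
for efficiency in np.linspace(0.01, 0.2, 20):
    for quenching_mass in np.logspace(11, 13, 20):
        _, sfr = mock_universe(efficiency, quenching_mass)
        s = score(sfr)
        if best is None or s < best[0]:
            best = (s, efficiency, quenching_mass)

print(f"Best toy parameters: efficiency={best[1]:.3f}, "
      f"quenching_mass={best[2]:.2e} Msun (score={best[0]:.3f})")

In the real analysis, the mock universes are compared against observed galaxy properties across cosmic time, over millions of candidate rule sets rather than the few hundred tried in this sketch.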

The results have helped scientists understand why galaxies cease to form new stars even when they have plenty of hydrogen gas to do so.

“As we go back earlier and earlier in the universe, we would expect the dark matter to be denser, and therefore the gas to be getting hotter and hotter. This is bad for star formation, so we had thought that many galaxies in the early universe should have stopped forming stars a long time ago,” Behroozi said. “But we found the opposite: galaxies of a given size were more likely to form stars at a higher rate, contrary to the expectation.”

The team’s paper on the matter, “UniverseMachine: The Correlation between Galaxy Growth and Dark Matter Halo Assembly from z = 0-10,” is available online should you want to dig deeper.

Masthead credit: spiral galaxy by Alex Mit. Star trails by Christina Siow.


 
Crazy how far astronomy has come in such a short time. The Hubble telescope and space probes like Voyager, Dawn and Juno have a lot to do with that.

I was reading the other day that there are an estimated 100 billion billion stars (suns) in the universe, most with planets orbiting them. And that's considered a low estimate; the actual number is likely significantly higher.

The scale involved is really incomprehensible.
 
We do not have a supercomputer powerful enough to simulate space-time formations within our own galaxy, never mind 10^11 galaxies in our universe. And those geniuses managed to pull it off for millions of universes? That must have been the latest CASIO calculator, and a ton of amphetamine.
 
"we need to be able to observe their formation and growth over billions of years"

We won't be here for that long.

If you believe that everything came from nothing, based on non-observable evidence, that requires a lot of faith. You have the religion of uniformitarianism - that everything has always been as it currently is. Without it, this entire belief system falls apart. Again, that belief is based on assumptions.
 
Multiple supercomputer systems were actually used in the research, as detailed in the paper:
This research used the Edison and Cori supercomputers of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. An allocation of computer time from the UA Research Computing High Performance Computing (HPC) at the University of Arizona is gratefully acknowledged. This work performed (in part) at SLAC under DOE Contract DE-AC02-76SF00515. The Bolshoi–Planck simulation was performed by Anatoly Klypin within the Bolshoi project of the University of California High-Performance AstroComputing Center (UC-HiPACC; PI Joel Primack). Resources supporting this work were provided by the NASA High-End Computing (HEC) Program through the NASA Advanced Supercomputing (NAS) Division at Ames Research Center. The MultiDark–Planck2 simulation was performed by Gustavo Yepes on the SuperMUC supercomputer at LRZ (Leibniz-Rechenzentrum) using time granted by PRACE, project number 012060963 (PI Stefan Gottloeber).
The specifications of the Ocelote machine can be found here:

https://public.confluence.arizona.edu/display/UAHPC/Compute+Resources

https://lenovopress.com/lp0094-nextscale-nx360-m5-e5-2600-v4
 
Multiple supercomputer systems were actually used in the research, as detailed in the paper:

Amazing that they're "using" that much processing power and still can't fix their own equations, identify "dark matter", unify gravity and charge, or solve any other problems since 1925 in Copenhagen when physics died an inglorious death.

These guys can't even tell us what makes up 95% of the mass in the universe. A few years back, NASA even admitted they were more than 15% off in ALL of their distance measurements. Just pathetic.
 