New universe simulation contains 60 trillion particles, the most ever



Today, the greatest mysteries facing astronomers and cosmologists concern the roles that gravitational attraction and cosmic expansion play in the evolution of the Universe. To solve these mysteries, astronomers and cosmologists take a two-pronged approach: directly observing the cosmos to see these forces at work, while searching for theoretical explanations for the observed behavior – such as dark matter and dark energy.

Bridging these two approaches, scientists model cosmic evolution with computer simulations to see whether observations match theoretical predictions. The latest is AbacusSummit, a simulation suite created by the Flatiron Institute’s Center for Computational Astrophysics (CCA) and the Harvard-Smithsonian Center for Astrophysics (CfA). Capable of processing nearly 60 trillion particles, this suite is the largest cosmological simulation ever made.

The creators of AbacusSummit announced the simulation suite in a series of papers in the Monthly Notices of the Royal Astronomical Society (MNRAS). Composed of over 160 simulations, it models how particles in a box-shaped environment behave under gravitational attraction. These models, known as N-body simulations, are intrinsic to modeling how dark matter interacts with baryonic (i.e., “visible”) matter.

The simulated distribution of dark matter in galaxies. Credit: Brinckmann et al.

Development of the AbacusSummit simulation suite was led by Lehman Garrison (a CCA researcher) and Nina Maksimova and Daniel Eisenstein, a graduate student and professor of astronomy at CfA (respectively). The simulations were run on the Summit supercomputer at the Oak Ridge Leadership Computing Facility (ORLCF) in Tennessee – overseen by the US Department of Energy (DoE).

N-body calculations, which involve computing the gravitational interactions of planets and other objects, are among the greatest challenges facing astrophysicists today. Part of what makes them so daunting is that each object interacts with every other object, regardless of distance – the more objects under consideration, the more interactions must be computed.
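The quadratic scaling described above is simple arithmetic: N bodies form N(N−1)/2 distinct pairs, so the work of a direct force calculation explodes with particle count. A quick illustration:

```python
# Pairwise gravitational interactions grow quadratically with particle count:
# every particle feels every other, so N bodies give N*(N-1)/2 distinct pairs.
def pair_count(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 1_000, 1_000_000):
    print(f"{n:>9} particles -> {pair_count(n):>18} pairwise interactions")
```

At a million particles there are already about half a trillion pairs, which is why practical codes approximate distant interactions rather than summing them all directly.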

To this day, there is no general analytic solution for N-body problems involving three or more massive bodies, and the available calculations are only approximations. For example, the mathematics for computing the interactions of three bodies, such as a binary star system and a planet (the famous “three-body problem”), has never been solved in closed form. A common approach in cosmological simulations is to stop the clock, calculate the total force acting on each object, advance time by a small step, and repeat.
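The “stop the clock, compute the forces, advance time” loop can be sketched as a direct-summation integrator. This is a minimal textbook illustration only – the unit system, softening length, and kick-drift-kick scheme here are generic choices, not the Abacus method:

```python
import numpy as np

G = 1.0           # gravitational constant in simulation units (assumed)
SOFTENING = 1e-2  # softening length to avoid singular forces at zero separation

def accelerations(pos, mass):
    """Direct-sum gravitational acceleration on every particle (O(N^2))."""
    # Pairwise displacement vectors: d[i, j] = pos[j] - pos[i]
    d = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    r2 = (d ** 2).sum(axis=-1) + SOFTENING ** 2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)  # no self-interaction
    return G * (d * inv_r3[..., np.newaxis]
                  * mass[np.newaxis, :, np.newaxis]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick step: freeze time, compute forces, advance."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)   # half kick
    pos = pos + dt * vel                              # drift
    vel = vel + 0.5 * dt * accelerations(pos, mass)   # half kick
    return pos, vel

# Tiny demo: three bodies stepped forward in time.
rng = np.random.default_rng(0)
pos = rng.standard_normal((3, 3))
vel = np.zeros((3, 3))
mass = np.ones(3)
for _ in range(100):
    pos, vel = leapfrog_step(pos, vel, mass, dt=1e-3)
```

Because the pairwise forces are equal and opposite, this scheme conserves total momentum to floating-point precision, which is a useful sanity check on any N-body integrator.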

For their research (led by Maksimova), the team designed their codebase (called Abacus) to take advantage of Summit’s parallel processing power, whereby multiple calculations can be performed simultaneously. They also relied on machine-learning algorithms and a new numerical method, which allowed them to compute 70 million particle updates per node per second at early times and 45 million particle updates per node per second at late times.

A snapshot of one of the AbacusSummit simulations, shown at different zoom scales: 10 billion light-years across, 1.2 billion light-years across, and 100 million light-years across. Credit: The AbacusSummit Team / layout by Lucy Reading-Ikkanda / Simons Foundation

As Garrison explained in a recent CCA press release:

“This suite is so large that it probably contains more particles than all the other N-body simulations that have ever been run combined – although that’s a difficult claim to confirm. Galaxy surveys provide extremely detailed maps of the Universe, and we need equally ambitious simulations that cover a wide range of possible universes in which we could live.

“AbacusSummit is the first suite of such simulations that has the scale and fidelity to compare to these amazing observations… Our vision was to create this code to provide the simulations that are needed for this particular new brand of galaxy survey. We wrote the code to do the simulations much faster and with much more precision than ever before.”

In addition to the usual challenges, running full N-body simulations requires carefully designed algorithms because of the enormous memory demands involved. This meant Abacus could not simply make copies of the simulation for different supercomputer nodes; instead, it divides each simulation into a grid. Approximate calculations are made for distant particles, which play a lesser role than nearby particles.

It then divides the nearby particles into multiple cells so the computer can work on each cell independently, combining the results of each with the distant-particle approximation. The research team found that this approach – uniform divisions – makes better use of parallel processing and allows much of the distant-particle approximation to be calculated before the simulation even begins.
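The grid decomposition above can be sketched in a few lines. Everything here is hypothetical and greatly simplified – Abacus’s actual far-field solver uses a high-order multipole method – but it shows the basic idea of binning particles into uniform cells and partitioning them into a “near” set (treated exactly) and a “far” set (approximated):

```python
import numpy as np

def assign_to_cells(pos, box_size, ncells):
    """Bin particles into a uniform grid; return a flat cell index per particle."""
    idx3 = np.floor(pos / box_size * ncells).astype(int) % ncells
    return idx3[:, 0] * ncells * ncells + idx3[:, 1] * ncells + idx3[:, 2]

def split_near_far(cell_of, target_cell, neighbor_cells):
    """Particles in the target cell or its neighborhood get exact forces;
    everything else is handled by a cheap far-field approximation."""
    near = np.isin(cell_of, list(neighbor_cells) + [target_cell])
    return near, ~near

rng = np.random.default_rng(1)
pos = rng.random((10_000, 3))             # positions in a unit box
cell_of = assign_to_cells(pos, 1.0, 8)    # 8^3 = 512 cells
# Hypothetical neighbor set for cell 0 (a real code would enumerate the
# 26 adjacent cells with periodic wrapping):
near, far = split_near_far(cell_of, 0, {1, 8, 64})
```

Because the cells are uniform rather than adapted to the particle distribution, each one can be assigned to a compute node up front, and the far-field work can be scheduled before the simulation starts.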

Parallel computer processing of Abacus, visualized. Credit: Lucy Reading-Ikkanda / Simons Foundation

This is a significant improvement over other N-body codebases, which divide simulations unevenly based on the particle distribution. Thanks to its design, Abacus can update 70 million particles per node per second (where each particle represents a clump of dark matter with three billion solar masses). It can also analyze a running simulation and look for patches of dark matter that indicate the presence of bright, star-forming galaxies.
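Searching a snapshot for dense clumps of dark matter is, at its simplest, a grouping problem. The sketch below uses a naive friends-of-friends grouping, a standard halo-finding idea in cosmology; it is a hypothetical O(N²) illustration, not the on-the-fly halo finder Abacus actually implements:

```python
import numpy as np

def friends_of_friends(pos, linking_length):
    """Naive friends-of-friends: particles closer than the linking length
    end up in the same group (union-find over all pairs)."""
    n = len(pos)
    group = np.arange(n)  # union-find parent array

    def find(i):
        while group[i] != i:
            group[i] = group[group[i]]  # path halving
            i = group[i]
        return i

    d2 = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
    for i in range(n):
        for j in range(i + 1, n):
            if d2[i, j] < linking_length ** 2:
                group[find(i)] = find(j)
    return np.array([find(i) for i in range(n)])

# Two tight clumps plus one isolated particle:
pts = np.array([[0.0, 0, 0], [0.1, 0, 0],
                [5.0, 0, 0], [5.1, 0, 0],
                [10.0, 0, 0]])
labels = friends_of_friends(pts, linking_length=0.5)
```

Production halo finders replace the all-pairs distance check with spatial trees or the same cell structure used for the force calculation, which is what makes on-the-fly analysis of a running simulation feasible.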

These and other cosmological objects will be the subject of future surveys that will map the cosmos in unprecedented detail. These include the Dark Energy Spectroscopic Instrument (DESI), the Nancy Grace Roman Space Telescope, and ESA’s Euclid spacecraft. One goal of these big-budget missions is to improve estimates of the cosmic and astrophysical parameters that determine the behavior and appearance of the Universe.

This, in turn, will enable more detailed simulations that use updated values for various parameters, such as dark energy. Daniel J. Eisenstein, a researcher at CfA and co-author of the papers, is also a member of the DESI collaboration. He and others like him are eagerly awaiting what Abacus can do for these cosmological studies in the years to come.

“Cosmology is taking a leap forward thanks to the multidisciplinary fusion of spectacular observations and advanced computing,” he said. “The coming decade promises to be a wonderful time in our study of the historical sweep of the universe.”

Further reading: Simons Foundation, MNRAS

