Synthetic universes: How simulations will help search for dark energy

The Dark Energy Survey is one of the most ambitious astrophysics experiments ever launched. For five years, a custom-designed camera mounted on a telescope in Chile will collect images of distant galaxies in the southern sky over an area of 5,000 square degrees, roughly one-eighth of the sky.

That project will generate petabytes (thousands of terabytes) of data that must be painstakingly analyzed by a collaboration of scientists from 27 institutions to find answers about the nature of dark energy, dark matter and the forces that shape the evolution of the universe.

But analyzing the real data collected by that camera is only a fraction of the work in store for the Dark Energy Survey team. As part of the survey’s Simulation Working Group, Andrey Kravtsov and Matthew Becker of the University of Chicago (in collaboration with researchers at Stanford University and the University of Michigan) are building and running complex computer simulations modeling the evolution of the matter distribution in the universe.

By the end of the project, these simulations may increase the data analysis demands of the survey by as much as a hundredfold. Why is such a large investment of time and effort in simulations needed? Accuracy, Kravtsov said.

“Essentially, for achieving the scientific goals of the survey and interpreting results, you need computer simulations to tell you how to interpret the data,” said Kravtsov, a Computation Institute fellow and professor of astronomy and astrophysics. “This kind of procedure is fairly common in particle physics experiments, but I don’t think it has ever been done in astrophysical experiments. But it has to be done with an experiment on this scale.”

Dark matter and dark energy have never been directly observed, and yet astronomers estimate that these two components make up as much as 96 percent of the matter and energy content of the universe.

The existence of dark energy was proposed to explain the famous 1998 discovery that the expansion of the universe is accelerating — a great surprise to astrophysicists, since Einstein’s theory of general relativity predicts that universal expansion should be slowing due to the gravitational pull of the universe’s mass. In current models, dark energy effectively acts as an anti-gravity force that overwhelms the effect of gravity, driving the observed acceleration of expansion.
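The reasoning behind that statement follows from the standard Friedmann acceleration equation of general relativity, reproduced below as a quick reference; this is textbook cosmology rather than anything specific to the Dark Energy Survey.

```latex
% Friedmann acceleration equation: \ddot{a}/a is the acceleration of the
% cosmic scale factor, \rho the energy density and p the pressure of the
% universe's contents.
\[
  \frac{\ddot{a}}{a} \;=\; -\,\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right)
\]
% For ordinary matter (p \approx 0) the right-hand side is negative, so the
% expansion decelerates; a dark-energy component with strongly negative
% pressure (p \approx -\rho c^2) flips the sign and drives acceleration.
```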

Given that telescopes cannot directly observe dark matter or measure dark energy, astrophysicists must rely on indirect measurements, which are quite difficult.

Enter the Dark Energy Survey, which achieved “first light” last month and will use the high-quality images of the sky taken by the Fermilab-designed 570-megapixel Dark Energy Camera (DECam) to probe dark energy using four different methods.

In some ways, collecting the data is only the beginning. It’s how that data is analyzed and interpreted that will really change what’s known about the fundamental parameters that shaped the universe. So in order to make sure those data analysis methods are accurate, the Simulation Working Group will work with the collaboration to test them via a series of blind cosmology challenges.

“What we want to do is give people who will be analyzing the actual data a synthetic data set based on simulations,” Kravtsov said. “We’ll give them synthetic observational data that is generated based on a hypothetical synthetic universe with parameters that we know, but they don’t know. All they see is the distribution of galaxies on the sky and in space, as would be observed in the real survey.”

Think of the board game Clue, where the truth (the murderer, the murder weapon, and the crime scene) is hidden in a manila envelope at the start of the game. The players then use educated guesses and logic to narrow down the possibilities and seek the correct answer. In the blind cosmology challenge, Dark Energy Survey scientists won’t know the parameters that went into creating the synthetic data they are analyzing, and will use their tools and algorithms to try to figure them out.

“The idea is that they’ll apply their analysis techniques, and we’ll see whether they actually recover the true parameters,” Kravtsov continued. “If they do, it means that the techniques they will use on the real data are robust and trustworthy. If they don’t, we’ll have to go back to the drawing board and figure out where the problem is, and repeat that process.”
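To make the logic of the blind test concrete, here is a minimal Python sketch of the hide-the-truth, recover-the-parameters loop. Everything in it is a toy stand-in: the function names (make_synthetic_data, estimate_parameters), the parameter values and the simple statistics are illustrative assumptions, not part of the actual DES pipeline.

```python
# Toy blind parameter-recovery test, illustrating the idea of the blind
# cosmology challenge. All names and numbers here are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=42)

# "Truth" parameters, known only to the simulation team (kept sealed).
TRUE_PARAMS = {"omega_m": 0.27, "sigma_8": 0.80}

def make_synthetic_data(params, n_points=1000):
    """Generate a toy 'observable': noisy draws whose mean and spread
    stand in for cosmological summary statistics."""
    return rng.normal(loc=params["omega_m"],
                      scale=params["sigma_8"] * 0.1,
                      size=n_points)

def estimate_parameters(data):
    """The analysts' side: recover parameters from the data alone,
    without access to TRUE_PARAMS."""
    return {"omega_m": float(np.mean(data)),
            "sigma_8": float(np.std(data) / 0.1)}

# The simulation team releases only the synthetic data set.
synthetic_data = make_synthetic_data(TRUE_PARAMS)

# Analysts run their pipeline blind; results are then compared to the truth.
recovered = estimate_parameters(synthetic_data)
for name, truth in TRUE_PARAMS.items():
    print(f"{name}: true={truth:.3f}  recovered={recovered[name]:.3f}")
```

In the real challenge the synthetic data are full galaxy catalogs, and the estimators are complete cosmological analysis pipelines rather than a mean and a standard deviation.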

Suffice it to say, simulating the evolution of matter in a universe is a little more complicated than shuffling some Clue cards. Each simulation starts with a universe 100 million years after the big bang, then tracks the motion of billions of particles through billions of years, forming large-scale structures akin to the cosmic web observed in the real universe.
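For a flavor of what tracking particle motions involves computationally, here is a heavily simplified direct-summation N-body sketch in Python with leapfrog time integration. The real simulations evolve billions of particles in an expanding universe with far more efficient gravity solvers; the particle count, units and parameters below are purely illustrative assumptions.

```python
# Minimal direct-summation N-body sketch with leapfrog (kick-drift-kick)
# integration and a softening length. Illustrative only; not the DES code.
import numpy as np

G = 1.0           # gravitational constant in code units (assumption)
SOFTENING = 0.05  # softening length to avoid singular forces (assumption)

def accelerations(pos, mass):
    """Pairwise gravitational accelerations via direct summation."""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]    # (N, N, 3)
    dist2 = np.sum(diff**2, axis=-1) + SOFTENING**2
    inv_d3 = dist2**-1.5
    np.fill_diagonal(inv_d3, 0.0)                            # no self-force
    return G * np.einsum('ij,ijk,j->ik', inv_d3, diff, mass)

def leapfrog(pos, vel, mass, dt, n_steps):
    """Advance positions and velocities with kick-drift-kick steps."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

# Tiny random 'universe' of 200 equal-mass particles, evolved briefly.
rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, size=(200, 3))
vel = np.zeros((200, 3))
mass = np.full(200, 1.0 / 200)
pos, vel = leapfrog(pos, vel, mass, dt=0.01, n_steps=100)
print("final particle spread:", pos.std(axis=0))
```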

Additional code is required to model how an observer sees that synthetic sky from a particular position within the universe, simulating how photons emitted by a distant star pinball through the universe, deflected by mass concentrations within the large-scale structures. The technical aspects and potential biases of the DECam itself must also be simulated, so that the “images” collected in the synthetic universe are as close to the real data as possible.
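The basic deflection being modeled is the classic general-relativistic bending of light: a photon passing a mass M at impact parameter b is nudged by a small angle, and a simulated photon path accumulates many such nudges from the structures it passes. The formula below is the standard textbook result, not the survey’s actual ray-tracing code.

```latex
% Deflection angle for light passing a point mass M at impact parameter b
% (small-angle, general-relativistic result):
\[
  \alpha \;=\; \frac{4 G M}{c^{2} b}
\]
```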

For the simulations, Becker, a graduate student in the physics department, implemented an algorithm to simulate a particularly useful distortion called weak gravitational lensing. As light travels from distant galaxies to the telescope on Earth, its path is bent by gravity as it passes large mass concentrations within the cosmic web, so that the image received by the telescope may be subtly stretched or magnified.

The Dark Energy Survey will use this lensing effect in the real data to detect the presence of clumps of dark matter that can’t be directly observed, so the effect needed to be present in the synthetic data as well.

“Weak gravitational lensing is really important for the DES and any survey like it,” Becker said, “because it’s sensitive to the actual mass distribution, which carries a huge amount of information about how large-scale structure in the universe grew in the past.”
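As a rough sketch of the quantity such a calculation produces, the Python snippet below integrates a toy density field along one line of sight against the standard lensing-efficiency kernel to get a convergence, a measure of how much a background image is magnified. The cosmological numbers, the distance-to-scale-factor proxy and the density “clump” are all assumptions for illustration; this is not Becker’s actual ray-tracing algorithm.

```python
# Schematic weak-lensing convergence along one line of sight (Born-style
# weighted sum of density contrast). Illustrative sketch only.
import numpy as np

OMEGA_M = 0.27        # matter density parameter (assumption)
H0 = 1.0 / 3000.0     # Hubble constant per Mpc in units where c = 1 (assumption)

def convergence(delta, chi_source, n_steps=500):
    """Sum the density contrast delta(chi) weighted by the lensing kernel."""
    chi = np.linspace(0.0, chi_source, n_steps)
    dchi = chi[1] - chi[0]
    a = 1.0 / (1.0 + chi / 3000.0)   # crude distance-to-scale-factor proxy (assumption)
    kernel = 1.5 * OMEGA_M * H0**2 * chi * (chi_source - chi) / chi_source / a
    return float(np.sum(kernel * delta(chi)) * dchi)

# A toy overdensity 'clump' halfway along a 2,000 Mpc line of sight.
def delta_clump(chi):
    return 5.0 * np.exp(-((chi - 1000.0) / 50.0) ** 2)

kappa = convergence(delta_clump, chi_source=2000.0)
print(f"convergence along this line of sight: kappa = {kappa:.4f}")
```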

While the new camera in Chile is still being tuned before it begins collecting data for the survey, the simulation team already has completed several of its synthetic universe simulations — each requiring hundreds of thousands of computing hours and generating tens of terabytes of data.

As the blind cosmology challenges proceed and real data is added into the mix, Kravtsov expects that more than a hundred synthetic universes may eventually be required to fine-tune the analysis methods until new, accurate insights about the history of the universe can be obtained.

“Ultimately, simulations will be primarily used to test the methods that will be employed to analyze the survey data,” Kravtsov said. “It’s part of the methodology that allows you to derive the science from the survey, but it’s kind of technical. We’re interested ultimately in getting the science out.”