Testing Cosmology with the Dark Energy Survey Five-Year Supernova Dataset

Editor’s Note: Astrobites is a graduate-student-run organization that digests astrophysical literature for undergraduate students. As part of the partnership between the AAS and astrobites, we occasionally repost astrobites content here at AAS Nova. We hope you enjoy this post from astrobites; the original can be viewed at astrobites.org.

Title: The Dark Energy Survey: Cosmology Results with ~1500 New High-Redshift Type Ia Supernovae Using the Full 5-Year Dataset
Authors: Dark Energy Survey Collaboration
Status: Published in ApJL

The Dark Energy Survey (DES) Collaboration is an international team of scientists that aims to measure and understand the nature of an elusive energy density component in the universe, dark energy. The DES was conducted using the 4-metre Blanco Telescope at the Cerro Tololo Inter-American Observatory in Chile and took observations from 2013 to 2019. The survey used a purpose-built instrument, the Dark Energy Camera (DECam). DECam has a wide field of view (about 14 times the size of the full Moon in the sky), allowing galaxies to be detected over a large sky area, and its 570 megapixels, spread across 74 CCDs with minimal readout noise, enable sensitive measurements of the redshifted light from those galaxies. This research article focuses specifically on the results of the DES supernova survey (more about their other survey data can be seen here), which was designed to test cosmology with a large sample of supernova observations.

In 1998, two teams of scientists measuring the brightness of supernovae unexpectedly found that distant supernovae in the earlier universe appeared fainter than expected, indicating that the universe's expansion is accelerating (a discovery that earned a Nobel Prize). Before 1998, cosmologists generally considered three possibilities for the future expansion of the universe: it would either stop and reverse (resulting in a collapse), gradually come to a halt (resulting in a static universe), or settle towards a constant expansion rate. These scenarios assumed the universe consisted only of matter and radiation evolving under gravity.

The discovery of an accelerating expansion changed this. Type Ia supernovae have a standardisable brightness, which allows us to determine their distance from how faint they appear in a telescope. We can also measure their redshifts from spectra, and cosmological models predict how redshift relates to the universe's expansion rate over time and to how far the light has travelled. Plotting the observed redshifts of the supernovae against their distances (inferred from their brightness) gives what is known as a Hubble diagram, which can be used to fit a cosmological model. In today's article, the DES Collaboration has done exactly this to test cosmological models. This time, however, instead of only the 52 supernovae that the discoverers of dark energy had in 1998, there are 1,635 supernovae in the DES five-year dataset, more than 30 times as many!
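To make the Hubble diagram idea concrete, here is a minimal sketch, using the publicly available astropy package, of how the apparent brightnesses of standard candles can be turned into distance moduli and compared with a cosmological model. It is only an illustration: the magnitudes, redshifts, and assumed absolute magnitude below are made up, and this is not the DES pipeline.

# Minimal, illustrative Hubble-diagram sketch (not the DES analysis).
# All numbers are made up for illustration; assumes numpy and astropy are installed.
import numpy as np
from astropy.cosmology import FlatLambdaCDM

# Type Ia supernovae are standardisable candles: with an absolute magnitude M,
# an observed apparent magnitude m gives the distance modulus mu = m - M.
M_abs = -19.3                            # illustrative absolute magnitude
m_obs = np.array([17.4, 21.7, 23.9])     # made-up apparent magnitudes
z_obs = np.array([0.05, 0.30, 0.70])     # made-up redshifts from spectra

mu_obs = m_obs - M_abs                   # "observed" distance moduli

# Compare with a flat LambdaCDM prediction, mu(z) = 5 log10(d_L(z) / 10 pc),
# where d_L is the luminosity distance.
cosmo = FlatLambdaCDM(H0=70, Om0=0.3)
mu_model = cosmo.distmod(z_obs).value

print("observed mu:", mu_obs)
print("model mu:   ", np.round(mu_model, 2))
# Plotting mu against z (and the residuals mu_obs - mu_model) gives a Hubble
# diagram with a residual panel, as in Figure 1.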

Lighting the Way with the Universe’s Candles

The 1,635 supernovae found and used by DES (after quality cuts) cover redshifts greater than z ~ 0.1, so 194 Type Ia supernovae from samples external to DES are included in the data analysis to cover low redshifts (see Figure 1). In total, this resulted in an analysis of 1,829 supernovae. Part of the cuts to the data involved removing contaminants: transients that look like Type Ia supernovae but might actually be something else. To distinguish the Type Ia supernovae from the contaminants, two machine-learning classifiers were used; they were trained on simulated light curves of Type Ia supernovae and of contaminating core-collapse (e.g., Type II) supernovae (see more about different supernova classification in this bite and Type Ia light curves here).

Hubble diagram of Dark Energy Survey supernovae

Figure 1: The Hubble diagram of the DES supernovae from the five-year sample (blue points) and the external low-redshift data (orange points) used in the analysis. The lower panel shows the difference between the measured and theoretical distance moduli for the best fit to a time-varying dark energy model. [DES Collaboration 2024]

Supernovae can be classified using spectroscopy, but in the DES analysis machine learning classifies them using multi-band photometry. This is akin to low-resolution spectroscopy: the flux from each supernova is measured in a few broad filters instead of at many different wavelengths. This approach allowed DES to include far more supernovae than would have been possible with spectroscopic classification alone. The classifiers assign each supernova a probability of being a Type Ia, as shown in Figure 1 above, and these probabilities were used as weights in the model-fitting analysis. To remove human bias, the pipelines were tested on blinded data, meaning the data were deliberately altered to look different. This ensures that the pipeline works well and that those carrying out the analysis do not steer it towards an expected result.
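As a rough illustration of how classification probabilities can be folded into a fit, here is a toy probability-weighted (mixture) likelihood in which each supernova contributes according to its probability of being a Type Ia. This is only a conceptual sketch with made-up numbers and an assumed contaminant scatter; the actual DES analysis handles contamination and bias corrections in a far more sophisticated framework.

# Toy sketch of probability-weighted fitting with a contaminated sample.
# Not the DES method; it only illustrates the idea that each object enters
# the likelihood in proportion to its probability of being a Type Ia.
import numpy as np

def mixture_loglike(residuals, sigma, p_ia, sigma_contam=2.0):
    """Log-likelihood in which each supernova is a mixture of 'Type Ia'
    (small scatter about the model) and 'contaminant' (broad scatter).
    residuals: observed minus model distance moduli
    sigma: measurement uncertainties
    p_ia: classifier probability of being a Type Ia
    sigma_contam: assumed (made-up) extra scatter for contaminants
    """
    gauss = lambda r, s: np.exp(-0.5 * (r / s) ** 2) / (s * np.sqrt(2 * np.pi))
    like_ia = gauss(residuals, sigma)
    like_contam = gauss(residuals, np.sqrt(sigma ** 2 + sigma_contam ** 2))
    return np.sum(np.log(p_ia * like_ia + (1 - p_ia) * like_contam))

# Made-up example: three well-classified Type Ia and one likely contaminant.
res = np.array([0.02, -0.05, 0.10, 1.50])
sig = np.array([0.10, 0.12, 0.15, 0.20])
p   = np.array([0.99, 0.97, 0.95, 0.30])
print(mixture_loglike(res, sig, p))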

Hints of Time-Varying Dark Energy?

In the standard model of cosmology, ΛCDM, dark energy is assumed to have a constant energy density, i.e., it acts as a cosmological constant. This model has been favoured by DES data previously. Measurements of the cosmic microwave background by the Planck space mission have also preferred this model, specifically a ΛCDM model with zero curvature: that is, the universe has a flat geometry, meaning that two parallel beams of light will stay parallel as they propagate through spacetime. If the universe has a curved geometry, the beams can eventually diverge or cross over (see more description here). In this work, the DES Collaboration tests several models against the data: ΛCDM (with curvature as a free parameter), flat ΛCDM (zero curvature assumed), and two models in which the dark energy equation of state is allowed to differ from that of a cosmological constant (both also assuming flatness).

The tests on the DES data alone, and in combination with external data, for standard ΛCDM and flat ΛCDM find results consistent with previous measurements of the universe's matter density and curvature: the fitted values agree with those found previously by Planck within the ~95% confidence bounds.

However, the story changes slightly for the two dark energy models. In the first, wCDM, the equation-of-state parameter w is left as a free (but constant-in-time) parameter instead of being fixed to w = −1. In the second, w0waCDM, the equation of state is additionally allowed to vary with redshift, described by two parameters, w0 and wa. To be consistent with a cosmological constant, one should find w = −1 in the first model, or w0 = −1 and wa = 0 in the second. These values are not exactly what the DES data and the data combinations favour, as shown in Figure 2 below.
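For the curious, the time-varying model is usually written as w(a) = w0 + wa(1 − a), where a = 1/(1 + z) is the cosmic scale factor. The short sketch below, using illustrative parameter values rather than DES best fits, shows how this equation of state (and the curvature assumption mentioned above) enters the expansion rate, and how setting w0 = −1 and wa = 0 recovers a cosmological constant.

# Sketch of how the dark energy equation of state and curvature enter the
# expansion history. Parameter values are illustrative, not DES best fits;
# radiation is neglected for simplicity.
import numpy as np

def E(z, Om=0.3, Ok=0.0, w0=-1.0, wa=0.0):
    """Dimensionless expansion rate H(z)/H0.
    Ok = 0 is the zero-curvature ('flat') assumption discussed above;
    w0 = -1 and wa = 0 recover a cosmological constant (LambdaCDM)."""
    a = 1.0 / (1.0 + z)
    # Dark energy density evolution for w(a) = w0 + wa * (1 - a)
    rho_de = a ** (-3.0 * (1.0 + w0 + wa)) * np.exp(-3.0 * wa * (1.0 - a))
    Ode = 1.0 - Om - Ok
    return np.sqrt(Om * (1 + z) ** 3 + Ok * (1 + z) ** 2 + Ode * rho_de)

z = np.linspace(0.0, 1.0, 5)
print(E(z))                        # flat LambdaCDM
print(E(z, w0=-0.9, wa=-0.5))      # flat, time-varying equation of state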

contours and likelihoods for modeled parameters

Figure 2: Contours (best-fit regions in the parameter space) and likelihoods (probability distributions for the fitted parameters) for the matter density, Ωm, and the dark energy equation-of-state parameters in the w0waCDM model. The different coloured contours and likelihoods represent the different data combinations indicated by the legend. [DES Collaboration 2024]

There is a marginal preference for a time-varying equation of state, as shown by the results above: the data prefer this over a model with a cosmological constant at ~95% confidence, just over 2σ. The best fits from the combination of Planck, DES, and eBOSS data are w0 = −0.773 (+0.075/−0.067) and wa = −0.83 (+0.33/−0.42) for the w0waCDM model. The best fit for wCDM is w = −0.941 ± 0.026.

While we can't confidently state from these results that dark energy must be time varying, they could be a hint of new physics waiting to be discovered by cosmology; only further analysis and data can tell.

Original astrobite edited by Kylee Carden.

About the author, Abbé Whitford:

I am a third-year PhD student at the University of Queensland, studying large-scale structure cosmology with galaxy clustering and peculiar velocities, and using large-scale structure to measure the properties of neutrinos.