Supernova Nickel and Neural Nets

Accurate models of supernova spectra are typically slow to run, but neural networks are both fast and capable of mimicking many other codes. Recently, a team used the latter fact to overcome the former challenge and created a set of neural networks capable of rapidly modeling real supernova data.

Complex Explosions

As one might expect when dealing with one of the most violent, rapid, and energetic processes in the universe, modeling a supernova requires great care. In the seconds, hours, and days after the initial explosion, many different processes unfold on many different scales: unstable elements birthed from the raw power of the eruption decay into more durable forms; heavy elements are accelerated to mind-boggling speeds; nearby material begins to glow as temperatures approach values beyond comprehension. Each of these processes, among others, affects the spectra that we measure here on Earth, and each must be closely tracked in order to correctly explain our observations.

The immense complexity of the problem places large demands on the codes that simulate the aftermath of a supernova. Despite many different simplifications and approximations, the inescapable fact remains that each of these simulations takes a long time to run. For example, one code named TARDIS requires about one CPU-hour to transform inputs such as total luminosity, time since explosion, and the structure of the ejecta into an output model spectrum. For astronomers who want to compare their real data to potentially millions of models to find the best-fitting values for those input parameters, that runtime won’t cut it.
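To see the scale of the problem, a back-of-the-envelope calculation helps. The one-CPU-hour figure comes from the article above; the grid of one million candidate models is a hypothetical round number for illustration:

```python
# Back-of-the-envelope: why brute-force fitting with a slow spectral code
# is impractical. Assumes ~1 CPU-hour per model spectrum (as stated for
# TARDIS above) and a hypothetical grid of one million candidate models.
cpu_hours_per_model = 1.0
n_models = 1_000_000

total_cpu_hours = cpu_hours_per_model * n_models
total_cpu_years = total_cpu_hours / (24 * 365)

print(f"{total_cpu_hours:.0f} CPU-hours is roughly {total_cpu_years:.0f} CPU-years")
```

Even spread across a large cluster, a century's worth of CPU time per supernova fit is a hard sell, which is what motivates replacing the expensive code with a fast emulator.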

A recently adopted unspoken code among the astronomy community states that if one scientist says “complex model” and “slow runtime” three times fast, another will appear with a machine learning solution to the problem. A team led by Xingzhuo Chen (Texas A&M University) just published a neural-network-accelerated supernova inference framework that answers this latest call.

Neural Networks and Nickel

The training progress of the neural networks (loss versus epoch), decreasing over successive iterations through the training data. [Chen et al. 2024]

Chen and collaborators set out to model real Type Ia supernovae using neural networks capable of rapidly converting spectra into underlying physical parameters. The first step required creating and training the networks. To do this, the team fed TARDIS 108,389 different sets of input parameters and compiled the resulting spectra into a training library. They then let several neural networks loose in this library and tasked them with learning how to convert the spectra back into their original inputs, or essentially how to undo all of TARDIS’s hard work. After a few tens of iterations through the library, the networks could faithfully mimic TARDIS in reverse, and the team was left with several networks, each capable of constraining different physical parameters when shown a spectrum.
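The inverse-emulator idea can be sketched in miniature. This is not the authors' code: the "forward model" below is a made-up one-parameter toy (a Gaussian spectral line whose position is set by a single parameter), and the network is a tiny hand-rolled one-hidden-layer regressor trained to recover that parameter from the "spectrum":

```python
import numpy as np

# Toy sketch of training a network to invert a forward model.
# Assumptions: a fake 16-pixel "spectrum" generated from one parameter t,
# standing in for TARDIS's many inputs and full spectra.
rng = np.random.default_rng(0)
wave = np.linspace(0.0, 1.0, 16)

def forward(t):
    """Toy forward model: parameter t -> a Gaussian 'line' spectrum."""
    return np.exp(-((wave - t) ** 2) / 0.02)

t_train = rng.uniform(0.2, 0.8, size=(500, 1))
X = np.array([forward(t[0]) for t in t_train])  # library of spectra
Y = t_train                                     # parameters to recover

# One-hidden-layer network trained by full-batch gradient descent (MSE).
W1 = rng.normal(0, 0.5, (16, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1));  b2 = np.zeros(1)
lr = 0.05
for epoch in range(3000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    err = H @ W2 + b2 - Y             # prediction error
    dH = (err @ W2.T) * (1 - H ** 2)  # backpropagate through tanh
    W2 -= lr * H.T @ err / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * X.T @ dH / len(X);  b1 -= lr * dH.mean(0)

# The trained network now "undoes" the forward model for a new spectrum.
x_test = forward(0.5)[None, :]
t_pred = (np.tanh(x_test @ W1 + b1) @ W2 + b2)[0, 0]
print(t_pred)
```

The actual framework works at vastly larger scale, with 108,389 TARDIS spectra and multiple networks for the different physical parameters, but the training loop follows the same spirit: minimize the mismatch between predicted and true input parameters across the library.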

The inferred nickel abundance over time for several real supernovae. The expected decay rate is shown in the solid line. [Chen et al. 2024]

The researchers then handed over about 1,000 spectra of about 100 distinct supernovae to their trained networks and analyzed the resulting predictions. One quantity they focused on was the nickel content of the ejecta and how it evolved over time. They found that for some supernovae, the radioactive nickel decayed away at exactly the rate predicted by the half-life of the unstable isotope. Others, however, either had nickel stick around longer than expected or fade away faster than the familiar rate. Exactly why this might be is unclear; the mismatched cases could point to issues with the approximations made by TARDIS itself, or with the assumed spherical symmetry and smoothness of the ejecta model.
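The "expected rate" in that comparison is just exponential decay. A minimal sketch, using the standard laboratory half-life of ⁵⁶Ni (about 6.075 days; this value comes from nuclear physics, not from the paper's fits):

```python
import numpy as np

# Exponential decay of 56Ni, the benchmark against which the inferred
# nickel masses are compared. Half-life is the laboratory value.
HALF_LIFE_NI56 = 6.075  # days

def ni56_fraction(t_days):
    """Fraction of the initial 56Ni remaining t_days after explosion."""
    return 0.5 ** (np.asarray(t_days) / HALF_LIFE_NI56)

# Half remains after one half-life; only a few percent after ~30 days.
for t in (6.075, 12.15, 30.0):
    print(f"t = {t:5.2f} d: fraction remaining = {ni56_fraction(t):.3f}")
```

A supernova whose inferred nickel mass falls off more slowly or more quickly than this curve is the kind of outlier the team flagged.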

Even so, the authors state that their work “represent[s] a significant improvement in the quality of the spectral modeling of [supernovae] compared to similar models in the literature.” By harnessing neural networks to speed up complex physical models, astronomers can look forward to many more breakthroughs that wouldn’t be possible without an AI-assisted boost.

Citation

“Artificial Intelligence Assisted Inversion (AIAI): Quantifying the Spectral Features of 56Ni of Type Ia Supernovae,” Xingzhuo Chen et al. 2024 ApJ 962 125. doi:10.3847/1538-4357/ad0a33