Accurate models of supernova spectra are typically slow to compute, while neural networks are fast and adept at mimicking other codes. Recently, a team combined these strengths to overcome that bottleneck, building a set of neural networks capable of rapidly modeling real supernova data.
Complex Explosions
As one might expect when dealing with one of the most violent, rapid, and energetic processes in the universe, one must take great care when modeling a supernova. In the seconds, hours, and days after the initial explosion, many different processes unfold on many different scales: unstable elements birthed from the raw power of the eruption decay into more durable forms; heavy elements are accelerated to mind-boggling speeds; nearby material begins to glow as temperatures approach values beyond comprehension. These processes, among others, shape the spectra that we measure here on Earth, and each of them must be closely tracked in order to correctly explain our observations.
The immense complexity of the problem places large demands on the codes that simulate the aftermath of a supernova. Despite many simplifications and approximations, the inescapable fact remains that each of these simulations takes considerable time to run. For example, one code named TARDIS requires about one CPU hour to transform inputs such as the total luminosity, the time after explosion, and the structure of the ejecta into an output model spectrum. For astronomers who want to compare their real data to potentially millions of models to find the best-fitting values for those input parameters, that runtime won’t cut it.
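To make that bottleneck concrete, here is a minimal sketch of what a brute-force grid fit against a slow spectral-synthesis code would look like. The `run_radiative_transfer` wrapper, the parameter grid, and the one-hour cost per call are illustrative assumptions standing in for a TARDIS-like calculation, not the code's actual interface.

```python
import itertools
import numpy as np

def run_radiative_transfer(luminosity, t_explosion, ejecta_profile):
    """Hypothetical stand-in for a full spectral-synthesis run
    (a TARDIS-like calculation) costing roughly 1 CPU hour per call."""
    raise NotImplementedError("each call here is ~1 CPU hour of work")

def chi_square(model_spectrum, observed_flux, flux_error):
    """Standard goodness-of-fit between a model and an observed spectrum."""
    return np.sum(((observed_flux - model_spectrum) / flux_error) ** 2)

# A modest grid over just three input parameters (values are illustrative)...
luminosities = np.linspace(42.5, 43.5, 100)   # log10(L / erg s^-1)
times        = np.linspace(5.0, 25.0, 100)    # days after explosion
profiles     = range(100)                     # indices into a set of ejecta structures

n_models = len(luminosities) * len(times) * len(profiles)
print(f"{n_models:,} models x ~1 CPU hour each = ~{n_models / 8760:.0f} CPU-years")
# 1,000,000 models x ~1 CPU hour each = ~114 CPU-years

# The fit itself would be a simple minimization over the grid -- if only
# each model evaluation weren't so expensive:
# best_fit = min(
#     itertools.product(luminosities, times, profiles),
#     key=lambda p: chi_square(run_radiative_transfer(*p), obs_flux, obs_err),
# )
```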
A recently adopted unspoken code among the astronomy community states that if one scientist says “complex model” and “slow runtime” three times fast, another will appear with a machine learning solution to the problem. A team led by Xingzhuo Chen (Texas A&M University) has just published a neural-network-accelerated supernova inference framework that answers this latest call.
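The core idea behind such a framework is to replace the expensive simulation with a fast surrogate: train a neural network on a library of precomputed spectra so that, once trained, a new parameter combination can be evaluated in milliseconds rather than an hour. The sketch below is a generic PyTorch emulator of this kind, not the authors' actual architecture; the layer sizes, parameter count, and random stand-in training data are illustrative assumptions.

```python
import torch
from torch import nn

# Illustrative dimensions (assumptions, not taken from the paper):
# each training example maps a handful of physical inputs to a binned spectrum.
N_PARAMS = 8             # e.g., luminosity, time since explosion, ejecta abundances...
N_WAVELENGTH_BINS = 500  # model spectrum sampled on a fixed wavelength grid

# A simple fully connected emulator; the published framework may differ substantially.
emulator = nn.Sequential(
    nn.Linear(N_PARAMS, 256),
    nn.ReLU(),
    nn.Linear(256, 256),
    nn.ReLU(),
    nn.Linear(256, N_WAVELENGTH_BINS),
)

# Stand-in training set: in practice these would be thousands of precomputed
# radiative-transfer spectra; random tensors are used here just to show the loop.
params = torch.rand(10_000, N_PARAMS)
spectra = torch.rand(10_000, N_WAVELENGTH_BINS)

optimizer = torch.optim.Adam(emulator.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):  # i.e., passes ("iterations") through the training data
    optimizer.zero_grad()
    loss = loss_fn(emulator(params), spectra)
    loss.backward()
    optimizer.step()

# Once trained, a model spectrum for new parameters is nearly free to evaluate,
# so scanning millions of parameter combinations against real data becomes feasible.
with torch.no_grad():
    fast_spectrum = emulator(torch.rand(1, N_PARAMS))
```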
Neural Networks and Nickel

[Figure: The training progress of the neural networks as a function of iterations through the training data. (Chen et al. 2024)]

[Figure: The inferred nickel abundance over time for several real supernovae; the expected decay rate is shown as the solid line. (Chen et al. 2024)]
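The expected decay rate in that comparison follows from well-known nuclear physics: 56Ni decays to 56Co with a half-life of about 6.1 days (and 56Co in turn decays more slowly to stable 56Fe). A minimal sketch of that expected decline, with an illustrative normalization:

```python
import numpy as np

T_HALF_NI56 = 6.075                  # days, half-life of 56Ni -> 56Co
TAU_NI56 = T_HALF_NI56 / np.log(2)   # mean lifetime, ~8.8 days

def expected_ni56_fraction(t_days, initial_fraction=1.0):
    """Surviving 56Ni fraction at time t after explosion, assuming simple
    exponential decay (the kind of curve shown as the solid comparison line)."""
    return initial_fraction * np.exp(-np.asarray(t_days) / TAU_NI56)

# e.g., the surviving fraction at 10, 20, and 30 days after explosion:
print(expected_ni56_fraction([10, 20, 30]))   # ~[0.32, 0.10, 0.03]
```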
Even so, the authors state that their work “represent[s] a significant improvement in the quality of the spectral modeling of [supernovae] compared to similar models in the literature.” By harnessing neural networks to speed up complex physical models in this way, astronomers can look forward to many more breakthroughs that wouldn’t be possible without an AI-assisted boost.
Citation
“Artificial Intelligence Assisted Inversion (AIAI): Quantifying the Spectral Features of 56Ni of Type Ia Supernovae,” Xingzhuo Chen et al. 2024 ApJ 962 125. doi:10.3847/1538-4357/ad0a33