Abstract
We demonstrate unprecedented accuracy for rapid gravitational-wave parameter estimation with deep learning. Using neural networks as surrogates for Bayesian posterior distributions, we analyze eight gravitational-wave events from the first LIGO-Virgo Gravitational-Wave Transient Catalog and find very close quantitative agreement with standard inference codes, but with inference times reduced to 20 s per event. Our networks are trained using simulated data, including an estimate of the detector noise characteristics near the event. This encodes the signal and noise models within millions of neural-network parameters and enables inference for any observed data consistent with the training distribution, accounting for noise nonstationarity from event to event. Our algorithm—called “DINGO”—sets a new standard in fast and accurate inference of physical parameters of detected gravitational-wave events, which should enable real-time data analysis without sacrificing accuracy.
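The core idea—training a network on simulated (parameter, data) pairs so that inference on new data amortizes to a fast forward pass—can be illustrated with a toy model. The sketch below is not the authors' DINGO code (which uses conditional normalizing flows over the full posterior); it substitutes a linear-Gaussian problem where the amortized posterior mean is known in closed form, so the "learned" surrogate can be checked against the analytic answer. All variable names and the noise level `sigma_n` are illustrative assumptions.

```python
import numpy as np

# Toy amortized posterior estimation (illustration only; stands in for
# DINGO's conditional normalizing flows).
# Model assumed here: theta ~ N(0, 1), observed data d = theta + n with
# n ~ N(0, sigma_n^2). The exact posterior mean is E[theta | d] = d / (1 + sigma_n^2).
# We "learn" that mapping purely from simulations, mimicking training on
# simulated signals injected into modeled detector noise.

rng = np.random.default_rng(0)
sigma_n = 0.5       # assumed noise standard deviation
n_train = 100_000   # number of simulated training pairs

theta = rng.normal(0.0, 1.0, n_train)            # draw parameters from the prior
d = theta + rng.normal(0.0, sigma_n, n_train)    # simulate noisy observations

# Least-squares regression of theta on d approximates E[theta | d];
# this is the simulation-based "training" step.
w = np.dot(d, theta) / np.dot(d, d)

analytic = 1.0 / (1.0 + sigma_n**2)  # exact posterior-mean coefficient

# Amortized inference: any new observation reuses the trained surrogate
# at negligible cost, with no per-event sampling run.
d_new = 1.3
posterior_mean_estimate = w * d_new
print(posterior_mean_estimate)
```

The learned coefficient `w` converges to the analytic value `1 / (1 + sigma_n^2)` as the number of simulations grows, which is the sense in which a sufficiently expressive network trained on enough simulations can match the true posterior while reducing per-event cost to a single evaluation.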
- Received 1 July 2021
- Accepted 17 November 2021
DOI: https://doi.org/10.1103/PhysRevLett.127.241103
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI. Open access publication funded by the Max Planck Society.