A team of computational scientists and researchers has developed a new artificial intelligence (AI) framework that allows for accelerated, scalable, and reproducible detection of gravitational waves.
The production-scale framework shows that AI models can be as sensitive as traditional template-matching algorithms but orders of magnitude faster.
Furthermore, these AI algorithms require only an inexpensive graphics processing unit (GPU), like those found in video gaming systems, to process advanced LIGO data faster than real time.
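To make that claim concrete, the sketch below shows how a small deep-learning classifier might be slid across a stream of detector strain on a single GPU. It is a minimal, hypothetical illustration, not the published framework: the network architecture, segment length, sampling rate, and the crude normalization step are all assumptions chosen for clarity.

```python
# Illustrative sketch only: a tiny 1D convolutional classifier applied to short
# segments of detector strain. All choices here (architecture, window size,
# normalization) are assumptions, not the framework described in the article.
import torch
import torch.nn as nn

SAMPLE_RATE = 4096                         # Hz, typical advanced LIGO strain sampling rate
SEGMENT_SECONDS = 1.0                      # length of each analyzed window (assumed)
SEGMENT_SAMPLES = int(SAMPLE_RATE * SEGMENT_SECONDS)

class StrainClassifier(nn.Module):
    """Minimal 1D CNN that scores a strain segment for a possible signal."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=16, stride=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=8, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, 1)       # single logit: signal vs. noise

    def forward(self, x):                  # x shape: (batch, 1, SEGMENT_SAMPLES)
        z = self.features(x).squeeze(-1)   # -> (batch, 64)
        return self.head(z)                # -> (batch, 1)

def scan_strain(model, strain, step=SAMPLE_RATE // 2,
                device="cuda" if torch.cuda.is_available() else "cpu"):
    """Slide the classifier across a longer strain time series and return scores."""
    model = model.to(device).eval()
    scores = []
    with torch.no_grad():
        for start in range(0, len(strain) - SEGMENT_SAMPLES + 1, step):
            seg = torch.as_tensor(strain[start:start + SEGMENT_SAMPLES],
                                  dtype=torch.float32, device=device)
            # Crude stand-in for whitening/normalization of the raw strain.
            seg = (seg - seg.mean()) / (seg.std() + 1e-12)
            logit = model(seg.view(1, 1, -1))
            scores.append(torch.sigmoid(logit).item())
    return scores
```

Because inference on each window is a single forward pass, a batch of overlapping windows can be scored on one consumer-grade GPU far faster than the data arrive, which is the sense in which such models can keep ahead of real time.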
The team behind the AI framework included Eliu Huerta of the U.S. Department of Energy’s (DOE) Argonne National Laboratory, along with collaborators from Argonne, the University of Chicago, the University of Illinois at Urbana-Champaign, graphics chip maker NVIDIA, and tech giant IBM.
“As a computer scientist, what’s exciting to me about this project is that it shows how, with the right tools, AI methods can be integrated naturally into the workflows of scientists — allowing them to do their work faster and better — augmenting, not replacing, human intelligence,” said Ian Foster, director of Argonne’s Data Science and Learning (DSL) division.
The team has published a paper in the journal Nature Astronomy, showcasing a data-driven approach that combines the team’s collective supercomputing resources to enable reproducible, accelerated, AI-driven gravitational wave detection.
When gravitational waves were first detected in 2015 by the advanced Laser Interferometer Gravitational-Wave Observatory (LIGO), they sent a ripple through the scientific community, confirming a prediction of Einstein’s general theory of relativity and marking the birth of gravitational wave astronomy.
Five years later, numerous gravitational wave sources have been detected, including the first observation of two colliding neutron stars in both gravitational and electromagnetic waves.
“In this study, we’ve used the combined power of AI and supercomputing to help solve timely and relevant big-data experiments. We are now making AI studies fully reproducible, not merely ascertaining whether AI may provide a novel solution to grand challenges,” Huerta noted.
Building upon the interdisciplinary nature of this project, the team looks forward to new applications of this data-driven framework beyond big-data challenges in physics.
Huerta and his research team developed their new framework with support from the National Science Foundation (NSF), Argonne’s Laboratory Directed Research and Development (LDRD) program, and DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.
“These NSF investments contain original, innovative ideas that hold the significant promise of transforming the way scientific data arriving in fast streams are processed,” said Manish Parashar, director of the Office of Advanced Cyberinfrastructure at NSF.