AI Case Study
Particle physicists use machine learning to carry out complex analyses on the roughly 1 million gigabytes of data produced per second during experiments.
Particle physicists at the Large Hadron Collider (LHC), the world's largest particle accelerator, at the European particle physics lab CERN, have leveraged machine learning to analyse the roughly 1 million gigabytes of data produced per second during experiments. The approach includes dedicated hardware and software, known as "triggers", that decide in real time which data to keep.
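A trigger can be thought of as a real-time filter: a model scores each event, and only events clearing a threshold are kept for storage. The sketch below is a minimal illustration of that idea, not the LHC's actual trigger logic; the feature names ("energy", "hits"), the weights, and the threshold are all invented for the example.

```python
import random

random.seed(0)

def trigger_score(event):
    # Toy stand-in for a trained model: score each event by a weighted
    # sum of two illustrative features. Features and weights are invented.
    return 0.7 * event["energy"] + 0.3 * event["hits"]

def trigger(events, threshold=0.5):
    """Keep only events whose score clears the threshold; discard the rest."""
    return [e for e in events if trigger_score(e) >= threshold]

# Simulate a stream of events with random feature values in [0, 1).
events = [{"energy": random.random(), "hits": random.random()}
          for _ in range(1000)]
kept = trigger(events)
print(f"kept {len(kept)} of {len(events)} events")
```

In a real trigger the scoring model runs under hard latency budgets, often partly in hardware, which is why the decision function must be cheap to evaluate.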
Public And Social Sector
Education And Academia
"To handle the gigantic data volumes produced in modern experiments like the ones at the LHC, researchers apply what they call 'triggers' – dedicated hardware and software that decide in real time which data to keep for analysis and which data to toss out.
In LHCb, an experiment that could shed light on why there is so much more matter than antimatter in the universe, machine learning algorithms make at least 70 percent of these decisions, says LHCb scientist Mike Williams from the Massachusetts Institute of Technology, one of the authors of the Nature summary.
"Machine learning plays a role in almost all data aspects of the experiment, from triggers to the analysis of the remaining data," he says.
Machine learning has proven extremely successful in the area of analysis. The gigantic ATLAS and CMS detectors at the LHC, which enabled the discovery of the Higgs boson, each have millions of sensing elements whose signals need to be put together to obtain meaningful results.
Neutrino experiments also benefit from machine learning. NOvA, which is managed by Fermilab, studies how neutrinos change from one type to another as they travel through the Earth. These neutrino oscillations could potentially reveal the existence of a new neutrino type that some theories predict to be a particle of dark matter. NOvA's detectors are watching out for charged particles produced when neutrinos hit the detector material, and machine learning algorithms identify them."
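Identifying which charged particle produced a given detector signature is, at its core, a classification problem. The following is a minimal sketch of that idea using a nearest-centroid classifier on two invented features (track length, energy deposit); NOvA's actual algorithms are far richer, operating on detector images with neural networks.

```python
# Toy particle identification: classify a detector signature by the
# nearest class centroid. All features and training values are invented.

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(x, centroids):
    # Assign the label of the nearest class centroid (squared Euclidean distance).
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# Invented training examples: (track length, energy deposit) per particle type.
training = {
    "muon":     [(0.9, 0.2), (0.8, 0.3), (0.95, 0.25)],
    "electron": [(0.2, 0.8), (0.3, 0.9), (0.25, 0.85)],
}
centroids = {label: centroid(pts) for label, pts in training.items()}

print(classify((0.85, 0.2), centroids))  # a long, low-deposit track -> muon
```

The design choice illustrated is the general one: learn a compact decision rule from labelled examples, then apply it to each new detector signature.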
R And D
Core Research And Development
"Experiments at the Large Hadron Collider (LHC), the world's largest particle accelerator at the European particle physics lab CERN, produce about a million gigabytes of data every second. Even after reduction and compression, the data amassed in just one hour is similar to the data volume Facebook collects in an entire year – too much to store and analyze."
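The scale of the quoted figure is worth making concrete. A back-of-the-envelope calculation, assuming "1 million gigabytes per second" as the raw (pre-reduction) rate:

```python
# Illustrative arithmetic only: convert the quoted raw rate into an
# hourly volume. 1 exabyte (EB) = 1e9 gigabytes (GB).
raw_rate_gb_per_s = 1_000_000          # quoted: ~1 million GB per second
seconds_per_hour = 3600
raw_per_hour_gb = raw_rate_gb_per_s * seconds_per_hour
print(f"{raw_per_hour_gb / 1e9:.1f} exabytes of raw data per hour")
```

That works out to 3.6 exabytes of raw data per hour, which is why aggressive real-time reduction is unavoidable before anything is stored.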