Technology · 2026-03-16

The world's biggest neutrino experiment needs AI to make sense of its data deluge

A Rice University workshop brought 60 researchers together to plan how machine learning will power the Deep Underground Neutrino Experiment's computing infrastructure.

Somewhere deep beneath the Black Hills of South Dakota, a massive detector array is being built to catch particles so elusive they can pass through a light-year of lead without stopping. The Deep Underground Neutrino Experiment (DUNE) is designed to study neutrinos - the second most abundant particles in the universe and among the least understood. But the scientific ambition comes with a practical problem: the experiment will produce data on a scale that traditional analysis methods simply cannot handle.

Sixty researchers, one computing challenge

In mid-March, Rice University hosted the first dedicated workshop focused on integrating artificial intelligence and machine learning into DUNE's computing infrastructure. Held over three days at Rice's BioScience Research Collaborative, the event drew 60 registered participants and featured 30 presentations from universities, national laboratories, and international partners.

The workshop was organized by DUNE's AI/ML Forum and Core Software and Computing Consortium, with partial support from the Rice Creative Ventures Fund. Its goal was not to showcase AI demos but to solve a coordination problem: how do you build a computing ecosystem for an experiment of this scale that can actually support machine learning workflows?

Finding needles in neutrino haystacks

DUNE will operate detectors separated by a 1,300-kilometer baseline. A high-intensity neutrino beam generated at Fermilab in Illinois will travel through the Earth to the detector array at the Sanford Underground Research Facility in South Dakota. The detectors will observe neutrino oscillations - the phenomenon in which neutrinos switch between different types as they travel - to explore questions about why matter exists and how supernova explosions work.
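
To make the oscillation idea concrete, the standard two-flavor approximation gives the probability that one neutrino type appears as another after traveling a distance L with energy E. The sketch below uses placeholder parameter values for illustration, not DUNE's measured numbers.

    import math

    def two_flavor_oscillation_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
        """Two-flavor appearance probability P(nu_a -> nu_b).

        Standard approximation: P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
        with L in km, E in GeV, and dm2 in eV^2 (1.27 absorbs the unit conversion).
        """
        return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    # Illustrative placeholder parameters, not DUNE's measured values.
    print(two_flavor_oscillation_probability(L_km=1300, E_GeV=2.5,
                                              sin2_2theta=0.085, dm2_eV2=2.5e-3))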

The challenge is that the signals DUNE is looking for are extremely rare events buried in vast amounts of background noise. Aaron Higuera Pichardo, assistant research professor of physics at Rice, put it plainly: machine learning can identify small features in complex data that would be very difficult to detect using conventional techniques.

But deploying ML in a physics experiment is not as simple as training a neural network on labeled data. The models need to work within a massive, distributed computing infrastructure shared by a global collaboration. Training data must be generated through physics simulations that are themselves computationally expensive. And the results must meet the rigorous standards of particle physics, where claiming a discovery requires statistical certainty far beyond what most commercial AI applications demand.
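
None of DUNE's actual reconstruction code is public in this announcement, but the general pattern Higuera Pichardo describes - training a classifier on simulated, labeled detector images so it can pick out rare signal topologies from background - can be sketched in a few lines. The toy data shapes and the network below are illustrative assumptions, not the collaboration's models.

    import torch
    import torch.nn as nn

    # Toy stand-in for simulated detector images: 1-channel 2D "event displays"
    # with binary labels (1 = signal-like, 0 = background-like). Real inputs and
    # labels would come from full physics simulation, not from randn.
    images = torch.randn(256, 1, 64, 64)
    labels = torch.randint(0, 2, (256,)).float()

    model = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(16 * 16 * 16, 1),  # one logit: how "signal-like" the event is
    )

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    for epoch in range(5):
        optimizer.zero_grad()
        logits = model(images).squeeze(1)
        loss = loss_fn(logits, labels)
        loss.backward()
        optimizer.step()
        print(f"epoch {epoch}: loss {loss.item():.3f}")

The hard part, as the workshop emphasized, is not this training loop but everything around it: generating trustworthy simulated labels, running at scale on shared infrastructure, and quantifying uncertainties to physics standards.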

Beyond analysis: automated operations

AI's role in DUNE extends beyond data analysis. Researchers are exploring ways to use machine learning for real-time detector monitoring - automated systems that can flag unusual patterns or potential equipment issues early, improving data quality while reducing the need for constant human oversight.
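
Those monitoring systems are still being designed; as a deliberately simple sketch of the underlying idea - flag readings that drift far from recent behavior - a rolling z-score check on a detector metric stream might look like this. The metric, window, and threshold are illustrative assumptions, standing in for the more sophisticated ML monitors under discussion.

    from collections import deque
    import statistics

    def flag_anomalies(readings, window=50, threshold=4.0):
        """Yield (index, value) for readings far outside the recent baseline."""
        history = deque(maxlen=window)
        for i, value in enumerate(readings):
            if len(history) >= 10:  # need a minimal baseline first
                mean = statistics.fmean(history)
                stdev = statistics.pstdev(history) or 1e-9
                if abs(value - mean) / stdev > threshold:
                    yield i, value
            history.append(value)

    # Example: a steady "wire noise" metric with one injected glitch.
    stream = [1.0 + 0.01 * (i % 5) for i in range(200)]
    stream[120] = 5.0
    for index, value in flag_anomalies(stream):
        print(f"sample {index}: unusual reading {value:.2f}")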

Rice postdoctoral researcher Ilker Parmaksiz presented work on GPU-accelerated optical simulations designed to speed up the complex physics simulations DUNE requires. Another Rice-led project, developed by computer science student Calvin Wong, is DUNE-Pro, an AI-driven software platform designed to assist with complex data management tasks across DUNE's computing resources.
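
Details of the optical simulation work were not part of the announcement; purely to illustrate the general pattern - moving a large, embarrassingly parallel photon calculation from CPU arrays to a GPU array library - the same toy visibility estimate can run on either NumPy or CuPy. The geometry and attenuation model here are invented for the example.

    import numpy as np
    import cupy as cp  # drop-in GPU counterpart for many NumPy operations

    def photon_visibility(xp, emission_points, detector_xyz, attenuation_cm=2000.0):
        """Toy estimate of how much light from each point reaches one detector.

        Combines a 1/r^2 geometric factor with exponential attenuation; `xp` is
        either numpy (CPU) or cupy (GPU), so the same code runs on both.
        """
        r = xp.linalg.norm(emission_points - detector_xyz, axis=1)  # cm
        return xp.exp(-r / attenuation_cm) / (4.0 * np.pi * r ** 2)

    rng = np.random.default_rng(0)
    points_cpu = rng.uniform(0.0, 1000.0, size=(1_000_000, 3))  # emission points, cm
    detector = np.array([500.0, 500.0, 0.0])

    vis_cpu = photon_visibility(np, points_cpu, detector)                          # CPU baseline
    vis_gpu = photon_visibility(cp, cp.asarray(points_cpu), cp.asarray(detector))  # same math on GPU
    print(float(vis_cpu.sum()), float(vis_gpu.sum()))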

Coordination across a global collaboration

DUNE involves hundreds of researchers across dozens of institutions worldwide. Many individual groups have been developing AI tools for their specific analysis tasks, but the workshop's purpose was to identify synergies and avoid duplicated effort. Christopher Marshall, associate professor of physics at the University of Rochester, noted that individual groups often plan separately, and this workshop was about exploiting overlaps among different efforts.

The push aligns with broader national initiatives, including the U.S. Department of Energy's Genesis Mission and its emphasis on using advanced computing to accelerate scientific discovery from subatomic to cosmic scales.

Getting the computing right before the data arrives

DUNE is still under construction, with full operations expected in the coming years. That timeline creates both urgency and opportunity: the computing infrastructure decisions being made now will determine how effectively the experiment can use AI when data starts flowing. Getting the architecture right before the data arrives is substantially easier than retrofitting it afterward.

For a field that has always pushed computing limits - the World Wide Web was born at CERN, and the Large Hadron Collider drove the development of worldwide grid computing - the challenge of integrating modern AI into particle physics represents the next chapter of that tradition.

Source: Workshop held March 10-12, 2026, at Rice University's BioScience Research Collaborative. Organized by DUNE's AI/ML Forum and Core Software and Computing Consortium. DUNE is an international collaboration with detectors at Fermilab (Illinois) and the Sanford Underground Research Facility (South Dakota).