AI weather models use at least 21 times less energy than traditional forecasting systems
Thomas Rieutord had a straightforward question that nobody had answered with real numbers: how much energy does AI weather forecasting actually use compared with the traditional approach? The answer, published in Weather, is that AI models are dramatically more efficient, consuming at least 21 times less energy over a year of operation than the physics-based numerical weather prediction models they are beginning to supplement.
That finding matters because weather forecasting runs continuously, worldwide, every day. National meteorological services operate some of the largest supercomputers in existence to solve the equations of atmospheric physics. If AI can deliver comparable forecasts at a fraction of the energy cost, the carbon footprint of knowing whether it will rain tomorrow drops substantially.
The training cost is real but bounded
AI weather models require an upfront investment in training. A model learns the patterns of atmospheric behavior by processing decades of historical weather data, and this training phase consumes considerable energy. Critics who focus on AI's environmental footprint often emphasize this cost.
Rieutord, who conducted the research while at Met Éireann in Ireland and is now at the Centre National de Recherches Météorologiques in France, acknowledges the training overhead but puts it in context. Once trained, an AI model generates a forecast in seconds or minutes on relatively modest hardware. A traditional numerical weather prediction model must solve millions of differential equations on a supercomputer for each forecast cycle, a process that takes hours and incurs its full energy cost every single time it runs.
Over a year of continuous operation, the cumulative inference savings from the AI models more than offset the one-time training costs. The 21-fold efficiency advantage is a conservative estimate; depending on the specific models compared and the frequency of forecast cycles, the actual savings could be larger.
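The shape of that trade-off can be sketched with simple arithmetic. The numbers below are entirely made up for illustration (the study does not publish per-cycle figures in this form); only the structure of the calculation matters: a one-time training cost amortized against a per-cycle saving.

```python
# Illustrative break-even sketch. All figures are hypothetical placeholders,
# not values from the study; they only show the shape of the argument.
TRAIN_ENERGY_MWH = 100.0   # hypothetical one-time training cost
AI_RUN_MWH = 0.01          # hypothetical energy per AI forecast cycle
NWP_RUN_MWH = 1.0          # hypothetical energy per NWP forecast cycle
CYCLES_PER_YEAR = 4 * 365  # four forecast cycles a day

# Each cycle run on the AI model instead of the NWP model saves this much:
saving_per_cycle = NWP_RUN_MWH - AI_RUN_MWH

# The training cost is paid back after this many cycles:
break_even_cycles = TRAIN_ENERGY_MWH / saving_per_cycle

# Totals over one year of operation:
ai_total = TRAIN_ENERGY_MWH + AI_RUN_MWH * CYCLES_PER_YEAR
nwp_total = NWP_RUN_MWH * CYCLES_PER_YEAR

print(f"break-even after {break_even_cycles:.0f} cycles")
print(f"one-year energy ratio: {nwp_total / ai_total:.1f}x")
```

With these placeholder numbers the training cost is recovered in about a hundred forecast cycles, a few weeks of operation; the more cycles run per year, the further the annual ratio tilts in the AI model's favor.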
What traditional models actually consume
Traditional numerical weather prediction (NWP) models work by dividing the atmosphere into a three-dimensional grid and solving the physical equations governing fluid dynamics, thermodynamics, and radiative transfer at each grid point. The finer the grid, the more accurate the forecast, but also the more computation required. Major operational centers run these models at grid spacings of a few kilometers, requiring sustained computation on supercomputers with thousands of processors.
These runs happen multiple times daily and are supplemented by ensemble forecasts, where the model is run dozens of times with slightly different initial conditions to estimate forecast uncertainty. The energy bill is enormous and ongoing.
AI data-driven models, by contrast, learn statistical relationships between atmospheric states and make predictions through rapid matrix operations. The mathematical structure is far simpler to execute at inference time, even if the training process that produces the model is computationally demanding.
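The contrast at inference time can be made concrete with a toy example. This is not any real weather model: real systems use millions of learned parameters, but the core operation, rolling a state vector forward with matrix arithmetic, is the same and is what makes inference so cheap relative to integrating physical equations.

```python
# Toy illustration (not a real weather model): an AI forecast step at
# inference time reduces to matrix arithmetic applied repeatedly.
def matvec(W, x):
    """Plain matrix-vector product: the core operation of AI inference."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

# Hypothetical tiny "learned" transition matrix and atmospheric state.
W = [[0.5, 0.1],
     [0.2, 0.4]]
state = [1.0, 2.0]

# Roll the state forward four steps (e.g. four 6-hour steps = 24 hours).
for _ in range(4):
    state = matvec(W, state)

print(state)
```

Each step here is a handful of multiplications; a physics-based step at the same point in the forecast must numerically integrate coupled differential equations across the whole grid.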
Performance is catching up
The energy comparison is only meaningful if AI models produce useful forecasts. Recent years have seen rapid progress on this front. Several AI weather models now match or exceed the accuracy of traditional NWP models for certain forecast variables and time horizons, particularly for medium-range forecasts of a few days to two weeks. They are not yet consistently better across all variables and all conditions, but the gap is closing.
The practical implication is that meteorological services could potentially run AI models alongside traditional ones, using the AI system for routine forecasts and reserving the energy-intensive NWP runs for cases where physics-based precision is needed, such as severe weather events or situations outside the training data's range.
Limitations of the analysis
Rieutord describes the study as providing orders of magnitude rather than precise accounting. Energy consumption estimates for both AI and traditional models depend on hardware specifications, operational configurations, and assumptions about usage patterns that vary between meteorological centers. The study does not include energy costs for data preprocessing, storage, or distribution of forecasts, which are shared by both approaches.
The comparison also does not account for the energy required to generate the observational data that both systems depend on. Weather stations, radiosondes, satellites, and ocean buoys all consume energy that is external to the modeling systems themselves.
Rieutord hopes future studies will provide more precise estimates and that energy consumption reduction becomes an explicit design target for future weather models, alongside forecast accuracy. For now, the basic conclusion holds: AI weather forecasting is vastly more energy-efficient than the approach it is beginning to supplement.