Mosaic Preserves Spectral Fidelity in ML Weather Models, Closing Gap for Extreme Event Reliability
Mosaic counters spectral degradation in ML weather models via learned probabilistic perturbations and block-sparse attention, delivering calibrated ensembles with near-perfect spectral alignment at coarse resolution and outperforming systems trained on finer-resolution data for extremes-critical variables.
Advances in preserving spectral fidelity for ML weather forecasting address a critical gap in operational reliability for extreme-event prediction and climate modeling. The arXiv preprint (https://arxiv.org/abs/2604.16429) details Mosaic, which counters deterministic training on ensemble means via learned functional perturbations and replaces compressive encoders with block-sparse attention on native-resolution grids, achieving linear-cost long-range dependencies at 1.5° resolution with 214M parameters. Individual ensemble members show near-perfect spectral alignment across frequencies, matching or exceeding models trained on 6× finer data for upper-air variables, while generating a 24-member ensemble out to 12 days in approximately one minute on a single H100 GPU.
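The linear-cost claim follows from restricting each query to a fixed-size neighborhood of keys. The exact mask Mosaic uses (window shapes, any global tokens) is not given in this summary, so the local-block layout below is an assumption; it is a minimal sketch showing why cost scales as O(N·block) rather than O(N²):

```python
import numpy as np

def dense_attention(q, k, v):
    """Reference O(N^2) softmax attention over all N positions."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def block_sparse_attention(q, k, v, block=8):
    """Each query attends only within its local block of grid points,
    so total cost is O(N * block) instead of O(N^2)."""
    n = q.shape[0]
    out = np.empty_like(v)
    for start in range(0, n, block):
        sl = slice(start, min(start + block, n))
        out[sl] = dense_attention(q[sl], k[sl], v[sl])
    return out

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((32, 16)) for _ in range(3))
out = block_sparse_attention(q, k, v, block=8)
```

With `block` equal to the sequence length the result coincides with dense attention, which is a convenient sanity check for any sparse mask implementation.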
Prior models such as GraphCast (https://arxiv.org/abs/2212.12794) demonstrated skillful RMSE but exhibited progressive spectral roll-off and under-dispersion in ensembles, problems traced to mean-targeted training objectives also present in Pangu-Weather. The Mosaic abstract understates how spectral degradation directly impairs tail-risk calibration for cyclones, heatwaves, and atmospheric rivers; operational meteorology has long known that loss of high-frequency power distorts extreme-value statistics, a defect rarely quantified in ML weather papers yet central to humanitarian early-warning systems.
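The spectral roll-off described above is directly measurable: binning 2-D FFT power by wavenumber gives a radial spectrum, and the ratio of a forecast's spectrum to a reference spectrum drops below 1 at high wavenumbers when small scales are smoothed away. This is a minimal numpy diagnostic under that framing, not the metric used in any of the cited papers:

```python
import numpy as np

def radial_power_spectrum(field):
    """Isotropic 1-D power spectrum of a 2-D field, binned by total wavenumber."""
    ny, nx = field.shape
    power = np.abs(np.fft.fft2(field)) ** 2
    ky = np.fft.fftfreq(ny) * ny
    kx = np.fft.fftfreq(nx) * nx
    k = np.sqrt(ky[:, None] ** 2 + kx[None, :] ** 2)
    nbins = min(ny, nx) // 2
    idx = np.clip(np.rint(k).astype(int), 0, nbins)
    spec = np.bincount(idx.ravel(), weights=power.ravel(), minlength=nbins + 1)
    cnt = np.bincount(idx.ravel(), minlength=nbins + 1)
    return spec[1:nbins] / np.maximum(cnt[1:nbins], 1)  # drop DC and overflow bins

# Roll-off demo: a 5-point smoothed field loses high-wavenumber power,
# so its spectrum ratio against the reference falls well below 1 at high k.
rng = np.random.default_rng(0)
ref = rng.standard_normal((128, 128))
blurred = (ref + np.roll(ref, 1, 0) + np.roll(ref, -1, 0)
           + np.roll(ref, 1, 1) + np.roll(ref, -1, 1)) / 5.0
ratio = radial_power_spectrum(blurred) / radial_power_spectrum(ref)
```

A spectrally faithful ensemble member would keep this ratio near 1 across all resolved wavenumbers, which is what the "near-perfect spectral alignment" claim amounts to.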
Synthesizing these results with FourCastNet's adaptive Fourier neural operators (https://arxiv.org/abs/2202.11214), which first injected spectral inductive bias but still required post-processing to restore energy cascades, reveals Mosaic's block-sparse mechanism as a hardware-aware evolution that maintains native grids without Fourier aliasing. This applied advance surfaces stakes largely absent from research feeds: trustworthy AI ensembles at climate timescales, where fidelity across resolved frequencies governs realistic variability in decadal projections used by IPCC-class modeling centers.
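For contrast with Mosaic's native-grid attention, FourCastNet-style mixing operates in Fourier space. The toy sketch below uses a single per-mode complex weight standing in for AFNO's block-diagonal MLPs with soft-thresholding (an assumption for brevity); it illustrates both the mechanism and how truncating modes removes small scales, the aliasing-adjacent behavior the paragraph above refers to:

```python
import numpy as np

def spectral_mix(x, weights):
    """AFNO-flavoured token mixing sketch: forward FFT, apply a per-mode
    complex weight, inverse FFT back to the grid."""
    xh = np.fft.rfft2(x)
    return np.fft.irfft2(xh * weights, s=x.shape)

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 64))

w_identity = np.ones((64, 33))     # unit weights leave the field unchanged
low_pass = np.zeros((64, 33))
low_pass[:8, :8] = 1.0             # keep low positive-ky modes...
low_pass[-7:, :8] = 1.0            # ...and their negative-ky mirrors
filtered = spectral_mix(x, low_pass)
```

Zeroing all but the lowest modes leaves a smooth field with a fraction of the original variance, which is why compressive spectral encoders need post-processing to restore the energy cascade.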
AXIOM: Mosaic's spectral fidelity breakthrough enables reliable probabilistic forecasts of extremes at scale, bridging a key gap between research ML models and operational meteorology needed for both daily warnings and long-term climate adaptation.
Sources (3)
- [1] Primary Source (https://arxiv.org/abs/2604.16429)
- [2] GraphCast: Learning skillful medium-range global weather forecasting (https://arxiv.org/abs/2212.12794)
- [3] FourCastNet: A Global Data-driven High-resolution Weather Model using Adaptive Fourier Neural Operators (https://arxiv.org/abs/2202.11214)
Corrections (1)
Mosaic generates a 24-member 10-day forecast in under 12 seconds on one H100 GPU
The MOSAIC paper states it generates a 24-member ensemble forecasting 12 days ahead in approximately 1 minute (60 seconds) on a single H100 GPU at 1.5° resolution. It notes ML weather models generally generate 10-day forecasts in under 60 seconds on a single GPU. The specific claim of under 12 seconds for a 24-member 10-day forecast does not appear and contradicts the reported ~60s timing.