Function-Space Decoupled Diffusion for Forward and Inverse Modeling in Carbon Capture and Storage
2026-02-12 • Machine Learning
AI summary
The authors developed Fun-DDPS, a new method to better understand how fluids move underground for carbon storage, even when very little measured data is available. The method combines a diffusion model, a type of generative AI, with a neural operator that respects the physics of fluid flow. This combination fills in missing details accurately and efficiently. Tested on simulated data, it outperforms older methods substantially, especially when data is sparse, and produces realistic underground scenarios without unrealistic artifacts.
Carbon Capture and Storage, subsurface flow, inverse problems, diffusion models, neural operators, data assimilation, forward modeling, Rejection Sampling, Jensen-Shannon divergence, physics-based modeling
Authors
Xin Ju, Jiachen Yao, Anima Anandkumar, Sally M. Benson, Gege Wen
Abstract
Accurate characterization of subsurface flow is critical for Carbon Capture and Storage (CCS) but remains challenged by the ill-posed nature of inverse problems with sparse observations. We present Fun-DDPS, a generative framework that combines function-space diffusion models with differentiable neural operator surrogates for both forward and inverse modeling. Our approach learns a prior distribution over geological parameters (the geomodel) using a single-channel diffusion model, then leverages a Local Neural Operator (LNO) surrogate to provide physics-consistent guidance for cross-field conditioning on the dynamics field. This decoupling allows the diffusion prior to robustly recover missing information in parameter space, while the surrogate provides efficient gradient-based guidance for data assimilation. We demonstrate Fun-DDPS on synthetic CCS modeling datasets, achieving two key results: (1) For forward modeling with only 25% observations, Fun-DDPS achieves 7.7% relative error compared to 86.9% for standard surrogates (an 11x improvement), demonstrating its ability to handle extreme data sparsity where deterministic methods fail. (2) We provide the first rigorous validation of diffusion-based inverse solvers against asymptotically exact Rejection Sampling (RS) posteriors. Both Fun-DDPS and the joint-state baseline (Fun-DPS) achieve a Jensen-Shannon divergence below 0.06 against the ground truth. Crucially, Fun-DDPS produces physically consistent realizations free from the high-frequency artifacts observed in joint-state baselines, while achieving 4x better sample efficiency than rejection sampling.
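To make the cross-field conditioning idea concrete, the sketch below shows one DPS-style guided reverse-diffusion step under an assumed PyTorch setup. This is not the authors' released code: the function names, the toy epsilon-predictor, the convolutional stand-in for the LNO surrogate, the noise schedule values, and the guidance scale are all illustrative placeholders. The step denoises the geomodel sample, pushes the resulting clean-sample estimate through the differentiable surrogate to obtain the dynamics field, and nudges the sample along the gradient of the misfit on the observed entries.

```python
# Illustrative sketch only; all components are placeholders for the
# function-space diffusion prior and LNO surrogate described in the abstract.
import torch


def guided_reverse_step(x_t, t, eps_model, surrogate, y_obs, obs_mask,
                        alpha_bar, alpha_bar_prev, guidance_scale=0.1):
    """One DPS-style reverse step with cross-field guidance from a surrogate."""
    x_t = x_t.detach().requires_grad_(True)

    # Predict the noise and form the Tweedie estimate of the clean geomodel x0.
    eps = eps_model(x_t, t)
    x0_hat = (x_t - torch.sqrt(1.0 - alpha_bar) * eps) / torch.sqrt(alpha_bar)

    # Map the denoised geomodel to the dynamics field with the differentiable
    # surrogate, and measure the misfit only where observations exist.
    y_hat = surrogate(x0_hat)
    misfit = ((y_hat - y_obs)[obs_mask] ** 2).sum().sqrt()

    # Gradient of the misfit w.r.t. the noisy sample is the guidance term.
    grad = torch.autograd.grad(misfit, x_t)[0]

    # Deterministic DDIM-style move to the previous noise level, shifted by
    # the guidance gradient.
    x_prev = (torch.sqrt(alpha_bar_prev) * x0_hat
              + torch.sqrt(1.0 - alpha_bar_prev) * eps)
    return (x_prev - guidance_scale * grad).detach()


if __name__ == "__main__":
    # Toy stand-ins: a smoothing "denoiser" and a single convolution in place
    # of the diffusion model and LNO surrogate.
    eps_model = lambda x, t: torch.nn.functional.avg_pool2d(x, 3, 1, 1)
    surrogate = torch.nn.Conv2d(1, 1, 3, padding=1)

    x_t = torch.randn(1, 1, 64, 64)                    # noisy geomodel sample
    y_obs = torch.randn(1, 1, 64, 64)                  # dynamics observations
    obs_mask = torch.rand(1, 1, 64, 64) < 0.25         # 25% observed cells

    x_next = guided_reverse_step(x_t, torch.tensor([500]), eps_model, surrogate,
                                 y_obs, obs_mask,
                                 alpha_bar=torch.tensor(0.5),
                                 alpha_bar_prev=torch.tensor(0.6))
    print(x_next.shape)
```

In this decoupled setup only the geomodel channel is sampled by the diffusion prior; the dynamics field enters solely through the surrogate-based misfit gradient, which is what distinguishes the approach from jointly diffusing both fields.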