Engineering · 8 min read · April 28, 2026

Learning turbulence closures via nudging sidesteps solver backprop

A data-assimilation-inspired approach trains neural network turbulence models on DNS data without embedding them in solvers, reducing computational cost and improving stability.

Source: arxiv/cs.LG · Ashwin Suriyanarayanan, Melissa Adrian, Dibyajyoti Chakraborty, Romit Maulik · open original ↗

Nudging-based training lets neural turbulence closures learn from DNS data without costly solver backpropagation or stability issues.

  • A-posteriori learning embeds neural closures in solvers but requires expensive backpropagation through the solver and can suffer numerical instability.
  • A-priori learning uses DNS data directly but assumes filter properties that don't match actual numerical discretization effects.
  • Continuous data assimilation (nudging) treats DNS as sparse observations and trains closures offline without modifying the solver.
  • The nudging approach avoids adjoints, reduces computational burden, and maintains long-term stability in LES deployments.
  • The learned model generalizes across different numerical schemes and temporal discretizations better than traditional closure models.
  • No need to embed neural network inside solver, lowering barrier to adoption in existing simulation codes.
  • Addresses mismatch between assumed filter properties and real numerical discretization errors that destabilize standard approaches.

Frequently asked

  • What is nudging, and how does it help train closures? Nudging, or continuous data assimilation, is a technique that treats high-fidelity DNS data as sparse observations and uses a forcing term to guide a coarse-grid model toward those observations. In this work, it allows a neural network closure to learn the required subgrid stress without being embedded inside the LES solver, reducing computational cost and avoiding stability issues from filter mismatch.
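To make the forcing-term idea concrete, here is a minimal NumPy sketch of nudging-based closure learning, not the paper's implementation. The toy "DNS" dynamics, the relaxation rate `mu`, and the least-squares fit (standing in for the neural network) are all illustrative assumptions: the coarse model is missing a cubic damping term, and the recorded nudging forcing serves as the offline training target for the closure.

```python
import numpy as np

# Toy "DNS": full dynamics du/dt = -u**3 (a cubic damping term).
# The coarse model omits that term entirely, so the closure must learn it.
def f_true(u):
    return -u**3

def f_coarse(u):
    return np.zeros_like(u)

dt, steps = 1e-3, 5000
mu = 50.0                      # nudging relaxation rate (assumed value)
u_dns = np.array([1.5])        # high-fidelity reference state
u_les = np.array([1.5])        # nudged coarse-model state

states, forcings = [], []
for _ in range(steps):
    u_dns = u_dns + dt * f_true(u_dns)           # advance the reference
    g = mu * (u_dns - u_les)                     # nudging (forcing) term
    u_les = u_les + dt * (f_coarse(u_les) + g)   # nudged coarse solve
    states.append(u_les.copy())
    forcings.append(g.copy())

# Offline training: the recorded forcing g approximates the missing
# subgrid term, so fit closure(u) ~ g with no solver in the loop.
# A two-feature least-squares fit stands in for the neural network.
X = np.concatenate(states)
y = np.concatenate(forcings)
A = np.stack([X, X**3], axis=1)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # the u**3 coefficient lands close to -1, the omitted term
```

Because the nudged trajectory tracks the DNS closely, the forcing `g` converges to the dynamics the coarse model is missing, and the closure is trained purely offline, with no adjoint or backpropagation through the solver.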
