TY - JOUR
T1 - Differentiable turbulence
T2 - Closure as a partial differential equation constrained optimization
AU - Shankar, Varun
AU - Chakraborty, Dibyajyoti
AU - Viswanathan, Venkatasubramanian
AU - Maulik, Romit
N1 - Publisher Copyright:
© 2025 American Physical Society.
PY - 2025/2
Y1 - 2025/2
N2 - Deep learning is increasingly seen as a promising pathway to improving the accuracy of subgrid-scale (SGS) turbulence closure models for large-eddy simulations (LESs). We leverage the concept of differentiable turbulence, whereby an end-to-end differentiable solver is used in combination with physics-inspired choices of deep learning architectures to learn highly effective and versatile SGS models for two-dimensional turbulent flow. We perform an in-depth analysis of the inductive biases in the chosen architectures, finding that the inclusion of small-scale nonlocal features is most critical to effective SGS modeling, while large-scale features can improve pointwise accuracy of the a posteriori solution field. The velocity gradient tensor on the LES grid can be mapped directly to the SGS stress via decomposition of the inputs and outputs into isotropic, deviatoric, and antisymmetric components. We see that the model can generalize to a variety of flow configurations, including higher and lower Reynolds numbers and different forcing conditions. We also demonstrate the use of ensemble uncertainty quantification for characterizing the impact of data-driven closures on time-evolving resolved flow fields. We show that the differentiable physics paradigm is more successful than offline, a priori learning, and that hybrid solver-in-the-loop approaches to deep learning offer an ideal balance between computational efficiency, accuracy, and generalization. Our experiments provide physics-based recommendations for deep-learning-based SGS modeling toward generalizable turbulence closures.
AB - Deep learning is increasingly seen as a promising pathway to improving the accuracy of subgrid-scale (SGS) turbulence closure models for large-eddy simulations (LESs). We leverage the concept of differentiable turbulence, whereby an end-to-end differentiable solver is used in combination with physics-inspired choices of deep learning architectures to learn highly effective and versatile SGS models for two-dimensional turbulent flow. We perform an in-depth analysis of the inductive biases in the chosen architectures, finding that the inclusion of small-scale nonlocal features is most critical to effective SGS modeling, while large-scale features can improve pointwise accuracy of the a posteriori solution field. The velocity gradient tensor on the LES grid can be mapped directly to the SGS stress via decomposition of the inputs and outputs into isotropic, deviatoric, and antisymmetric components. We see that the model can generalize to a variety of flow configurations, including higher and lower Reynolds numbers and different forcing conditions. We also demonstrate the use of ensemble uncertainty quantification for characterizing the impact of data-driven closures on time-evolving resolved flow fields. We show that the differentiable physics paradigm is more successful than offline, a priori learning, and that hybrid solver-in-the-loop approaches to deep learning offer an ideal balance between computational efficiency, accuracy, and generalization. Our experiments provide physics-based recommendations for deep-learning-based SGS modeling toward generalizable turbulence closures.
UR - http://www.scopus.com/inward/record.url?scp=85219335560&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85219335560&partnerID=8YFLogxK
U2 - 10.1103/PhysRevFluids.10.024605
DO - 10.1103/PhysRevFluids.10.024605
M3 - Article
AN - SCOPUS:85219335560
SN - 2469-990X
VL - 10
JO - Physical Review Fluids
JF - Physical Review Fluids
IS - 2
M1 - 024605
ER -