TY - JOUR
T1 - Time-series learning of latent-space dynamics for reduced-order model closure
AU - Maulik, Romit
AU - Mohan, Arvind
AU - Lusch, Bethany
AU - Madireddy, Sandeep
AU - Balaprakash, Prasanna
AU - Livescu, Daniel
N1 - Publisher Copyright:
© 2020 Elsevier B.V.
PY - 2020/4
Y1 - 2020/4
AB - We study the performance of long short-term memory networks (LSTMs) and neural ordinary differential equations (NODEs) in learning latent-space representations of the dynamics of advection-dominated problems given by the viscous Burgers equation. Our formulation is devised in a nonintrusive manner, with an equation-free evolution of the dynamics in a reduced space obtained through a proper orthogonal decomposition. In addition, we leverage the sequential nature of learning in both LSTMs and NODEs to demonstrate their capability for closure in systems that are not completely resolved in the reduced space. We assess this hypothesis for two advection-dominated test problems given by the viscous Burgers equation. We observe that both LSTMs and NODEs reproduce the effects of the unresolved scales in our test cases more effectively than an intrusive evolution of the dynamics through a Galerkin projection. This result empirically suggests that time-series learning techniques implicitly leverage a memory kernel for coarse-grained system closure, as suggested by the Mori–Zwanzig formalism.
UR - http://www.scopus.com/inward/record.url?scp=85079159915&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85079159915&partnerID=8YFLogxK
U2 - 10.1016/j.physd.2020.132368
DO - 10.1016/j.physd.2020.132368
M3 - Article
AN - SCOPUS:85079159915
SN - 0167-2789
VL - 405
JO - Physica D: Nonlinear Phenomena
JF - Physica D: Nonlinear Phenomena
M1 - 132368
ER -