Abstract
We study the performance of long short-term memory networks (LSTMs) and neural ordinary differential equations (NODEs) in learning latent-space representations of dynamical equations for advection-dominated problems given by the viscous Burgers equation. Our formulation is nonintrusive and equation-free: dynamics are evolved in a reduced space obtained through a proper orthogonal decomposition. In addition, we leverage the sequential nature of learning for both LSTMs and NODEs to demonstrate their capability for closure in systems that are not completely resolved in the reduced space. We assess our hypothesis on two advection-dominated test problems given by the viscous Burgers equation. We observe that both LSTMs and NODEs reproduce the effects of the absent scales for our test cases more effectively than does intrusive dynamics evolution through a Galerkin projection. This result empirically suggests that time-series learning techniques implicitly leverage a memory kernel for coarse-grained system closure, as suggested by the Mori–Zwanzig formalism.
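To make the nonintrusive setup described above concrete, the following is a minimal sketch (not the authors' code) of the offline stage: generating snapshots of the viscous Burgers equation with a simple explicit finite-difference solver, extracting a truncated POD basis via the SVD, and projecting to obtain the latent-coefficient time series that an LSTM or NODE would then be trained to evolve. All numerical parameters here (viscosity, grid, time step, number of retained modes, initial condition) are illustrative assumptions, not values from the paper.

```python
# Sketch of the offline POD stage for a viscous Burgers problem.
# Assumed parameters throughout; only numpy is required.
import numpy as np

nx, nt = 256, 2000           # spatial points, time steps (assumed)
nu = 0.01                    # viscosity (assumed)
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 2.5e-4                  # satisfies dt <= dx**2 / (2 * nu) for explicit stability

u = np.sin(2.0 * np.pi * x)           # illustrative periodic initial condition
snapshots = np.empty((nx, nt))
for n in range(nt):
    snapshots[:, n] = u
    # periodic central differences for u_x and u_xx
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)
    uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    u = u + dt * (-u * ux + nu * uxx)  # explicit Euler step of u_t = -u u_x + nu u_xx

# POD basis: left singular vectors of the mean-subtracted snapshot matrix
ubar = snapshots.mean(axis=1, keepdims=True)
phi, s, _ = np.linalg.svd(snapshots - ubar, full_matrices=False)

r = 8                                   # retained modes; truncation creates the closure problem
a = phi[:, :r].T @ (snapshots - ubar)   # latent coefficients a_k(t), shape (r, nt)
# The map a[:, :-1] -> a[:, 1:] is the sequence the LSTM/NODE would learn to advance.
```

Truncating to r modes discards the scales whose influence on the retained coefficients constitutes the closure term; a Galerkin projection evolves only the retained modes intrusively, whereas the sequence model sees the history of a(t) and can, per the abstract's Mori–Zwanzig interpretation, implicitly act as a memory kernel.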
| Field | Value |
|---|---|
| Original language | English (US) |
| Article number | 132368 |
| Journal | Physica D: Nonlinear Phenomena |
| Volume | 405 |
| DOIs | |
| State | Published - Apr 2020 |
All Science Journal Classification (ASJC) codes
- Statistical and Nonlinear Physics
- Mathematical Physics
- Condensed Matter Physics
- Applied Mathematics