Time-series learning of latent-space dynamics for reduced-order model closure

Abstract

We study the performance of long short-term memory networks (LSTMs) and neural ordinary differential equations (NODEs) in learning latent-space representations of dynamical equations for an advection-dominated problem given by the viscous Burgers equation. Our formulation is devised in a nonintrusive manner with an equation-free evolution of dynamics in a reduced space, with the latter being obtained through a proper orthogonal decomposition. In addition, we leverage the sequential nature of learning for both LSTMs and NODEs to demonstrate their capability for closure in systems that are not completely resolved in the reduced space. We assess our hypothesis for two advection-dominated problems given by the viscous Burgers equation. We observe that both LSTMs and NODEs are able to reproduce the effects of the absent scales for our test cases more effectively than does intrusive dynamics evolution through a Galerkin projection. This result empirically suggests that time-series learning techniques implicitly leverage a memory kernel for coarse-grained system closure, as is suggested through the Mori–Zwanzig formalism.
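
To make the workflow concrete, below is a minimal sketch (not the authors' code) of the nonintrusive pipeline the abstract describes: compress solution snapshots with a proper orthogonal decomposition (POD), learn the latent-space time series with an LSTM, and roll the model out equation-free. The synthetic snapshot field, mode count `r`, lookback `window`, and network sizes are illustrative assumptions; a NODE variant would instead integrate a learned vector field over the coefficients.

```python
import numpy as np
import torch
import torch.nn as nn

# --- Synthetic snapshot matrix: rows = spatial points, columns = time steps.
nx, nt, r = 256, 400, 4                       # grid size, steps, retained POD modes (assumed)
x = np.linspace(0.0, 1.0, nx)[:, None]
t = np.linspace(0.0, 2.0, nt)[None, :]
snapshots = np.sin(2 * np.pi * (x - 0.3 * t)) * np.exp(-0.5 * t)  # stand-in field

# --- POD: truncated SVD of the mean-subtracted snapshots gives the spatial basis.
mean = snapshots.mean(axis=1, keepdims=True)
U, S, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
basis = U[:, :r]                              # spatial modes, shape (nx, r)
latent = (basis.T @ (snapshots - mean)).T     # modal coefficients, shape (nt, r)

# --- Windowed training pairs: short history of coefficients -> next state.
window = 10                                   # lookback length (assumed)
X = np.stack([latent[i:i + window] for i in range(nt - window)])
y = latent[window:]
X_t = torch.tensor(X, dtype=torch.float32)
y_t = torch.tensor(y, dtype=torch.float32)

class LatentLSTM(nn.Module):
    """Maps a window of POD coefficients to the next coefficient vector."""
    def __init__(self, r, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(r, hidden, batch_first=True)
        self.head = nn.Linear(hidden, r)

    def forward(self, seq):
        out, _ = self.lstm(seq)
        return self.head(out[:, -1])          # predict from the last hidden state

model = LatentLSTM(r)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X_t), y_t)
    loss.backward()
    opt.step()

# --- Equation-free rollout: feed predictions back in recursively, then lift
# --- the coefficients to physical space through the POD basis.
seq = X_t[:1].clone()
pred = []
for _ in range(50):
    nxt = model(seq).detach()
    pred.append(nxt.numpy()[0])
    seq = torch.cat([seq[:, 1:], nxt[:, None, :]], dim=1)
field = basis @ np.array(pred).T + mean       # reconstructed snapshots, (nx, 50)
```

Because the rollout never evaluates the governing equation, any closure effect has to come from the learned map itself; in this sketch that is the LSTM's hidden state carrying a history of past coefficients, which is the memory-kernel analogy the abstract draws to the Mori–Zwanzig formalism.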

Publication
Physica D: Nonlinear Phenomena
