TY - JOUR

T1 - The Mori–Zwanzig formulation of deep learning

AU - Venturi, Daniele

AU - Li, Xiantao

N1 - Funding Information:
Dr. Venturi was partially supported by the U.S. Air Force Office of Scientific Research Grant FA9550-20-1-0174 and by the U.S. Army Research Office Grant W911NF1810309. Dr. Li was supported by the NSF Grant DMS-1953120.
Publisher Copyright:
© 2023, The Author(s).

PY - 2023/6

Y1 - 2023/6

N2 - We develop a new formulation of deep learning based on the Mori–Zwanzig (MZ) formalism of irreversible statistical mechanics. The new formulation is built upon the well-known duality between deep neural networks and discrete dynamical systems, and it allows us to directly propagate quantities of interest (conditional expectations and probability density functions) forward and backward through the network by means of exact linear operator equations. Such new equations can be used as a starting point to develop new effective parameterizations of deep neural networks and provide a new framework to study deep learning via operator-theoretic methods. The proposed MZ formulation of deep learning naturally introduces a new concept, i.e., the memory of the neural network, which plays a fundamental role in low-dimensional modeling and parameterization. By using the theory of contraction mappings, we develop sufficient conditions for the memory of the neural network to decay with the number of layers. This allows us to rigorously transform deep networks into shallow ones, e.g., by reducing the number of neurons per layer (using projection operators), or by reducing the total number of layers (using the decay property of the memory operator).

UR - http://www.scopus.com/inward/record.url?scp=85160077239&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85160077239&partnerID=8YFLogxK

U2 - 10.1007/s40687-023-00390-2

DO - 10.1007/s40687-023-00390-2

M3 - Article

AN - SCOPUS:85160077239

SN - 2522-0144

VL - 10

JO - Research in Mathematical Sciences

JF - Research in Mathematical Sciences

IS - 2

M1 - 23

ER -