TY - JOUR

T1 - Uniform approximation rates and metric entropy of shallow neural networks

AU - Ma, Limin

AU - Siegel, Jonathan W.

AU - Xu, Jinchao

N1 - Funding Information:
We would like to thank Professors Russel Caflisch, Ronald DeVore, Weinan E, Albert Cohen, Stephan Wojtowytsch and Jason Klusowski for helpful discussions. This work was supported by the Verne M. Willaman Chair Fund at the Pennsylvania State University, and the National Science Foundation (Grant Nos. DMS-1819157 and DMS-2111387).
Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Nature Switzerland AG.

PY - 2022/9

Y1 - 2022/9

N2 - We study the approximation properties of the variation spaces corresponding to shallow neural networks with respect to the uniform norm. Specifically, we consider the spectral Barron space, which consists of the convex hull of decaying Fourier modes, and the convex hull of indicator functions of half-spaces, which corresponds to shallow neural networks with sigmoidal activation function. Up to logarithmic factors, we determine the metric entropy and nonlinear dictionary approximation rates for these spaces with respect to the uniform norm. Combined with previous results with respect to the L^2-norm, this also gives the metric entropy up to logarithmic factors with respect to any L^p-norm with 1 ≤ p ≤ ∞. In addition, we study the approximation rates for high-order spectral Barron spaces using shallow neural networks with ReLU^k activation function. Specifically, we show that for a sufficiently high-order spectral Barron space, ReLU^k networks are able to achieve an approximation rate of n^{-(k+1)} with respect to the uniform norm.

AB - We study the approximation properties of the variation spaces corresponding to shallow neural networks with respect to the uniform norm. Specifically, we consider the spectral Barron space, which consists of the convex hull of decaying Fourier modes, and the convex hull of indicator functions of half-spaces, which corresponds to shallow neural networks with sigmoidal activation function. Up to logarithmic factors, we determine the metric entropy and nonlinear dictionary approximation rates for these spaces with respect to the uniform norm. Combined with previous results with respect to the L^2-norm, this also gives the metric entropy up to logarithmic factors with respect to any L^p-norm with 1 ≤ p ≤ ∞. In addition, we study the approximation rates for high-order spectral Barron spaces using shallow neural networks with ReLU^k activation function. Specifically, we show that for a sufficiently high-order spectral Barron space, ReLU^k networks are able to achieve an approximation rate of n^{-(k+1)} with respect to the uniform norm.

UR - http://www.scopus.com/inward/record.url?scp=85134396276&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85134396276&partnerID=8YFLogxK

U2 - 10.1007/s40687-022-00346-y

DO - 10.1007/s40687-022-00346-y

M3 - Article

AN - SCOPUS:85134396276

SN - 2522-0144

VL - 9

JO - Research in Mathematical Sciences

JF - Research in Mathematical Sciences

IS - 3

M1 - 46

ER -