Sharp Bounds on the Approximation Rates, Metric Entropy, and n-Widths of Shallow Neural Networks

Jonathan W. Siegel, Jinchao Xu

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

In this article, we study the approximation properties of the variation spaces corresponding to shallow neural networks with a variety of activation functions. We introduce two main tools for estimating the metric entropy, approximation rates, and n-widths of these spaces. First, we introduce the notion of a smoothly parameterized dictionary and give upper bounds on the nonlinear approximation rates, metric entropy, and n-widths of its absolute convex hull. These upper bounds depend upon the order of smoothness of the parameterization. Applied to dictionaries of ridge functions corresponding to shallow neural networks, this result improves upon existing bounds in many cases. Second, we provide a method for lower bounding the metric entropy and n-widths of variation spaces which contain certain classes of ridge functions. This yields sharp lower bounds on the L²-approximation rates, metric entropy, and n-widths of the variation spaces corresponding to neural networks with a range of important activation functions, including ReLUᵏ activations and sigmoidal activations with bounded variation.
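For context, the display below sketches in LaTeX the standard definition of the variation space referred to in the abstract and the flavor of the sharp ReLUᵏ rate. The notation (the dictionary P_k^d, the bias interval [c_1, c_2], and Σ_n for shallow networks with n neurons) is a conventional choice assumed here, and the precise statements, domains, and constants are those given in the article itself.

% A sketch, assuming the standard conventions: the ReLU^k ridge dictionary
% on a bounded domain Omega \subset R^d is taken to be
\[
  \mathbb{P}_k^d \;=\; \bigl\{\, \sigma_k(\omega \cdot x + b) \;:\; \omega \in S^{d-1},\; b \in [c_1, c_2] \,\bigr\},
  \qquad \sigma_k(t) = \max(0, t)^k,
\]
% and the variation norm is the gauge of the closed absolute convex hull of a dictionary D:
\[
  \|f\|_{\mathcal{K}_1(\mathbb{D})} \;=\; \inf\bigl\{\, c > 0 \;:\; f \in c\, \overline{\mathrm{conv}}(\pm \mathbb{D}) \,\bigr\},
  \qquad \mathcal{K}_1(\mathbb{D}) = \bigl\{\, f \;:\; \|f\|_{\mathcal{K}_1(\mathbb{D})} < \infty \,\bigr\}.
\]
% For the ReLU^k dictionary the sharp rates take the form
% (Sigma_n(P_k^d) = shallow networks with n neurons drawn from the dictionary)
\[
  \inf_{f_n \in \Sigma_n(\mathbb{P}_k^d)} \| f - f_n \|_{L^2(\Omega)}
  \;\lesssim\; n^{-\frac{1}{2} - \frac{2k+1}{2d}}\, \| f \|_{\mathcal{K}_1(\mathbb{P}_k^d)},
\]
% with lower bounds of the same order (up to constants) for the metric entropy
% and n-widths of the unit ball of K_1(P_k^d) in L^2(Omega).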

Original language: English (US)
Pages (from-to): 481-537
Number of pages: 57
Journal: Foundations of Computational Mathematics
Volume: 24
Issue number: 2
DOIs
State: Published - Apr 2024

All Science Journal Classification (ASJC) codes

  • Analysis
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics
