A weight initialization based on the linear product structure for neural networks

Qipin Chen, Wenrui Hao, Juncai He

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

Weight initialization plays an important role in training neural networks and affects a wide range of deep learning applications. Various weight initialization strategies have been developed for different activation functions and network architectures. These algorithms are based on minimizing the variance of the parameters between layers and may still fail when networks are deep, e.g., due to dying ReLU. To address this challenge, we study neural networks from a nonlinear computation point of view and propose a novel weight initialization strategy based on the linear product structure (LPS) of neural networks. The proposed strategy is derived from a polynomial approximation of the activation functions, using theory from numerical algebraic geometry that guarantees finding all local minima. We also provide a theoretical analysis showing that the LPS initialization has a lower probability of dying ReLU compared with existing initialization strategies. Finally, we test the LPS initialization algorithm on both fully connected and convolutional neural networks to demonstrate its feasibility, efficiency, and robustness on public datasets.
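The LPS formulas themselves come from the paper's polynomial approximation and are not reproduced in this record. As a hedged illustration of the failure mode the abstract targets, the following minimal NumPy sketch (the function name `dead_relu_fraction` and all parameter choices are hypothetical, not from the paper) estimates how often ReLU units are inactive for every input sample in a deep fully connected network under standard He- or Xavier-scaled Gaussian initialization:

```python
import numpy as np

def dead_relu_fraction(depth=50, width=64, n_samples=256, scale="he", seed=0):
    """Estimate the fraction of ReLU units whose output is zero for ALL
    input samples after forward-propagating random data through a deep,
    randomly initialized fully connected network (bias-free, illustrative).
    """
    rng = np.random.default_rng(seed)
    h = rng.standard_normal((n_samples, width))        # random inputs
    gain = 2.0 if scale == "he" else 1.0               # He vs. Xavier variance
    dead, total = 0, 0
    for _ in range(depth):
        # Gaussian weights with variance gain / fan_in
        W = rng.standard_normal((width, width)) * np.sqrt(gain / width)
        h = np.maximum(h @ W.T, 0.0)                   # affine map, then ReLU
        # a unit is "dead" here if it is inactive for every sample
        dead += int(np.sum(np.all(h <= 0.0, axis=0)))
        total += width
    return dead / total

if __name__ == "__main__":
    for scale in ("he", "xavier"):
        print(scale, dead_relu_fraction(scale=scale))
```

This sketch only probes the phenomenon at initialization; it does not implement the paper's LPS strategy, whose lower dying-ReLU probability would have to be checked against the formulas derived in the article itself.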

Original language: English (US)
Article number: 126722
Journal: Applied Mathematics and Computation
Volume: 415
DOIs
State: Published - Feb 15 2022

All Science Journal Classification (ASJC) codes

  • Computational Mathematics
  • Applied Mathematics
