TY - JOUR
T1 - Constructive neural-network learning algorithms for pattern classification
AU - Parekh, Rajesh
AU - Yang, Jihoon
AU - Honavar, Vasant
N1 - Funding Information:
Manuscript received May 6, 1997; revised October 29, 1998 and October 28, 1999. This work was supported in part by the National Science Foundation Grants IRI-9409580 and IRI-9643299. The work of V. Honavar was funded in part by grants from the National Science Foundation, the John Deere Foundation, the National Security Agency, and IBM. R. Parekh is with the Allstate Research and Planning Center, Menlo Park, CA 94025 USA (e-mail: [email protected]). J. Yang is with the Information Sciences Lab, HRL Laboratories LLC, Malibu, CA 90265 USA (e-mail: [email protected]). V. Honavar is with the Department of Computer Science, Iowa State University, Ames, IA 50011 USA (e-mail: [email protected]). Publisher Item Identifier S 1045-9227(00)02997-0.
PY - 2000
Y1 - 2000
AB - Constructive learning algorithms offer an attractive approach for the incremental construction of near-minimal neural-network architectures for pattern classification. They help overcome the need for ad hoc and often inappropriate choices of network topology in algorithms that search for suitable weights in a priori fixed network architectures. Several such algorithms have been proposed in the literature and shown to converge to zero classification errors (under certain assumptions) on tasks that involve learning a binary-to-binary mapping (i.e., classification problems involving binary-valued input attributes and two output categories). We present two constructive learning algorithms, MPyramid-real and MTiling-real, that extend the pyramid and tiling algorithms, respectively, for learning real-to-M-ary mappings (i.e., classification problems involving real-valued input attributes and multiple output classes). We prove the convergence of these algorithms and empirically demonstrate their applicability to practical pattern classification problems. Additionally, we show how the incorporation of a local pruning step can eliminate several redundant neurons from MTiling-real networks.
UR - http://www.scopus.com/inward/record.url?scp=0033742041&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0033742041&partnerID=8YFLogxK
U2 - 10.1109/72.839013
DO - 10.1109/72.839013
M3 - Article
C2 - 18249773
AN - SCOPUS:0033742041
SN - 1045-9227
VL - 11
SP - 436
EP - 451
JO - IEEE Transactions on Neural Networks
JF - IEEE Transactions on Neural Networks
IS - 2
ER -