Abstract
A global optimization technique is introduced for statistical classifier design to minimize the probability of classification error. The method, which is based on ideas from information theory and analogies to statistical physics, is inherently probabilistic. During the design phase, data are assigned to classes in probability, with the probability distributions chosen to maximize entropy subject to a constraint on the expected classification error. This entropy maximization problem is seen to be equivalent to a free energy minimization, motivating a deterministic annealing approach to minimizing the misclassification cost. The method is applicable to a variety of classifier structures, including nearest prototype, radial basis function, and multilayer perceptron-based classifiers. On standard benchmark examples, the method applied to nearest prototype classifier design achieves performance improvements over both the learning vector quantizer and multilayer perceptron classifiers designed by standard back-propagation. Substantial performance gains over learning vector quantization are achieved for complicated mixture examples with significant class overlap.
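The sketch below illustrates, for the nearest-prototype case, the kind of annealed probabilistic design the abstract describes: training points are assigned to prototypes in probability through a temperature-parameterized Gibbs distribution, and the prototypes are moved by gradient descent on the expected misclassification cost while the temperature is gradually lowered. This is a minimal illustration only; the squared-Euclidean distance, the geometric cooling schedule, the learning rate, and all function names (`gibbs_assignments`, `anneal_prototypes`, etc.) are assumptions made for the example and are not taken from the paper.

```python
import numpy as np


def gibbs_assignments(X, prototypes, T):
    """Assign each sample to each prototype in probability (Gibbs distribution at temperature T)."""
    # Squared Euclidean distances, shape (n_samples, n_prototypes).
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    logits = -d2 / T
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)


def expected_error(P, proto_labels, y):
    """Expected misclassification cost under the probabilistic assignments."""
    wrong = (proto_labels[None, :] != y[:, None]).astype(float)
    return float((P * wrong).sum(axis=1).mean())


def anneal_prototypes(X, y, prototypes, proto_labels,
                      T0=5.0, T_min=1e-3, cool=0.9, lr=0.1, inner_steps=50):
    """Deterministic annealing: lower T while descending the expected-error surface."""
    prototypes = prototypes.copy()
    T = T0
    while T > T_min:
        for _ in range(inner_steps):
            P = gibbs_assignments(X, prototypes, T)
            wrong = (proto_labels[None, :] != y[:, None]).astype(float)
            err = (P * wrong).sum(axis=1, keepdims=True)  # per-sample expected error
            # Gradient of the expected error w.r.t. each prototype (chain rule through the
            # Gibbs weights): wrongly labeled prototypes are pushed away from nearby samples,
            # correctly labeled ones are pulled toward them.
            w = (2.0 / T) * P * (wrong - err)             # shape (n_samples, n_prototypes)
            grad = (w[:, :, None] * (X[:, None, :] - prototypes[None, :, :])).mean(axis=0)
            prototypes -= lr * grad
        T *= cool
    return prototypes


# Toy usage on two overlapping Gaussian classes (illustrative data, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)), rng.normal(1.5, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
prototypes = rng.normal(0.75, 1.0, (4, 2))   # initial prototype locations
proto_labels = np.array([0, 0, 1, 1])        # fixed class labels for the prototypes
prototypes = anneal_prototypes(X, y, prototypes, proto_labels)
pred = proto_labels[((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)]
print("training error:", float((pred != y).mean()))
```

At high temperature every prototype shares responsibility for every sample, which smooths the cost surface; as T falls, the assignments harden toward the nearest-prototype rule used at classification time.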
| Original language | English (US) |
| --- | --- |
| Pages | 58-66 |
| Number of pages | 9 |
| State | Published - Jan 1 1995 |
| Event | Proceedings of the 5th IEEE Workshop on Neural Networks for Signal Processing (NNSP'95) - Cambridge, MA, USA; Duration: Aug 31 1995 → Sep 2 1995 |
Other

| Other | Proceedings of the 5th IEEE Workshop on Neural Networks for Signal Processing (NNSP'95) |
| --- | --- |
| City | Cambridge, MA, USA |
| Period | 8/31/95 → 9/2/95 |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Software
- Electrical and Electronic Engineering