Combined Learning and Use for a Mixture Model Equivalent to the RBF Classifier

David J. Miller, Hasan S. Uyar

Research output: Contribution to journal › Article › peer-review



We show that the decision function of a radial basis function (RBF) classifier is equivalent in form to the Bayes-optimal discriminant associated with a special kind of mixture-based statistical model. The relevant mixture model is a type of mixture-of-experts model for which class labels, like continuous-valued features, are assumed to have been generated randomly, conditional on the mixture component of origin. The new interpretation shows that RBF classifiers effectively assume a probability model, which, moreover, is easily determined given the designed RBF. This interpretation also suggests a statistical learning objective as an alternative to standard methods for designing the RBF-equivalent models. The statistical objective is especially useful for incorporating unlabeled data to enhance learning. Finally, it is observed that any new data to classify are simply additional unlabeled data. Thus, we suggest a combined learning and use paradigm, to be invoked whenever there are new data to classify.
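The equivalence described in the abstract can be illustrated with a minimal sketch. Assuming isotropic Gaussian basis functions, each RBF unit j is read as a mixture component with mixing weight alpha_j, center mu_j, and shared width sigma, plus a label distribution beta[j, c] = P(class c | component j); the classifier's output is then the Bayes posterior P(class | x). All names and parameters here are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_class_posteriors(X, mu, sigma, alpha, beta):
    """Return P(class c | x) = sum_j P(j | x) * beta[j, c].

    X     : (n, d) samples
    mu    : (m, d) basis/component centers
    sigma : shared isotropic width (illustrative simplification)
    alpha : (m,) mixing weights
    beta  : (m, k) per-component label probabilities
    """
    # Squared distance from every sample to every center.
    d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)
    # Unnormalized responsibilities: alpha_j * exp(-||x - mu_j||^2 / (2 sigma^2)).
    resp = alpha * np.exp(-d2 / (2.0 * sigma ** 2))
    resp /= resp.sum(axis=1, keepdims=True)  # P(j | x)
    return resp @ beta                       # P(c | x), rows sum to 1

# Toy example: two components, two classes.
mu = np.array([[0.0, 0.0], [3.0, 3.0]])
alpha = np.array([0.5, 0.5])
beta = np.array([[0.9, 0.1], [0.2, 0.8]])
post = rbf_class_posteriors(np.array([[0.1, -0.2], [2.9, 3.1]]),
                            mu, 1.0, alpha, beta)
```

Because the output is a genuine posterior, the same expression supports the combined learning-and-use idea: points awaiting classification can be folded back in as unlabeled data under the mixture likelihood.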

Original language: English (US)
Pages (from-to): 281-293
Number of pages: 13
Journal: Neural Computation
Issue number: 2
State: Published - Feb 15 1998

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
