TY - JOUR
T1 - Unsupervised learning of parsimonious mixtures on large spaces with integrated feature and component selection
AU - Graham, Michael W.
AU - Miller, David J.
N1 - Funding Information:
Manuscript received November 24, 2004; revised May 17, 2005. This work was supported in part by National Science Foundation Award IIS-0082214 and by a DoD graduate fellowship. The associate editor coordinating the review of this manuscript and approving it for publication was Prof. Sergios Theodoridis.
PY - 2006/4
Y1 - 2006/4
N2 - Estimating the number of components (the order) in a mixture model is often addressed using criteria such as the Bayesian information criterion (BIC) and minimum message length. However, when the feature space is very large, these criteria may grossly underestimate the order. Here, it is suggested that this failure is not mainly attributable to the criterion (e.g., BIC), but rather to the lack of "structure" in standard mixtures, which trade off data fitness and model complexity only by varying the order. This paper proposes mixtures with a richer set of tradeoffs. The proposed model allows each component its own informative feature subset, with all other features explained by a common model shared by all components. Parameter sharing greatly reduces complexity at a given order. Since the space of these parsimonious modeling solutions is vast, it is searched efficiently by integrating component and feature selection within generalized expectation-maximization (GEM) learning of the mixture parameters. The quality of the resulting (unsupervised) solutions is evaluated using both classification error and test-set data likelihood. On text data, the proposed multinomial version, learned without labeled examples, without knowing the "true" number of topics, and without feature preprocessing, compares quite favorably with both alternative unsupervised methods and a supervised naive Bayes classifier. A Gaussian version compares favorably with a recent method that introduces "feature saliency" in mixtures.
AB - Estimating the number of components (the order) in a mixture model is often addressed using criteria such as the Bayesian information criterion (BIC) and minimum message length. However, when the feature space is very large, these criteria may grossly underestimate the order. Here, it is suggested that this failure is not mainly attributable to the criterion (e.g., BIC), but rather to the lack of "structure" in standard mixtures, which trade off data fitness and model complexity only by varying the order. This paper proposes mixtures with a richer set of tradeoffs. The proposed model allows each component its own informative feature subset, with all other features explained by a common model shared by all components. Parameter sharing greatly reduces complexity at a given order. Since the space of these parsimonious modeling solutions is vast, it is searched efficiently by integrating component and feature selection within generalized expectation-maximization (GEM) learning of the mixture parameters. The quality of the resulting (unsupervised) solutions is evaluated using both classification error and test-set data likelihood. On text data, the proposed multinomial version, learned without labeled examples, without knowing the "true" number of topics, and without feature preprocessing, compares quite favorably with both alternative unsupervised methods and a supervised naive Bayes classifier. A Gaussian version compares favorably with a recent method that introduces "feature saliency" in mixtures.
UR - http://www.scopus.com/inward/record.url?scp=33645283097&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33645283097&partnerID=8YFLogxK
U2 - 10.1109/TSP.2006.870586
DO - 10.1109/TSP.2006.870586
M3 - Article
AN - SCOPUS:33645283097
SN - 1053-587X
VL - 54
SP - 1289
EP - 1303
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 4
ER -