Sparse eigen methods by D.C. programming

Bharath K. Sriperumbudur, David A. Torres, Gert R.G. Lanckriet

Research output: Contribution to conference › Paper › peer-review

52 Scopus citations


Eigenvalue problems are rampant in machine learning and statistics, appearing in the context of classification, dimensionality reduction, etc. In this paper, we consider a cardinality-constrained variational formulation of the generalized eigenvalue problem, with sparse principal component analysis (PCA) as a special case. Using an ℓ1-norm approximation to the cardinality constraint, previous methods have proposed both convex and non-convex solutions to the sparse PCA problem. In contrast, we propose a tighter approximation that is related to the negative log-likelihood of a Student's t-distribution. The problem is then framed as a d.c. (difference of convex functions) program and is solved as a sequence of convex programs. We show that the proposed method not only explains more variance with sparse loadings on the principal directions but also scales better than other methods. We demonstrate these results on a collection of datasets of varying dimensionality, two of which are high-dimensional gene datasets where the goal is to find a few relevant genes that explain as much variance as possible.
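The general idea described in the abstract can be sketched in a few lines of NumPy. This is not the authors' algorithm: the function name, step sizes, and stopping rule below are illustrative assumptions. It replaces the cardinality penalty with the concave surrogate Σᵢ log(1 + |xᵢ|/ε) (the Student's-t-related approximation), then linearizes that concave part at each iterate, which yields a convex reweighted-ℓ1 subproblem; here each subproblem is handled approximately by one proximal gradient step (soft-thresholding) followed by projection to the unit sphere.

```python
import numpy as np

def sparse_pca_dc(A, rho=0.5, eps=1e-3, n_iter=50):
    """Hypothetical sketch of a d.c./reweighted-l1 iteration for sparse PCA.

    Maximizes x^T A x - rho * sum_i log(1 + |x_i|/eps) over the unit sphere,
    a concave surrogate for the cardinality constraint. At each iteration the
    concave penalty is linearized at the current iterate (majorization-
    minimization), leaving a weighted-l1 problem that is approximated by a
    single gradient step plus soft-thresholding. Not the paper's exact method.
    """
    x = np.linalg.eigh(A)[1][:, -1]        # dense leading eigenvector as init
    step = 1.0 / np.linalg.norm(A, 2)      # step size from the spectral norm
    for _ in range(n_iter):
        w = rho / (np.abs(x) + eps)        # linearized log penalty -> weights
        g = x + step * (A @ x)             # gradient ascent step on x^T A x / 2
        x = np.sign(g) * np.maximum(np.abs(g) - step * w, 0.0)  # soft-threshold
        nrm = np.linalg.norm(x)
        if nrm > 0:
            x /= nrm                       # project back to the unit sphere
    return x
```

Because the weights grow as entries shrink, small loadings are driven exactly to zero while large ones are only lightly penalized, which is why this surrogate is tighter than a plain ℓ1 penalty.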

Original language: English (US)
Number of pages: 8
State: Published - 2007
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvallis, OR, United States
Duration: Jun 20 2007 - Jun 24 2007


Other: 24th International Conference on Machine Learning, ICML 2007
Country/Territory: United States
City: Corvallis, OR

All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Networks and Communications


