Margin-maximizing feature elimination methods for linear and nonlinear kernel-based discriminant functions

Research output: Contribution to journal › Article › peer-review

45 Scopus citations

Abstract

Feature selection for classification in high-dimensional spaces can improve generalization, reduce classifier complexity, and identify important, discriminating feature "markers." For support vector machine (SVM) classification, a widely used technique is recursive feature elimination (RFE). We demonstrate that RFE is not consistent with margin maximization, which is central to the SVM learning approach. We thus propose explicit margin-based feature elimination (MFE) for SVMs and demonstrate both improved margin and improved generalization compared with RFE. Moreover, for the case of a nonlinear kernel, we show that RFE assumes the squared 2-norm of the weight vector is strictly decreasing as features are eliminated. We demonstrate that this does not hold for the Gaussian kernel and that, consequently, RFE may give poor results in this case; MFE for nonlinear kernels gives better margin and generalization. We also present an extension that achieves further margin gains by optimizing only two degrees of freedom, the hyperplane's intercept and its squared 2-norm, with the weight vector orientation fixed. We finally introduce an extension that allows margin slackness. We compare against several alternatives, including RFE and a linear programming method that embeds feature selection within the classifier design. On high-dimensional gene microarray data sets, University of California at Irvine (UCI) repository data sets, and Alzheimer's disease brain image data, MFE methods give promising results.
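The abstract's central contrast, RFE's weight-magnitude elimination criterion versus an explicitly margin-based criterion, can be illustrated for a linear SVM. The sketch below assumes ±1 labels, a decision function f(x) = w·x + b, and takes the margin as the minimum normalized value of y_i f(x_i) over the training set; it is an illustrative reading of margin-based elimination for the linear case, not the paper's exact MFE procedure, and the helper names (`rfe_rank_one`, `mfe_rank_one`) are hypothetical.

```python
# Illustrative sketch only: contrasts RFE's weight-magnitude criterion with
# an explicitly margin-based elimination criterion for a linear SVM.
# Assumes y contains +/-1 labels; this is not the paper's exact MFE algorithm.
import numpy as np
from sklearn.svm import SVC

def rfe_rank_one(X, y):
    """RFE step: eliminate the feature with the smallest squared weight."""
    svm = SVC(kernel="linear", C=1.0).fit(X, y)
    w = svm.coef_.ravel()
    return int(np.argmin(w ** 2))

def mfe_rank_one(X, y):
    """Margin-based step: eliminate the feature whose removal leaves the
    largest minimum normalized margin, holding the trained (w, b) fixed."""
    svm = SVC(kernel="linear", C=1.0).fit(X, y)
    w, b = svm.coef_.ravel(), svm.intercept_[0]
    best_j, best_margin = 0, -np.inf
    for j in range(X.shape[1]):
        w_j = w.copy()
        w_j[j] = 0.0                      # drop feature j from the discriminant
        norm = np.linalg.norm(w_j)
        if norm == 0.0:
            continue
        margin = np.min(y * (X @ w_j + b)) / norm
        if margin > best_margin:
            best_j, best_margin = j, margin
    return best_j
```

In the RFE sketch the eliminated feature is the one the trained weight vector relies on least; in the margin-based sketch it is the one whose removal degrades the fixed discriminant's minimum margin the least, which is the distinction the abstract draws.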

Original language: English (US)
Article number: 5419999
Pages (from-to): 701-717
Number of pages: 17
Journal: IEEE Transactions on Neural Networks
Volume: 21
Issue number: 5
DOIs
State: Published - May 2010

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
