Abstract
Variable selection via penalized likelihood has received considerable attention recently. Penalized likelihood estimators with properly chosen penalty functions possess desirable statistical properties. In practice, optimizing the penalized likelihood function is often challenging because the objective function may be nondifferentiable and/or nonconcave. Existing algorithms such as the local quadratic approximation (LQA) algorithm share a drawback of backward selection: once a variable is deleted, it is essentially left out of the final model. We propose the iterative conditional maximization (ICM) algorithm to address this drawback. It exploits the characteristics of the nonconcave penalized likelihood and enjoys fast convergence. Three simulation studies, in linear, logistic, and Poisson regression, together with one real data analysis, are conducted to assess the performance of the ICM algorithm.
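To illustrate the backward-selection drawback the abstract attributes to LQA, the sketch below (not the chapter's own code; function names, the SCAD tuning constant a = 3.7, and the drop threshold are illustrative assumptions) applies the local quadratic approximation to SCAD-penalized least squares. Each iteration replaces the penalty by a quadratic approximation and solves a ridge-type linear system; a coefficient whose current value falls below a small threshold is deleted from the active set and never re-enters, which is exactly the behavior ICM is designed to avoid.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    # Derivative p'_lambda(t) of the SCAD penalty for t >= 0 (a = 3.7 is the
    # conventional choice); it is lam near zero and vanishes for t > a*lam.
    return lam * ((t <= lam) + np.maximum(a * lam - t, 0.0) / ((a - 1) * lam) * (t > lam))

def lqa_penalized_ls(X, y, lam, tol=1e-8, max_iter=100, drop_thresh=1e-6):
    """LQA for SCAD-penalized least squares (illustrative sketch).

    At step k the penalty is approximated by a quadratic around the current
    estimate, so the update solves (X'X + n*D_k) beta = X'y with
    D_k = diag(p'(|beta_j|)/|beta_j|). Coefficients near zero get an
    exploding weight, are dropped, and once dropped stay out of the model.
    """
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start from OLS
    active = np.ones(p, dtype=bool)
    for _ in range(max_iter):
        active &= np.abs(beta) > drop_thresh      # once deleted, never revived
        beta_new = np.zeros(p)
        if active.any():
            Xa = X[:, active]
            ba = np.abs(beta[active])
            D = np.diag(n * scad_deriv(ba, lam) / ba)  # quadratic-approx weights
            beta_new[active] = np.linalg.solve(Xa.T @ Xa + D, Xa.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

On simulated data with a sparse true coefficient vector, the large coefficients survive essentially unpenalized (the SCAD derivative vanishes for large values), while the noise coefficients are shrunk across iterations until they cross the threshold and are permanently removed.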
Original language | English (US) |
---|---|
Title of host publication | Nonparametric Statistics and Mixture Models |
Subtitle of host publication | A Festschrift in Honor of Thomas P. Hettmansperger |
Publisher | World Scientific Publishing Co. |
Pages | 336-351 |
Number of pages | 16 |
ISBN (Electronic) | 9789814340564 |
ISBN (Print) | 9814340553, 9789814340557 |
DOIs | |
State | Published - Jan 1 2011 |
All Science Journal Classification (ASJC) codes
- General Mathematics