Accelerated stochastic block coordinate gradient descent for sparsity constrained nonconvex optimization

Jinghui Chen, Quanquan Gu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

18 Scopus citations

Abstract

We propose an accelerated stochastic block coordinate descent algorithm for nonconvex optimization under a sparsity constraint in the high-dimensional regime. The core of our algorithm is to leverage both the stochastic partial gradient and the full partial gradient restricted to each coordinate block to accelerate convergence. We prove that the algorithm converges to the unknown true parameter at a linear rate, up to the statistical error of the underlying model. Experiments on both synthetic and real datasets back up our theory.
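The abstract's core idea — combining a stochastic partial gradient with a full partial gradient restricted to a coordinate block, then enforcing sparsity — can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a least-squares loss, SVRG-style variance reduction, and hard thresholding as the sparsity projection; all names (`asbcd_sketch`, `hard_threshold`) and hyperparameters are illustrative.

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def asbcd_sketch(X, y, s, n_blocks=4, n_epochs=20, inner_iters=50,
                 eta=0.05, seed=0):
    """Illustrative variance-reduced stochastic block coordinate descent
    with hard thresholding, for sparsity-constrained least squares."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    blocks = np.array_split(np.arange(d), n_blocks)
    beta = np.zeros(d)
    for _ in range(n_epochs):
        # Full gradient at the epoch snapshot (least-squares loss).
        beta_snap = beta.copy()
        full_grad = X.T @ (X @ beta_snap - y) / n
        for _ in range(inner_iters):
            i = rng.integers(n)                    # random sample
            blk = blocks[rng.integers(n_blocks)]   # random coordinate block
            # Stochastic partial gradient on this block, corrected by the
            # snapshot's full partial gradient (variance reduction).
            g_i = X[i, blk] * (X[i] @ beta - y[i])
            g_snap = X[i, blk] * (X[i] @ beta_snap - y[i])
            v = g_i - g_snap + full_grad[blk]
            beta[blk] -= eta * v
            beta = hard_threshold(beta, s)  # project onto sparsity constraint
    return beta
```

In this sketch the variance-reduced direction `v` has the same block-restricted expectation as the full partial gradient, which is what allows a constant step size and, in the paper's analysis, linear convergence up to statistical error.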

Original language: English (US)
Title of host publication: 32nd Conference on Uncertainty in Artificial Intelligence 2016, UAI 2016
Editors: Dominik Janzing, Alexander Ihler
Publisher: Association for Uncertainty in Artificial Intelligence (AUAI)
Pages: 132-141
Number of pages: 10
ISBN (Electronic): 9781510827806
State: Published - 2016
Event: 32nd Conference on Uncertainty in Artificial Intelligence 2016, UAI 2016 - Jersey City, United States
Duration: Jun 25 2016 - Jun 29 2016

Publication series

Name: 32nd Conference on Uncertainty in Artificial Intelligence 2016, UAI 2016

Other

Other: 32nd Conference on Uncertainty in Artificial Intelligence 2016, UAI 2016
Country/Territory: United States
City: Jersey City
Period: 6/25/16 - 6/29/16

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
