TY - JOUR
T1 - Generalized alternating direction method of multipliers
T2 - new theoretical insights and applications
AU - Fang, Ethan X.
AU - He, Bingsheng
AU - Liu, Han
AU - Yuan, Xiaoming
N1 - Funding Information:
Xiaoming Yuan: This author was supported by the Faculty Research Grant from HKBU: FRG2/13-14/061 and the General Research Fund from Hong Kong Research Grants Council: 203613.
Funding Information:
Bingsheng He: This author was supported by the NSFC Grant 11471156.
Publisher Copyright:
© 2015, Springer-Verlag Berlin Heidelberg and The Mathematical Programming Society.
PY - 2015/6/18
Y1 - 2015/6/18
N2 - Recently, the alternating direction method of multipliers (ADMM) has received intensive attention from a broad spectrum of areas. The generalized ADMM (GADMM) proposed by Eckstein and Bertsekas is an efficient and simple acceleration scheme of ADMM. In this paper, we take a deeper look at the linearized version of GADMM, in which one of its subproblems is approximated by a linearization strategy. This linearized version is particularly efficient for a number of applications arising from different areas. Theoretically, we establish the worst-case $\mathcal{O}(1/k)$ convergence rate measured by the iteration complexity (where $k$ denotes the iteration counter) in both the ergodic and nonergodic senses for the linearized version of GADMM. Numerically, we demonstrate the efficiency of this linearized version of GADMM on several recent and important applications in statistical learning. Matlab code packages for these applications are also provided.
AB - Recently, the alternating direction method of multipliers (ADMM) has received intensive attention from a broad spectrum of areas. The generalized ADMM (GADMM) proposed by Eckstein and Bertsekas is an efficient and simple acceleration scheme of ADMM. In this paper, we take a deeper look at the linearized version of GADMM, in which one of its subproblems is approximated by a linearization strategy. This linearized version is particularly efficient for a number of applications arising from different areas. Theoretically, we establish the worst-case $\mathcal{O}(1/k)$ convergence rate measured by the iteration complexity (where $k$ denotes the iteration counter) in both the ergodic and nonergodic senses for the linearized version of GADMM. Numerically, we demonstrate the efficiency of this linearized version of GADMM on several recent and important applications in statistical learning. Matlab code packages for these applications are also provided.
UR - http://www.scopus.com/inward/record.url?scp=84929327064&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84929327064&partnerID=8YFLogxK
U2 - 10.1007/s12532-015-0078-2
DO - 10.1007/s12532-015-0078-2
M3 - Article
AN - SCOPUS:84929327064
SN - 1867-2949
VL - 7
SP - 149
EP - 187
JO - Mathematical Programming Computation
JF - Mathematical Programming Computation
IS - 2
ER -