A Smoothed Augmented Lagrangian Framework for Convex Optimization with Nonsmooth Constraints

Peixuan Zhang, Vinayak V. Shanbhag, Ethan X. Fang

Research output: Contribution to journal › Article › peer-review

Abstract

Augmented Lagrangian (AL) methods have proven remarkably useful in solving optimization problems with complicated constraints. The last decade has seen the development of overall complexity guarantees for inexact AL variants. Yet, a crucial gap persists in addressing nonsmooth convex constraints. To this end, we present a smoothed AL framework in which nonsmooth terms are progressively smoothed with a smoothing parameter η_k. The resulting AL subproblems are η_k-smooth, allowing us to leverage accelerated schemes. By a careful selection of the inexactness level ϵ_k (for inexact subproblem resolution), the penalty parameter ρ_k, and the smoothing parameter η_k at epoch k, we derive rate and complexity guarantees of Õ(1/ε^{3/2}) and Õ(1/ε) in the convex and strongly convex regimes, respectively, for computing an ε-optimal solution when ρ_k increases at a geometric rate, a significant improvement over the best available guarantees for AL schemes for convex programs with nonsmooth constraints. Analogous guarantees are developed for settings with ρ_k = ρ as well as η_k = η. Preliminary numerics on a fused Lasso problem display promise.
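To make the epoch structure of such a scheme concrete, the following is a minimal sketch of an inexact smoothed AL loop on a fused-lasso-style instance, min 0.5‖Ax − b‖² subject to ‖Dx‖₁ ≤ τ. The Huber smoothing of the ℓ1 constraint, the schedules for η_k, ρ_k, and ϵ_k, and the backtracking gradient inner solver are all assumptions made for illustration; they stand in for, and do not reproduce, the paper's actual algorithm and its accelerated subproblem solvers.

```python
# Illustrative inexact smoothed augmented Lagrangian loop (assumed setup, not the paper's algorithm):
#   min 0.5*||A x - b||^2   s.t.   ||D x||_1 <= tau,
# with the nonsmooth constraint Huber-smoothed using parameter eta_k at epoch k.
import numpy as np

rng = np.random.default_rng(0)
m, n = 80, 50
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
D = (np.eye(n) - np.eye(n, k=1))[:-1]   # first-difference operator (fused-lasso term)
tau = 5.0

def huber(t, eta):        # eta-smooth surrogate of |t|
    return np.where(np.abs(t) <= eta, t ** 2 / (2 * eta), np.abs(t) - eta / 2)

def g_smooth(x, eta):     # smoothed constraint  g_eta(x) = sum_i huber((Dx)_i) - tau
    return huber(D @ x, eta).sum() - tau

def al_value(x, lam, rho, eta):   # eta-smooth AL for the inequality constraint g_eta(x) <= 0
    s = max(0.0, lam / rho + g_smooth(x, eta))
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.5 * rho * s ** 2 - lam ** 2 / (2 * rho)

def al_grad(x, lam, rho, eta):    # gradient of the smoothed AL in x
    s = max(0.0, lam / rho + g_smooth(x, eta))
    grad_g = D.T @ np.clip(D @ x / eta, -1.0, 1.0)   # gradient of the Huber-smoothed constraint
    return A.T @ (A @ x - b) + rho * s * grad_g

x, lam, rho, eta = np.zeros(n), 0.0, 1.0, 1.0
for k in range(25):                       # outer epochs
    eps_k = 1.0 / (k + 1) ** 2            # illustrative inexactness schedule
    for _ in range(2000):                 # inexact inner solve to tolerance eps_k
        g = al_grad(x, lam, rho, eta)
        if np.linalg.norm(g) <= eps_k:
            break
        step, val = 1.0, al_value(x, lam, rho, eta)
        while al_value(x - step * g, lam, rho, eta) > val - 0.5 * step * (g @ g):
            step *= 0.5                   # Armijo backtracking (plain gradient steps for simplicity)
        x = x - step * g
    lam = max(0.0, lam + rho * g_smooth(x, eta))   # multiplier update
    rho *= 1.5                            # geometric penalty growth
    eta *= 0.7                            # progressively tighter smoothing

print("constraint residual ||Dx||_1 - tau:", float(np.abs(D @ x).sum() - tau))
```

The sketch only illustrates the interplay of the three parameters: a tighter smoothing η_k sharpens the constraint model, a growing ρ_k enforces feasibility, and a shrinking ϵ_k controls how accurately each η_k-smooth subproblem is solved.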

Original language: English (US)
Article number: 46
Journal: Journal of Scientific Computing
Volume: 104
Issue number: 2
DOIs
State: Published - Aug 2025

All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
  • Numerical Analysis
  • General Engineering
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics
