## Abstract

We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, and each subproblem is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004] and [Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e., | ||x_k||_1 - ||x^*||_1 | ≤ ε, after Õ(ε^{-2/3}) iterations. SPA is able to work with an l_1, l_2, or l_∞ penalty on the infeasibility, and it can be easily extended to solve the relaxed recovery problem min{||x||_1 : ||Ax - b||_2 ≤ θ}.
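To illustrate the general idea of recovering a sparse signal through a sequence of penalized subproblems, here is a minimal sketch. It is not the paper's SPA: it substitutes a plain quadratic penalty with a geometrically increasing weight `lam` (a hypothetical schedule), and solves each subproblem with a FISTA-style accelerated proximal gradient method (soft-thresholding), standing in for Nesterov's optimal algorithm applied to the paper's smoothed penalty.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def penalized_subproblem(A, b, lam, x0, iters=1000):
    # Accelerated proximal gradient (FISTA) for the subproblem
    #   min_x ||x||_1 + (lam/2) * ||A x - b||_2^2
    L = lam * np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth part
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        grad = lam * (A.T @ (A @ y - b))
        x_new = soft_threshold(y - grad / L, 1.0 / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def penalty_sketch(A, b, lam0=1.0, growth=10.0, stages=5):
    # Solve a sequence of penalized subproblems with increasing penalty
    # weight, warm-starting each stage from the previous solution.
    x = np.zeros(A.shape[1])
    lam = lam0
    for _ in range(stages):
        x = penalized_subproblem(A, b, lam, x)
        lam *= growth
    return x

# Tiny sparse-recovery instance: 3-sparse signal, 40 Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 42]] = [1.5, -2.0, 0.7]
b = A @ x_true
x_hat = penalty_sketch(A, b)
```

As the penalty weight grows, the subproblem minimizers approach the basis pursuit solution, and with these dimensions (Gaussian A, sparsity 3) that solution coincides with the planted signal.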

| Original language | English (US) |
|---|---|
| Pages (from-to) | 287-313 |
| Number of pages | 27 |
| Journal | SIAM Journal on Optimization |
| Volume | 21 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2011 |

## All Science Journal Classification (ASJC) codes

- Software
- Theoretical Computer Science
- Applied Mathematics