Abstract
Stochastic gradient methods are scalable for solving large-scale optimization problems that involve empirical expectations of loss functions. Existing results mainly apply to optimization problems where the objectives are one- or two-level expectations. In this paper, we consider the multilevel composition optimization problem that involves compositions of multilevel component functions and nested expectations over a random path. This problem finds applications in risk-averse optimization and sequential planning. We propose a class of multilevel stochastic gradient methods that are motivated by the method of multi-timescale stochastic approximation. First, we propose a basic T-level stochastic compositional gradient algorithm. Then we develop accelerated multilevel stochastic gradient methods by using an extrapolation-interpolation scheme to take advantage of the smoothness of individual component functions. When all component functions are smooth, we show that the convergence rate improves to O(n^{-4/(7+T)}) for general objectives and O(n^{-4/(3+T)}) for strongly convex objectives. We also provide almost sure convergence and rate of convergence results for nonconvex problems. The proposed methods and theoretical results are validated using numerical experiments.
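For orientation, the two-timescale idea behind the basic algorithm can be illustrated in the simplest T = 2 case, min_x f1(E[f2(x; ξ)]): a fast-moving auxiliary variable tracks the inner expectation while the decision variable takes chain-rule gradient steps using that estimate. The sketch below is a hypothetical toy instance, not the paper's algorithm or its accelerated extrapolation-interpolation variant; the problem data (f1, f2) and the step-size schedules are illustrative assumptions.

```python
# Minimal sketch of a two-level stochastic compositional gradient update,
# the T = 2 special case of min_x f1(E[f2(x; xi)]).  The instance, the
# step-size schedules, and all names here are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
d = 5                              # decision-variable dimension

def f2(x, xi):                     # inner stochastic map: f2(x; xi) = x + xi
    return x + xi                  # E[f2(x; xi)] = x for zero-mean xi

def f2_jac(x, xi):                 # Jacobian of f2 w.r.t. x (identity here)
    return np.eye(d)

def f1_grad(y):                    # outer function f1(y) = 0.5 * ||y||^2,
    return y                       # so the objective is 0.5*||x||^2 + const

x = rng.normal(size=d)             # decision variable (slow timescale)
y = np.zeros(d)                    # estimate of E[f2(x; xi)] (fast timescale)

for k in range(1, 20001):
    alpha = 1.0 / k ** 0.75        # slow step size for x (assumed schedule)
    beta = 1.0 / k ** 0.5          # faster step size for y (assumed schedule)
    xi1 = rng.normal(size=d)       # fresh sample for the inner-value estimate
    xi2 = rng.normal(size=d)       # independent sample for the Jacobian
    y = (1 - beta) * y + beta * f2(x, xi1)          # track inner expectation
    x = x - alpha * f2_jac(x, xi2).T @ f1_grad(y)   # chain-rule gradient step

print("||x|| after training:", np.linalg.norm(x))   # should be near 0
```

The separation of timescales (beta decaying more slowly than alpha) is what lets the auxiliary variable y settle near the inner expectation before x moves appreciably; the T-level methods in the paper maintain one such tracking variable per composition level.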
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 616-659 |
| Number of pages | 44 |
| Journal | SIAM Journal on Optimization |
| Volume | 29 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2019 |
All Science Journal Classification (ASJC) codes
- Software
- Theoretical Computer Science