TY - GEN
T1 - Quantum Algorithms for Sampling Log-Concave Distributions and Estimating Normalizing Constants
AU - Childs, Andrew M.
AU - Li, Tongyang
AU - Liu, Jin Peng
AU - Wang, Chunhao
AU - Zhang, Ruizhe
N1 - Publisher Copyright:
© 2022 Neural information processing systems foundation. All rights reserved.
PY - 2022
Y1 - 2022
N2 - Given a convex function f : R^d → R, the problem of sampling from the distribution π(x) ∝ e^{-f(x)} is called log-concave sampling. This task has wide applications in machine learning, physics, statistics, etc. In this work, we develop quantum algorithms for sampling log-concave distributions and for estimating their normalizing constants Z = ∫_{R^d} e^{-f(x)} dx. First, we use underdamped Langevin diffusion to develop quantum algorithms that match the query complexity (in terms of the condition number κ and dimension d) of analogous classical algorithms that use gradient (first-order) queries, even though the quantum algorithms use only evaluation (zeroth-order) queries. For estimating normalizing constants, these algorithms also achieve a quadratic speedup in the multiplicative error ∊. Second, we develop quantum Metropolis-adjusted Langevin algorithms with query complexity Õ(κ^{1/2} d) and Õ(κ^{1/2} d^{3/2}/∊) for log-concave sampling and normalizing constant estimation, respectively, achieving polynomial speedups in κ, d, ∊ over the best known classical algorithms by exploiting quantum analogs of the Monte Carlo method and quantum walks. We also prove a 1/∊^{1-o(1)} quantum lower bound for estimating normalizing constants, implying that our quantum algorithms are nearly optimal in ∊.
UR - http://www.scopus.com/inward/record.url?scp=85163145395&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85163145395&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85163145395
T3 - Advances in Neural Information Processing Systems
BT - Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022
A2 - Koyejo, S.
A2 - Mohamed, S.
A2 - Agarwal, A.
A2 - Belgrave, D.
A2 - Cho, K.
A2 - Oh, A.
PB - Neural information processing systems foundation
T2 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022
Y2 - 28 November 2022 through 9 December 2022
ER -