TY - JOUR
T1 - Random coordinate descent
T2 - A simple alternative for optimizing parameterized quantum circuits
AU - Ding, Zhiyan
AU - Ko, Taehee
AU - Yao, Jiahao
AU - Lin, Lin
AU - Li, Xiantao
N1 - Publisher Copyright:
© 2024 authors. Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.
PY - 2024/6
Y1 - 2024/6
N2 - Variational quantum algorithms rely on the optimization of parameterized quantum circuits in noisy settings. The back-propagation procedure commonly used in classical machine learning is not directly applicable in this setting due to the collapse of quantum states after measurement. Gradient estimation therefore constitutes a significant overhead in gradient-based optimization of such quantum circuits. This paper introduces a random coordinate descent algorithm as a practical and easy-to-implement alternative to the full gradient descent algorithm. The algorithm requires only one partial derivative at each iteration. Motivated by the behavior of measurement noise in the practical optimization of parameterized quantum circuits, this paper presents an optimization problem setting that is amenable to analysis. Under this setting, the random coordinate descent algorithm exhibits the same level of stochastic stability as the full gradient approach, making it equally resilient to noise. The complexity of the random coordinate descent method is generally no worse than that of gradient descent and can be much better for various quantum optimization problems with anisotropic Lipschitz constants. Theoretical analysis and extensive numerical experiments validate our findings.
AB - Variational quantum algorithms rely on the optimization of parameterized quantum circuits in noisy settings. The back-propagation procedure commonly used in classical machine learning is not directly applicable in this setting due to the collapse of quantum states after measurement. Gradient estimation therefore constitutes a significant overhead in gradient-based optimization of such quantum circuits. This paper introduces a random coordinate descent algorithm as a practical and easy-to-implement alternative to the full gradient descent algorithm. The algorithm requires only one partial derivative at each iteration. Motivated by the behavior of measurement noise in the practical optimization of parameterized quantum circuits, this paper presents an optimization problem setting that is amenable to analysis. Under this setting, the random coordinate descent algorithm exhibits the same level of stochastic stability as the full gradient approach, making it equally resilient to noise. The complexity of the random coordinate descent method is generally no worse than that of gradient descent and can be much better for various quantum optimization problems with anisotropic Lipschitz constants. Theoretical analysis and extensive numerical experiments validate our findings.
UR - http://www.scopus.com/inward/record.url?scp=85198236039&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85198236039&partnerID=8YFLogxK
U2 - 10.1103/PhysRevResearch.6.033029
DO - 10.1103/PhysRevResearch.6.033029
M3 - Article
AN - SCOPUS:85198236039
SN - 2643-1564
VL - 6
JO - Physical Review Research
JF - Physical Review Research
IS - 3
M1 - 033029
ER -