TY - GEN
T1 - Non-parametric Greedy Optimization of Parametric Quantum Circuits
AU - Phalak, Koustubh
AU - Ghosh, Swaroop
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The use of Quantum Neural Networks (QNNs), which are analogous to classical neural networks, has greatly increased in the past decade owing to growing interest in the field of Quantum Machine Learning (QML). A QNN consists of three major components: (i) a data loading/encoding circuit, (ii) a Parametric Quantum Circuit (PQC), and (iii) measurement operations. Under ideal circumstances the PQC of a QNN trains well; however, this may not be the case when training on quantum hardware due to the presence of different kinds of noise. Deeper QNNs tend to degrade more in performance than shallower networks. This work aims to reduce the depth and gate count of PQCs by replacing parametric gates with approximate fixed non-parametric representations. We propose a greedy algorithm that minimizes a distance metric based on the unitary transformation matrices of the original parametric gate and the new set of non-parametric gates. From this greedy optimization followed by a few epochs of re-training, we observe roughly a 14% reduction in depth and a 48% reduction in gate count at the cost of a 3.33% reduction in inference accuracy. Similar results are observed for a different dataset with a different PQC structure.
AB - The use of Quantum Neural Networks (QNNs), which are analogous to classical neural networks, has greatly increased in the past decade owing to growing interest in the field of Quantum Machine Learning (QML). A QNN consists of three major components: (i) a data loading/encoding circuit, (ii) a Parametric Quantum Circuit (PQC), and (iii) measurement operations. Under ideal circumstances the PQC of a QNN trains well; however, this may not be the case when training on quantum hardware due to the presence of different kinds of noise. Deeper QNNs tend to degrade more in performance than shallower networks. This work aims to reduce the depth and gate count of PQCs by replacing parametric gates with approximate fixed non-parametric representations. We propose a greedy algorithm that minimizes a distance metric based on the unitary transformation matrices of the original parametric gate and the new set of non-parametric gates. From this greedy optimization followed by a few epochs of re-training, we observe roughly a 14% reduction in depth and a 48% reduction in gate count at the cost of a 3.33% reduction in inference accuracy. Similar results are observed for a different dataset with a different PQC structure.
UR - https://www.scopus.com/pages/publications/85194098094
U2 - 10.1109/ISQED60706.2024.10528696
DO - 10.1109/ISQED60706.2024.10528696
M3 - Conference contribution
AN - SCOPUS:85194098094
T3 - Proceedings - International Symposium on Quality Electronic Design, ISQED
BT - Proceedings of the 25th International Symposium on Quality Electronic Design, ISQED 2024
PB - IEEE Computer Society
T2 - 25th International Symposium on Quality Electronic Design, ISQED 2024
Y2 - 3 April 2024 through 5 April 2024
ER -