Trainability Barriers in Low-Depth QAOA Landscapes
Authors: Rajakumar, Joel; Golden, John; Bärtschi, Andreas; Eidenbenz, Stephan
Publication year: 2024
Source: 21st ACM International Conference on Computing Frontiers (CF '24), pages 199-206, May 2024
Document type: Working Paper
DOI: 10.1145/3649153.3649204
Description: The Quantum Alternating Operator Ansatz (QAOA) is a prominent variational quantum algorithm for solving combinatorial optimization problems. Its effectiveness depends on identifying input parameters that yield high-quality solutions. However, understanding the complexity of training QAOA remains an under-explored area. Previous results have given analytical performance guarantees for a small, fixed number of parameters. At the opposite end of the spectrum, barren plateaus are likely to emerge at $\Omega(n)$ parameters for $n$ qubits. In this work, we study the difficulty of training in the intermediate regime, which is the focus of most current numerical studies and near-term hardware implementations. Through extensive numerical analysis of the quality and quantity of local minima, we argue that QAOA landscapes can exhibit a superpolynomial growth in the number of low-quality local minima even when the number of parameters scales logarithmically with $n$. This means that the common technique of gradient descent from randomly initialized parameters is doomed to fail beyond small $n$, and it emphasizes the need for good initial guesses of the optimal parameters.
Comment: minor updates
Database: arXiv
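As a minimal illustration of the kind of experiment the description above refers to (not the authors' code), the sketch below runs gradient descent from randomly initialized parameters on a toy QAOA MaxCut landscape and tallies the distinct local-optimum values the restarts converge to. The 6-node ring graph, the depth $p=3$, the finite-difference gradient, and all function names are illustrative assumptions; the paper's actual numerics are far larger and more systematic.

```python
# Illustrative sketch (assumptions noted above): count the distinct local optima
# that random-start gradient descent reaches on a small, exactly simulated QAOA
# MaxCut landscape. Maximizing the cut is equivalent to descent on the negated cost.
import itertools
import numpy as np

n = 6                                            # qubits / graph vertices (toy size)
edges = [(i, (i + 1) % n) for i in range(n)]     # ring graph (assumed instance)
p = 3                                            # QAOA depth

# Diagonal MaxCut cost operator: C|z> = C(z)|z>.
zs = np.array(list(itertools.product([0, 1], repeat=n)))
cost = np.array([sum(z[i] != z[j] for i, j in edges) for z in zs], dtype=float)

X = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli X for the mixer

def apply_mixer(state, beta):
    """Apply exp(-i * beta * X) on every qubit of the 2^n state vector."""
    rx = np.cos(beta) * np.eye(2) - 1j * np.sin(beta) * X
    psi = state.reshape([2] * n)
    for q in range(n):
        psi = np.tensordot(rx, psi, axes=([1], [q]))
        psi = np.moveaxis(psi, 0, q)
    return psi.reshape(-1)

def qaoa_energy(params):
    """Expected cut value of the p-layer QAOA state (to be maximized)."""
    gammas, betas = params[:p], params[p:]
    state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
    for gamma, beta in zip(gammas, betas):
        state = np.exp(-1j * gamma * cost) * state   # diagonal phase separator
        state = apply_mixer(state, beta)
    return float(np.real(np.vdot(state, cost * state)))

def grad_ascent(params, lr=0.05, steps=300, eps=1e-4):
    """Finite-difference gradient ascent on the energy (descent on -energy)."""
    params = params.copy()
    for _ in range(steps):
        grad = np.zeros_like(params)
        for k in range(len(params)):
            shift = np.zeros_like(params)
            shift[k] = eps
            grad[k] = (qaoa_energy(params + shift) - qaoa_energy(params - shift)) / (2 * eps)
        params += lr * grad
    return params, qaoa_energy(params)

rng = np.random.default_rng(0)
optima = []
for _ in range(20):                              # random restarts
    start = rng.uniform(0, np.pi, size=2 * p)
    _, val = grad_ascent(start)
    optima.append(round(val, 3))                 # group approximately equal values

print("distinct local-optimum values found:", sorted(set(optima)))
print("best value:", max(optima), "of true max cut", cost.max())
```

Even on this tiny assumed instance, the restarts typically scatter over several distinct optimum values, which is the qualitative behavior the paper quantifies at scale: as depth and qubit count grow, most random starts land in low-quality local optima.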