Robustness of Quantum Algorithms for Nonconvex Optimization

Authors: Weiyuan Gong, Chenyi Zhang, Tongyang Li

ICLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, we systematically study quantum algorithms for finding an ϵ-approximate second-order stationary point (ϵ-SOSP) of a d-dimensional nonconvex function, a fundamental problem in nonconvex optimization, with noisy zeroth- or first-order oracles as inputs. We first prove that, up to noise of O(ϵ^10/d^5), perturbed accelerated gradient descent equipped with quantum gradient estimation takes O(log d/ϵ^1.75) quantum queries to find an ϵ-SOSP. We then prove that standard perturbed gradient descent is robust to noise of O(ϵ^6/d^4) and O(ϵ/d^(0.5+ζ)) for any ζ > 0 on the zeroth- and first-order oracles, respectively, which provides a quantum algorithm with poly-logarithmic query complexity. Furthermore, we propose a stochastic gradient descent algorithm using quantum mean estimation on the Gaussian smoothing of noisy oracles, which is robust to O(ϵ^1.5/d) and O(ϵ/d) noise on the zeroth- and first-order oracles, respectively. The quantum algorithm takes O(d^2.5/ϵ^3.5) and O(d^2/ϵ^3) queries to the two oracles, giving a polynomial speedup over the classical counterparts. As a complement, we characterize the domains where quantum algorithms can find an ϵ-SOSP with a poly-logarithmic, polynomial, or exponential number of queries in d, or where the problem is information-theoretically unsolvable even with an infinite number of queries. In addition, we prove an Ω(ϵ^(12/7)) lower bound on ϵ for any randomized classical or quantum algorithm to find an ϵ-SOSP using either noisy zeroth- or first-order oracles.
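To make the Gaussian-smoothing idea in the abstract concrete, the sketch below estimates the gradient of the smoothed function f_σ(x) = E_u[f(x+u)], u ~ N(0, σ²I), from a noisy zeroth-order oracle via Monte Carlo sampling. This is a classical, illustrative analogue only: the paper's quantum algorithm replaces the sample average with quantum mean estimation to obtain its speedup, and the function name and parameter values here are hypothetical, not taken from the paper.

```python
import numpy as np

def gaussian_smoothed_gradient(noisy_f, x, sigma=0.1, n_samples=20000, seed=None):
    """Estimate grad f_sigma(x), the gradient of the Gaussian smoothing of a
    noisy zeroth-order oracle `noisy_f`, using the identity
        grad f_sigma(x) = E_u[ u * (f(x+u) - f(x)) / sigma^2 ],  u ~ N(0, sigma^2 I).
    Classical Monte Carlo sketch; the paper's quantum algorithm performs this
    mean estimation quantumly for a polynomial query speedup.
    """
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    u = rng.normal(scale=sigma, size=(n_samples, d))      # Gaussian perturbations
    deltas = np.array([noisy_f(x + ui) - noisy_f(x) for ui in u])
    return (u * deltas[:, None]).mean(axis=0) / sigma**2  # averaged estimator
```

For a quadratic f with small oracle noise, the estimate concentrates around the true gradient as the sample count grows; the smoothing radius σ trades bias against sensitivity to oracle noise.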
Researcher Affiliation | Academia | Weiyuan Gong, School of Engineering and Applied Science, Harvard University, Allston, MA 02134, EMAIL; Chenyi Zhang, Computer Science Department, Stanford University, Stanford, CA 94305, EMAIL; Tongyang Li, Center on Frontiers of Computing Studies and School of Computer Science, Peking University, Beijing, 100871, EMAIL
Pseudocode | Yes | Our quantum perturbed accelerated gradient descent (PAGD) algorithm (see Algorithm 1 in the supplementary material for details) replaces the gradient queries in perturbed accelerated gradient descent (Zhang & Li, 2021; Zhang & Gu, 2022) with Jordan's gradient estimation.
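The classical skeleton underlying PAGD can be sketched as plain perturbed gradient descent: run gradient steps, and when the gradient is small (a possible saddle point), add a random perturbation to escape. This is a simplified illustrative sketch, not the paper's Algorithm 1: it omits the momentum/acceleration of PAGD, and the gradient oracle stands in for what the quantum algorithm answers via Jordan's gradient estimation. All hyperparameter names and values are assumed for illustration.

```python
import numpy as np

def perturbed_gradient_descent(grad_oracle, x0, eta=0.05, r=0.1,
                               g_thresh=0.01, t_noise=20, n_iters=2000, seed=0):
    """Simplified perturbed gradient descent for escaping saddle points.
    `grad_oracle` is a (possibly noisy) first-order oracle; in the paper's
    quantum PAGD, gradient queries are instead answered by Jordan's quantum
    gradient estimation, and an acceleration step is added.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    last_perturb = -1  # iteration of the most recent perturbation, if any
    for t in range(n_iters):
        g = grad_oracle(x)
        if np.linalg.norm(g) <= g_thresh and t - last_perturb > t_noise:
            if last_perturb >= 0:
                # Gradient is still small well after a perturbation:
                # accept x as an approximate second-order stationary point.
                return x
            # Small gradient: possibly a saddle; perturb randomly to escape.
            x = x + r * rng.normal(size=x.shape) / np.sqrt(x.size)
            last_perturb = t
            continue
        x = x - eta * g  # standard gradient descent step
    return x
```

On a test function such as f(w) = w₁² + w₂⁴ - w₂², plain gradient descent started at w₂ = 0 stalls at the strict saddle (0, 0), while the perturbation lets the iterate slide off toward one of the minima at w₂ = ±1/√2.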
Open Source Code | No | The paper does not contain any explicit statement about releasing code or a link to a code repository.
Open Datasets | No | The paper is theoretical and focuses on algorithm design, proofs, and lower bounds for quantum algorithms in nonconvex optimization. It does not conduct empirical studies that would involve specific datasets or provide access information for any.
Dataset Splits | No | The paper is theoretical and does not conduct experiments on datasets; it therefore does not specify any dataset splits.
Hardware Specification | No | The paper is theoretical and focuses on algorithms, proofs, and lower bounds. It does not describe any experimental hardware used.
Software Dependencies | No | The paper is theoretical and focuses on algorithms and proofs. It does not mention any specific software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and focuses on algorithms and proofs. It does not provide specific experimental setup details such as hyperparameters or training configurations.