Classical and Quantum Speedups for Non-Convex Optimization via Energy Conserving Descent

2026-04-14 · Machine Learning

AI summary

The authors study an optimization method called Energy Conserving Descent (ECD), which aims to find the global best solution to difficult non-convex problems more reliably than traditional gradient descent. They analyze two versions: a stochastic one (sECD) that injects noise while keeping the total energy balanced, and a quantum version (qECD) inspired by quantum physics. For a simple test case with two valleys (a double well), they show both versions reach the global best spot exponentially faster than the corresponding gradient descent methods, and the quantum version is faster still when the barrier between the valleys is tall. This work lays the groundwork for using quantum computers to solve tough optimization problems.

Energy Conserving Descent (ECD), Stochastic dynamics, Quantum Hamiltonian, Global optimization, Gradient descent, Hitting time, Double-well potential, Quantum algorithm, Hamiltonian simulation, Exponential speedup
Authors
Yihang Sun, Huaijin Wang, Patrick Hayden, Jose Blanchet
Abstract
The Energy Conserving Descent (ECD) algorithm was recently proposed (De Luca & Silverstein, 2022) as a global non-convex optimization method. Unlike gradient descent, appropriately configured ECD dynamics escape strict local minima and converge to a global minimum, making it appealing for machine learning optimization. We present the first analytical study of ECD; this first installment focuses on the one-dimensional setting. We formalize a stochastic ECD dynamics (sECD) with energy-preserving noise, as well as a quantum analog of the ECD Hamiltonian (qECD), providing the foundation for a quantum algorithm through Hamiltonian simulation. For positive double-well objectives, we compute the expected hitting time from a local to the global minimum. We prove that both sECD and qECD yield exponential speedups over their respective gradient descent baselines: stochastic gradient descent and its quantization. For objectives with tall barriers, qECD achieves a further speedup over sECD.
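To make the core idea concrete, below is a minimal sketch, not the authors' algorithm, of energy-conserving dynamics on a one-dimensional double-well objective. It assumes a simple separable Hamiltonian H(x, p) = p²/2 + f(x) integrated with a leapfrog step, plus an illustrative "energy-preserving noise" move that randomly flips the momentum sign (in one dimension, the constant-energy shell at a given position is just {+|p|, −|p|}, so a sign flip is exactly resampling on that shell). The potential f(x) = (x² − 1)² + 0.3x, the step size, and all function names are assumptions for illustration; the paper's ECD Hamiltonian and its sECD/qECD constructions differ in detail, and in particular true ECD concentrates near low objective values by construction, whereas this toy simply tracks the best point visited.

```python
import numpy as np

# Illustrative double-well objective with a slight tilt, so the two
# minima have different depths and there is a unique global minimum
# near x = -1 (the well near x = +1 is a strict local minimum).
def f(x):
    return (x**2 - 1.0)**2 + 0.3 * x

def grad_f(x):
    return 4.0 * x * (x**2 - 1.0) + 0.3

def leapfrog_step(x, p, dt):
    """One leapfrog step for H(x, p) = p^2/2 + f(x).

    Leapfrog is symplectic, so H is conserved up to small integration
    error; the trajectory can therefore cross any barrier lower than
    its initial energy, unlike gradient descent, which stalls at the
    first local minimum it reaches.
    """
    p = p - 0.5 * dt * grad_f(x)
    x = x + dt * p
    p = p - 0.5 * dt * grad_f(x)
    return x, p

def run(x0, p0, dt=1e-3, n_steps=200_000, flip_prob=1e-3, seed=0):
    """Energy-conserving dynamics with energy-preserving noise.

    The random momentum sign flip leaves H(x, p) unchanged, a toy
    stand-in for the energy-preserving noise of the paper's sECD.
    """
    rng = np.random.default_rng(seed)
    x, p = x0, p0
    best_x, best_f = x, f(x)
    for _ in range(n_steps):
        x, p = leapfrog_step(x, p, dt)
        if rng.random() < flip_prob:
            p = -p  # energy-preserving noise: H(x, p) = H(x, -p)
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

if __name__ == "__main__":
    # Start in the shallower (local) well near x = +1 with enough
    # kinetic energy to clear the barrier near x = 0.
    x_star, f_star = run(x0=1.0, p0=1.5)
    print(f"best x ~ {x_star:.3f}, f(x) ~ {f_star:.3f}")  # near x ~ -1
```

Because total energy is conserved, the trajectory keeps exploring the entire energetically allowed region instead of settling into the first well it finds, which is the qualitative mechanism behind the hitting-time gap between ECD-style dynamics and gradient descent that the paper quantifies.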