AI summary
The authors investigate how hard it is to find a stationary point in min-max optimization when the function is neither convex in the minimizing variable nor concave in the maximizing one. They consider functions defined on a pair of d-dimensional boxes and algorithms that query function values and gradients. Their main finding is that any method that finds an approximate stationary point must make a number of queries that is exponential in the desired precision or in the dimension. In other words, the problem rapidly becomes intractable as it gets more precise or higher-dimensional.
Keywords: min-max optimization, nonconvex-nonconcave, query complexity, oracle access, gradient, stationary point, exponential lower bound, high-dimensional optimization
Authors
Martino Bernasconi, Matteo Castiglioni, Andrea Celli, Alexandros Hollender
Abstract
We study the query complexity of min-max optimization of a nonconvex-nonconcave function $f$ over $[0,1]^d \times [0,1]^d$. We show that, given oracle access to $f$ and to its gradient $\nabla f$, any algorithm that finds an $\varepsilon$-approximate stationary point must make a number of queries that is exponential in $1/\varepsilon$ or $d$.
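The setting in the abstract can be illustrated with a minimal sketch: projected gradient descent-ascent over the box $[0,1]^d \times [0,1]^d$, using only value/gradient oracle access, with $\varepsilon$-approximate stationarity measured by the norm of the projected gradient step. The bilinear test function $f(x,y) = \langle x, y\rangle$ is an assumed toy instance, not the paper's hard construction; quick convergence here does not contradict the lower bound, which concerns worst-case nonconvex-nonconcave functions.

```python
import numpy as np

def grad_f(x, y):
    # Gradient oracle for the toy function f(x, y) = <x, y>.
    # Returns (df/dx, df/dy).
    return y.copy(), x.copy()

def project(z):
    # Euclidean projection onto the box [0, 1]^d.
    return np.clip(z, 0.0, 1.0)

def stationarity_gap(x, y, eta=1.0):
    # Norm of the projected gradient mapping; it is zero exactly at
    # stationary points of the min-max problem on the box.
    gx, gy = grad_f(x, y)
    dx = x - project(x - eta * gx)  # min player's blocked descent step
    dy = y - project(y + eta * gy)  # max player's blocked ascent step
    return np.sqrt(np.sum(dx**2) + np.sum(dy**2))

def gda(d=2, eta=0.1, steps=2000, eps=1e-3, seed=0):
    # Projected gradient descent-ascent until eps-stationarity or budget.
    rng = np.random.default_rng(seed)
    x, y = rng.random(d), rng.random(d)
    for _ in range(steps):
        if stationarity_gap(x, y) <= eps:
            break
        gx, gy = grad_f(x, y)
        x = project(x - eta * gx)  # min player descends
        y = project(y + eta * gy)  # max player ascends
    return x, y, stationarity_gap(x, y)
```

On this bilinear instance the min player is driven to the boundary x = 0, where the projected gradient mapping vanishes; the exponential lower bound says that no oracle algorithm can achieve such fast worst-case convergence over all nonconvex-nonconcave functions on the box.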