The Optimal Sample Complexity of Multiclass and List Learning
2026-04-27 • Machine Learning
AI summary
The authors address a long-standing question about how many examples a machine needs to learn to classify multiple categories correctly. They focus on a complexity measure called the DS dimension, which is known to be important for multiclass classification. By building on recent work, the authors show that the maximum density of certain mathematical structures linked to the hypothesis class is limited by the DS dimension. This result confirms a previous conjecture and helps pinpoint the exact sample size needed for learning in multiclass and list learning problems.
VC dimension, DS dimension, multiclass classification, sample complexity, hypothesis class, hypergraph density, list learning, algebraic characterization, learning theory, Daniely and Shalev-Shwartz conjecture
Authors
Chirag Pabbaraju
Abstract
While the optimal sample complexity of binary classification in terms of the VC dimension is well-established, determining the optimal sample complexity of multiclass classification has remained open. The appropriate complexity parameter for multiclass classification is the DS dimension, and despite significant efforts, a gap of $\sqrt{\text{DS}}$ has persisted between the upper and lower bounds on sample complexity. Recent work by Hanneke et al. (2026) establishes a novel algebraic characterization of multiclass hypothesis classes in terms of their DS dimension. Building on this, we show that the maximum hypergraph density of any multiclass hypothesis class is upper-bounded by its DS dimension. This proves a longstanding conjecture of Daniely and Shalev-Shwartz (2014). As a consequence, we determine the optimal dependence of the sample complexity on the DS dimension for multiclass as well as list learning.