An Improved Lower Bound on Support Size of Capacity-Achieving Inputs for the Binomial Channel: Extended version

2026-05-12
Information Theory

AI summary

The authors studied a communication model called the binomial channel and focused on the best way to send information through it. They looked at how many distinct input values (support points) the optimal, capacity-achieving distribution needs. Previously, this number was known to lie between roughly the square root of the number of trials $n$ and half of $n$. The authors improved the lower limit, showing the support size must grow at least like $\sqrt{n \log\log n}$. They did this by characterizing the channel capacity sharply, using a special Beta-binomial distribution as a reference output, and proving that inputs with too few support points cannot closely match this near-optimal output.
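To make the last point concrete, here is a minimal numerical sketch (our own illustration, not the paper's construction): it computes the χ² divergence between the output induced by a K-point input and the Beta-binomial(n, 1/2, 1/2) reference output. The helper name chi2_to_betabinom, the atom placement (arcsine quantiles), and the uniform masses are hypothetical choices; SciPy's binom and betabinom are assumed.

```python
import numpy as np
from scipy.stats import binom, betabinom

def chi2_to_betabinom(n, xs, ps):
    """chi^2 divergence between the output induced by a K-point input
    (atoms xs with masses ps) and the Beta-binomial(n, 1/2, 1/2) output."""
    y = np.arange(n + 1)
    # output pmf of the K-point input: a finite mixture of binomials
    p_out = ps @ binom.pmf(y[None, :], n, np.asarray(xs)[:, None])
    q = betabinom.pmf(y, n, 0.5, 0.5)   # reference Beta-binomial output
    return np.sum((p_out - q) ** 2 / q)

n = 200
for K in [3, 10, 30, 100]:
    # hypothetical K-point input: arcsine-quantile atoms, uniform masses
    xs = np.sin(np.pi * (np.arange(1, K + 1) - 0.5) / (2 * K)) ** 2
    ps = np.full(K, 1.0 / K)
    print(f"K={K:3d}  chi^2 = {chi2_to_betabinom(n, xs, ps):.3e}")
```

With too few atoms the divergence stays large and falls off as K grows, which is the qualitative behavior behind the paper's quantitative χ² approximation lower bound.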

Binomial channel, Channel capacity, Discrete distribution, Support size, Beta-binomial distribution, Relative entropy, χ² divergence, Capacity-achieving distribution, Information theory, Asymptotic analysis
Authors
Mohammadamin Baniasadi, Luca Barletta, Alex Dytso
Abstract
We study the binomial channel and the structure of its capacity-achieving input and output distributions. It is known that the capacity-achieving input distribution is discrete and supported on finitely many points. The best previously known bounds show that the support size of the capacity-achieving distribution is lower-bounded by a term of order $\sqrt{n}$ and upper-bounded by a term of order $n/2$, where $n$ is the number of trials. In this work, we derive a new lower bound on the support size of order $\sqrt{n\log\log n}$, up to explicit constants. The proof consists of three main steps. First, we derive new upper and lower bounds on the capacity with a gap that vanishes as $n\to\infty$, which yields $C(n)=\frac{1}{2}\log\frac{n\pi}{2e}+o(1)$. Second, we show that the Beta-binomial output distribution induced by the reference input $X_r\sim\mathrm{Beta}(1/2,1/2)$ is asymptotically optimal: it approaches the capacity-achieving output distribution in relative entropy and, after a comparison step, in $\chi^2$ divergence. Third, we prove a quantitative $\chi^2$ approximation lower bound showing that this Beta-binomial output cannot be approximated too well by the output induced by a $K$-point input. Combining these ingredients forces the capacity-achieving input distribution to have at least order $\sqrt{n\log\log n}$ mass points.
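As a numerical sanity check on the first two steps, the sketch below estimates the mutual information achieved by the reference input $X_r\sim\mathrm{Beta}(1/2,1/2)$ (the arcsine law) over the binomial channel and compares it with the asymptotic expression $\frac{1}{2}\log\frac{n\pi}{2e}$. This is a minimal illustration assuming SciPy; the helper name mi_arcsine_input and the quadrature size m are our own choices, not the paper's.

```python
import numpy as np
from scipy.stats import binom, betabinom

def mi_arcsine_input(n, m=2000):
    """I(X;Y) in nats for X ~ Beta(1/2,1/2) and Y | X=x ~ Binomial(n, x),
    via Gauss-Chebyshev quadrature (exact-weight nodes for the arcsine law)."""
    k = np.arange(1, m + 1)
    x = 0.5 * (1.0 + np.cos((2 * k - 1) * np.pi / (2 * m)))  # nodes, weight 1/m each
    y = np.arange(n + 1)
    pyx = binom.pmf(y[None, :], n, x[:, None])   # channel law, shape (m, n+1)
    py = betabinom.pmf(y, n, 0.5, 0.5)           # induced Beta-binomial output
    with np.errstate(divide="ignore", invalid="ignore"):
        kl = np.where(pyx > 0, pyx * np.log(pyx / py), 0.0)
    return kl.sum(axis=1).mean()                 # E_X[ D(P_{Y|X} || P_Y) ]

for n in [10, 100, 1000]:
    print(n, mi_arcsine_input(n), 0.5 * np.log(n * np.pi / (2 * np.e)))
```

The gap between the two printed values should shrink as $n$ grows, consistent with $C(n)=\frac{1}{2}\log\frac{n\pi}{2e}+o(1)$ and with the asymptotic optimality of the Beta-binomial output.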