Two-Sided Bounds for Entropic Optimal Transport via a Rate-Distortion Integral
2026-04-15 • Information Theory
AI summary
The author studies how to maximize the average overlap (inner product) between a random vector and a standard Gaussian vector when there is a limit on how much information the two can share. He shows this maximum is equivalent, up to universal constant factors, to a truncated integral of the rate-distortion function, which measures the trade-off between information rate and quality of approximation. The proof uses a lifting technique that builds a Gaussian process indexed by a random subset of a type class, then applies a form of the majorizing measure theorem. The work connects ideas from probability, information theory, and functional analysis.
mutual information · inner product · rate-distortion function · Gaussian process · majorizing measure theorem · information theory · probability distribution · coupling · lifting technique
Authors
Jingbo Liu
Abstract
We show that the maximum expected inner product between a random vector and the standard normal vector, over all couplings subject to a mutual information constraint or regularization, is equivalent to a truncated integral involving the rate-distortion function, up to universal multiplicative constants. The proof is based on a lifting technique, which constructs a Gaussian process indexed by a random subset of the type class of the probability distribution involved in the information-theoretic inequality, and then applies a form of the majorizing measure theorem.
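To make the statement concrete, the following sketch writes out the constrained problem described in the abstract and the shape of the two-sided bound. The symbols below (the value function \(\Phi\), the integral \(\mathcal{I}\), and the constants \(c_1, c_2\)) are notation introduced here for illustration; the exact limits and integrand of the truncated rate-distortion integral are as given in the paper and are not reproduced here.

\[
  \Phi(R) \;:=\; \sup_{\substack{P_{XZ}\,:\; X \sim P,\; Z \sim \mathcal{N}(0, I_n) \\ I(X;Z) \,\le\, R}} \mathbb{E}\,\langle X, Z \rangle,
  \qquad
  c_1\, \mathcal{I}(R) \;\le\; \Phi(R) \;\le\; c_2\, \mathcal{I}(R),
\]

where the supremum runs over all couplings of the fixed marginals \(P\) (the law of \(X\)) and the standard Gaussian, and \(\mathcal{I}(R)\) denotes a truncated integral involving the rate-distortion function of \(P\).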