High-Dimensional Signal Compression: Lattice Point Bounds and Metric Entropy

2026-04-03 · Information Theory
AI summary

The authors study how to compress signals in the worst case under a bound on total energy, allowing different coordinates of the signal to be quantized at different precisions. They translate this compression problem into counting lattice points inside a stretched sphere called a diagonal ellipsoid. For balanced precision profiles across coordinates, they give explicit upper limits, depending on the signal's dimension, on how large the codebook must be. The analysis sharpens Landau's classical lattice point estimates using uniform bounds on Bessel functions and explicit Abel summation.

signal compression, worst-case analysis, $\ell^2$ energy constraint, quantization precision, lattice points, diagonal ellipsoid, codebook size, Bessel functions, Abel summation, Landau estimates
Authors
A. Iosevich, A. Vagharshakyan, E. Wyman
Abstract
We study worst-case signal compression under an $\ell^2$ energy constraint, with coordinate-dependent quantization precisions. The compression problem is reduced to counting lattice points in a diagonal ellipsoid. Under balanced precision profiles, we obtain explicit, dimension-dependent upper bounds on the logarithmic codebook size. The analysis refines Landau's classical lattice point estimates using uniform Bessel bounds due to Olenko and explicit Abel summation.
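The abstract's reduction turns the compression question into counting lattice points in a diagonal ellipsoid: coordinate-dependent precisions correspond to the ellipsoid's semi-axes, and the logarithm of the point count gives the logarithmic codebook size. As a minimal illustrative sketch (not the paper's method, which is analytic), the brute-force count below enumerates integer points $k$ with $\sum_i (k_i/a_i)^2 \le 1$ for hypothetical semi-axes `a`; it is only feasible in low dimensions:

```python
import itertools
import math

def lattice_points_in_ellipsoid(semi_axes):
    """Brute-force count of integer points k with sum((k_i / a_i)^2) <= 1.

    semi_axes: positive floats a_1, ..., a_d (hypothetical precision-derived
    semi-axes; the paper bounds this count analytically rather than by
    enumeration).
    """
    # Each coordinate k_i can only range over -floor(a_i), ..., floor(a_i).
    ranges = [range(-math.floor(a), math.floor(a) + 1) for a in semi_axes]
    return sum(
        1
        for k in itertools.product(*ranges)
        if sum((ki / ai) ** 2 for ki, ai in zip(k, semi_axes)) <= 1.0
    )

# Balanced precision profile: all semi-axes equal (a ball).
semi_axes = [4.0, 4.0, 4.0]
count = lattice_points_in_ellipsoid(semi_axes)
log_codebook_size = math.log2(count)  # bits needed to index the codebook
```

In one dimension with semi-axis $a = 3$ the count is the $7$ integers $-3, \dots, 3$, and as the dimension grows the count approaches the ellipsoid's volume, which is the regime where Landau-type estimates control the error term.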