Random geometric graphs with smooth kernels: sharp detection threshold and a spectral conjecture
2026-02-16 • Information Theory
Information Theory • Social and Information Networks
AI summary
The authors study a type of random graph in which points are placed on a sphere and edges are formed based on a kernel function of their inner products. They focus on finding the highest dimension at which these graphs can be distinguished from Erdős–Rényi graphs with the same edge probability. For smooth kernels and dense graphs, they show this critical dimension scales like n to the 3/4 power, far below the n-cubed threshold previously known for step-function kernels. They also analyze the problem through the posterior distribution of the latent points, yielding new insights into detection thresholds and estimation limits tied to dimension and graph size.
random geometric graph • Erdős–Rényi graph • latent points • kernel function • detection threshold • posterior distribution • signal-to-noise ratio • unit sphere • graph estimation • Kullback–Leibler divergence
Authors
Cheng Mao, Yihong Wu, Jiaming Xu
Abstract
A random geometric graph (RGG) with kernel $K$ is constructed by first sampling latent points $x_1,\ldots,x_n$ independently and uniformly from the $d$-dimensional unit sphere, then connecting each pair $(i,j)$ with probability $K(\langle x_i,x_j\rangle)$. We study the sharp detection threshold, namely the highest dimension at which an RGG can be distinguished from its Erdős--Rényi counterpart with the same edge density. For dense graphs, we show that for smooth kernels the critical scaling is $d = n^{3/4}$, substantially lower than the threshold $d = n^3$ known for the hard RGG with step-function kernels \cite{bubeck2016testing}. We further extend our results to kernels whose signal-to-noise ratio scales with $n$, and formulate a unifying conjecture that the critical dimension is determined by $n^3 \operatorname{tr}^2(\kappa^3) = 1$, where $\kappa$ is the standardized kernel operator on the sphere. Departing from the prevailing approach of bounding the Kullback--Leibler divergence by successively exposing latent points, which breaks down in the sublinear regime of $d = o(n)$, our key technical contribution is a careful analysis of the posterior distribution of the latent points given the observed graph, in particular the overlap between two independent posterior samples. As a by-product, we establish that $d = \sqrt{n}$ is the critical dimension for non-trivial estimation of the latent vectors up to a global rotation.
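As a concrete illustration of the sampling model in the abstract, the following minimal sketch draws an RGG with a smooth kernel on the unit sphere and an Erdős–Rényi graph with matched edge density. The logistic kernel and the names `sample_rgg` and `smooth_kernel` are illustrative assumptions standing in for a generic smooth $K$; they are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_rgg(n, d, kernel, rng):
    """Sample an RGG: latent points uniform on the sphere S^{d-1},
    edge (i, j) present independently with probability kernel(<x_i, x_j>)."""
    x = rng.standard_normal((n, d))
    x /= np.linalg.norm(x, axis=1, keepdims=True)  # normalize: uniform on sphere
    gram = x @ x.T                                 # pairwise inner products
    probs = kernel(gram)
    # Independent edge coin flips on the upper triangle, then symmetrize.
    upper = np.triu(rng.random((n, n)) < probs, k=1)
    return (upper | upper.T).astype(int), x

# Hypothetical smooth kernel (logistic), dense regime: K(0) = 1/2.
smooth_kernel = lambda t: 1.0 / (1.0 + np.exp(-3.0 * t))

n, d = 500, 50
adj, latent = sample_rgg(n, d, smooth_kernel, rng)

# Erdős–Rényi null with the same empirical edge density.
p_hat = adj[np.triu_indices(n, k=1)].mean()
er_upper = np.triu(rng.random((n, n)) < p_hat, k=1)
er = (er_upper | er_upper.T).astype(int)
```

The sketch only produces the two graph samples; it does not implement a test statistic. In the dense regime, the paper's result says the two distributions are distinguishable when $d$ grows slower than $n^{3/4}$ and indistinguishable above that scale.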