Local Node Differential Privacy
2026-02-17 • Data Structures and Algorithms • Cryptography and Security
AI summary
The authors study how to keep individual nodes (people) in a graph private when each node shares only limited, randomized information about its own connections. They propose new algorithms that let an untrusted server estimate overall graph properties, such as degree distributions, almost as accurately as if the server were fully trusted. They also prove that their methods are as accurate as possible, even if nodes and the server interact over multiple rounds. Additionally, the authors show that privacy for graph data works differently from privacy for regular tabular data in this local setting.
node differential privacy • local model • graph degree distribution • randomized algorithms • privacy lower bounds • interactive protocols • local randomizer • graph statistics • central model privacy
Authors
Sofya Raskhodnikova, Adam Smith, Connor Wagaman, Anatoly Zavyalov
Abstract
We initiate an investigation of node differential privacy for graphs in the local model of private data analysis. In our model, dubbed LNDP, each node sees its own edge list and releases the output of a local randomizer on this input. These outputs are aggregated by an untrusted server to obtain a final output. We develop a novel algorithmic framework for this setting that allows us to accurately answer arbitrary linear queries on a blurry approximation of the input graph's degree distribution. For some natural problems, the resulting algorithms match the accuracy achievable with node privacy in the central model, where data are held and processed by a trusted server. We also prove lower bounds on the error required by LNDP that imply the optimality of our algorithms for several fundamental graph statistics. We then lift these lower bounds to the interactive LNDP setting, demonstrating the optimality of our algorithms even when a constant number of rounds of interaction is permitted. Obtaining our lower bounds requires new approaches, since those developed for the usual local model do not apply to the inherently overlapping inputs that arise from graphs. Finally, we prove structural results that reveal qualitative differences between local node privacy and the standard local model for tabular data.
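To make the setting concrete, below is a minimal, non-interactive sketch of an LNDP-style protocol in Python. It is not the authors' framework: it uses a simple clip-and-Laplace local randomizer on each node's edge list, and the untrusted server answers a single linear query on the degree distribution (the clipped average degree). All function names, the clipping threshold, and the toy graph are illustrative assumptions.

import numpy as np


def local_randomizer(edge_list, epsilon, clip_threshold, rng):
    """Run by each node on its own edge list (hypothetical clip-and-Laplace sketch).

    Clipping the degree at `clip_threshold` bounds how much an arbitrary change to
    this node's edge list can change the report, so Laplace noise with scale
    clip_threshold / epsilon makes the single report epsilon-differentially
    private with respect to the node's entire input.
    """
    clipped_degree = min(len(edge_list), clip_threshold)
    return clipped_degree + rng.laplace(loc=0.0, scale=clip_threshold / epsilon)


def untrusted_server_aggregate(reports):
    """The server sees only the randomized reports; here it answers one linear
    query on the degree distribution -- the (clipped) average degree."""
    return float(np.mean(reports))


if __name__ == "__main__":
    rng = np.random.default_rng(0)  # one generator shared here only for reproducibility
    n, epsilon, clip_threshold = 1000, 1.0, 20

    # Toy input: node i's edge list is a list of neighbor ids (contents unused here).
    degrees = rng.integers(0, 15, size=n)
    edge_lists = [list(range(int(d))) for d in degrees]

    reports = [local_randomizer(el, epsilon, clip_threshold, rng)
               for el in edge_lists]
    print(f"true average degree: {degrees.mean():.2f}")
    print(f"LNDP-style estimate: {untrusted_server_aggregate(reports):.2f}")

In this toy version the error comes from the Laplace noise and the clipping bias; the paper's contribution is a framework that answers arbitrary linear queries on a blurry approximation of the degree distribution with provably optimal error, which this sketch does not attempt to reproduce.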