Differential Privacy Guarantees of Markov Chain Monte Carlo Algorithms

Authors: Andrea Bertazzi, Tim Johnston, Gareth O. Roberts, Alain Oliviero Durmus

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This paper aims to provide differential privacy (DP) guarantees for Markov chain Monte Carlo (MCMC) algorithms. In a first part, we establish DP guarantees on samples output by MCMC algorithms as well as Monte Carlo estimators associated with these methods under assumptions on the convergence properties of the underlying Markov chain. In a second part, we specialise our analysis to the unadjusted Langevin algorithm and stochastic gradient Langevin dynamics and establish guarantees on their (Rényi) DP. To this end, we develop a novel methodology based on Girsanov's theorem combined with a perturbation trick to obtain bounds for an unbounded domain and in a non-convex setting.
Researcher Affiliation | Academia | ¹École polytechnique, Institut Polytechnique de Paris, France; ²Université Paris Dauphine-PSL, France; ³University of Warwick, UK.
Pseudocode | No | The paper describes algorithms such as ULA and noisy SGD using mathematical equations; for example, equation (12) defines the ULA update as $x^D_{n+1} = x^D_n - \gamma \nabla U_D(x^D_n) + \sqrt{2\gamma}\, z_{n+1}$. However, these are presented as mathematical definitions within the text, not as structured pseudocode blocks or algorithms with explicit labels such as "Algorithm 1".
Open Source Code | No | The paper does not contain any statements regarding the release of open-source code, nor does it provide links to code repositories in the main text, acknowledgements, or supplementary materials.
Open Datasets | No | The paper focuses on theoretical analysis of differential privacy guarantees for MCMC algorithms and stochastic differential equations. It does not conduct experiments on specific datasets and therefore does not mention any publicly available or open datasets for empirical evaluation.
Dataset Splits | No | The paper is theoretical and does not involve empirical experiments on datasets, so there is no mention or discussion of splits for training, validation, or testing.
Hardware Specification | No | The paper presents theoretical research, focusing on mathematical proofs and guarantees for MCMC algorithms. It does not describe an experimental setup or report computational experiments, so no hardware specifications (e.g., GPU models, CPU types, memory) are mentioned.
Software Dependencies | No | The paper is theoretical in nature, providing mathematical proofs and analyses. It does not describe an implementation of the algorithms or an experimental setup, and thus lists no software dependencies with specific version numbers.
Experiment Setup | No | The paper is entirely theoretical, focusing on mathematical proofs and analyses of differential privacy for MCMC algorithms. It presents no empirical experiments, so no experimental setup details, such as hyperparameters or training settings, are provided.
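For readers who want a concrete picture of the update the paper analyses, the ULA recursion $x_{n+1} = x_n - \gamma \nabla U(x_n) + \sqrt{2\gamma}\, z_{n+1}$ can be sketched as follows. This is a minimal illustrative implementation, not code from the paper: the target potential (a standard Gaussian, $U(x) = \|x\|^2/2$), the step size, and all names are assumptions chosen for the example.

```python
import numpy as np

def ula_step(x, grad_U, gamma, rng):
    """One step of the unadjusted Langevin algorithm (ULA):
    x_{n+1} = x_n - gamma * grad_U(x_n) + sqrt(2*gamma) * z_{n+1},
    where z_{n+1} ~ N(0, I) is fresh Gaussian noise."""
    z = rng.standard_normal(x.shape)
    return x - gamma * grad_U(x) + np.sqrt(2.0 * gamma) * z

# Illustration on a standard Gaussian target, U(x) = ||x||^2 / 2,
# so grad_U(x) = x. Step size and chain length are arbitrary choices.
rng = np.random.default_rng(0)
gamma = 0.1
x = np.zeros(2)
samples = []
for _ in range(5000):
    x = ula_step(x, lambda y: y, gamma, rng)
    samples.append(x.copy())
samples = np.asarray(samples)
```

Because ULA omits the Metropolis correction, the chain targets a biased version of the intended distribution (here, a Gaussian with variance slightly above 1 for positive step size); it is exactly this discretised, noise-injected dynamics whose Rényi DP the paper bounds.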