Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps

Authors: Simon Apers, Sander Gribling, Dániel Szilágyi

JMLR 2024

Reproducibility assessment (variable, result, and LLM response):
Research Type: Theoretical. "We show that Metropolis-adjusted HMC can sample from a distribution that is ε-close to a Gaussian in total variation distance using Õ(√κ d^{1/4} log(1/ε)) gradient queries... This contrasts with (and was motivated by) recent results that give an Ω̃(√κ d^{1/2}) query lower bound for HMC... Our work fits within the recent effort of proving non-asymptotic (and often tight) bounds on Markov chain algorithms for constrained distributions such as Gaussian distributions and, more generally, logconcave distributions."
Researcher Affiliation: Academia. Simon Apers (EMAIL), IRIF, Université Paris Cité, CNRS, Paris, F-75013, France; Sander Gribling (EMAIL), Department of Econometrics and Operations Research, Tilburg University, Tilburg, 5000 LE, the Netherlands; Dániel Szilágyi (EMAIL), IRIF, Université Paris Cité, Paris, F-75013, France.
Pseudocode: Yes. Algorithm 1: Markov kernel P (idealized HMC with random integration time); Algorithm 2: Markov kernel Q̂ (leapfrog HMC with random integration time); Algorithm 3: Markov kernel Q (adjusted leapfrog HMC with random integration time).
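To make the listed kernels concrete, the following is a minimal sketch of a Metropolis-adjusted leapfrog HMC step with a random integration time (in the spirit of Algorithm 3), applied to a diagonal Gaussian target. The step size, the uniform distribution over the number of leapfrog steps, and the specific target are illustrative choices, not the paper's exact parameterization:

```python
import numpy as np

def leapfrog(x, p, grad_f, eta, L):
    """L leapfrog steps of size eta for the potential f (negative log-density)."""
    p = p - 0.5 * eta * grad_f(x)
    for _ in range(L - 1):
        x = x + eta * p
        p = p - eta * grad_f(x)
    x = x + eta * p
    p = p - 0.5 * eta * grad_f(x)
    return x, p

def hmc_step(x, f, grad_f, eta, L_max, rng):
    """One Metropolis-adjusted HMC step with a random number of leapfrog steps."""
    p = rng.standard_normal(x.shape)      # resample momentum
    L = int(rng.integers(1, L_max + 1))   # random integration time (illustrative choice)
    x_new, p_new = leapfrog(x, p, grad_f, eta, L)
    # Metropolis correction removes the leapfrog discretization bias
    dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
    if rng.random() < np.exp(min(0.0, -dH)):
        return x_new
    return x

# Example target: N(0, diag(sigma2)) with moderately ill-conditioned covariance
rng = np.random.default_rng(0)
d = 16
sigma2 = np.linspace(1.0, 10.0, d)           # covariance eigenvalues
f = lambda x: 0.5 * np.sum(x**2 / sigma2)    # negative log-density (up to a constant)
grad_f = lambda x: x / sigma2

x = np.zeros(d)
samples = []
for _ in range(2000):
    x = hmc_step(x, f, grad_f, eta=0.2, L_max=20, rng=rng)
    samples.append(x.copy())
samples = np.asarray(samples)
```

The randomization of the integration time is the ingredient the paper's analysis relies on to avoid the resonances that hurt fixed-integration-time HMC on Gaussian targets.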
Open Source Code: No. The paper describes algorithms and theoretical bounds but does not contain any explicit statements about releasing source code or links to repositories.
Open Datasets: No. The paper focuses on sampling from high-dimensional Gaussian distributions, which are mathematical constructs rather than datasets requiring public access information. It does not perform experiments on specific datasets.
Dataset Splits: No. The paper is theoretical, focusing on mathematical bounds and algorithm analysis for sampling from Gaussian distributions. It does not involve empirical experiments with datasets that would require training, validation, or test splits.
Hardware Specification: No. The paper is primarily theoretical, focusing on mathematical analysis and algorithm complexity. It does not describe any experimental setup that would require specific hardware specifications.
Software Dependencies: No. The paper describes algorithms (Hamiltonian Monte Carlo, Metropolis-Hastings) and their theoretical properties. It does not mention any specific software implementations or dependencies with version numbers.
Experiment Setup: No. The paper is theoretical, presenting mathematical proofs and analyses of algorithm complexity and mixing times for Hamiltonian Monte Carlo. It does not describe any empirical experiments, so no experimental setup details such as hyperparameters or training settings are provided.