AutoStep: Locally adaptive involutive MCMC

Authors: Tiange Liu, Nikola Surjanovic, Miguel Biron-Lattes, Alexandre Bouchard-Côté, Trevor Campbell

ICML 2025

Reproducibility Variable Result LLM Response
Research Type Experimental Empirical results examine the robustness and efficacy of our proposed step size selection procedure, and show that AutoStep MCMC is competitive with state-of-the-art methods in terms of effective sample size per unit cost on a range of challenging target distributions.
Researcher Affiliation Academia Department of Statistics, University of British Columbia, Canada.
Pseudocode Yes Algorithm 1: One iteration of AutoStep MCMC; Algorithm 2: Step size selector; Algorithm 3: Round-based AutoStep MCMC.
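Algorithm 2 is not reproduced in this card, but the doubling/halving idea behind this family of step size selectors can be sketched as follows. This is an illustrative assumption, not the paper's exact procedure: `log_alpha`, the acceptance band `[a, b]`, and all default values are hypothetical.

```python
import math

def select_step_size(log_alpha, theta0,
                     a=math.log(0.1), b=math.log(0.9),
                     max_expansions=30):
    """Double or halve the step size until the log acceptance ratio
    log_alpha(theta) lands in the band [a, b].

    Sketch only: `log_alpha` maps a step size to the log acceptance
    ratio of the resulting proposal; the band endpoints are illustrative.
    """
    la = log_alpha(theta0)
    if a <= la <= b:
        return theta0
    # Acceptance too high (la > b): the step is too small, so grow it.
    # Acceptance too low (la < a): the step is too large, so shrink it.
    grow = la > b
    theta = theta0
    for _ in range(max_expansions):
        theta = theta * 2.0 if grow else theta / 2.0
        la = log_alpha(theta)
        if a <= la <= b:
            break
        # Stop if the ratio jumps past the band in one move.
        if grow and la < a:
            break
        if not grow and la > b:
            break
    return theta
```

For a toy `log_alpha(theta) = -theta`, an initial step of 0.01 is doubled until the ratio enters the band, while an initial step of 10 is halved.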
Open Source Code Yes The code for reproducing the main experimental results is available at https://github.com/Julia-Tempering/AutoStep.
Open Datasets Yes Our benchmarks include two synthetic distributions (Neal's funnel in 2 and 100 dimensions) and three real Bayesian posteriors: a three-parameter linear regression model for yearly temperatures at Kilpisjärvi, Finland (Bales et al., 2019), an orbit-fitting problem (Thompson et al., 2024), and an mRNA transfection model (Ballnus et al., 2017). The orbital model is a model of the multiple system Gliese 229 available in Octofitter.jl (Thompson et al., 2024) at https://github.com/sefffal/OrbitPosteriorDB/blob/main/models/astrom-GL229A.jl.
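For reference, an unnormalized log density for Neal's funnel takes only a few lines. The parametrization below (v ~ N(0, 3²), each x_i | v ~ N(0, eᵛ)) is the common one from Neal (2003); the paper's exact scale convention may differ.

```python
import math

def funnel_logpdf(v, x):
    """Unnormalized log density of Neal's funnel:
    v ~ N(0, 3^2) and, given v, each x_i ~ N(0, e^v).

    `x` holds the remaining coordinates (1 entry for the 2-d funnel,
    99 for the 100-d one); additive constants are dropped.
    """
    logp = -0.5 * v * v / 9.0            # prior on the "neck" variable v
    for xi in x:
        # log N(x_i; 0, e^v) up to a constant: -x_i^2 / (2 e^v) - v/2
        logp += -0.5 * xi * xi * math.exp(-v) - 0.5 * v
    return logp
```

The funnel's narrowing neck as v decreases is what makes a single global step size perform poorly, which is the motivation for per-iteration step size selection.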
Dataset Splits No The paper focuses on Markov chain Monte Carlo (MCMC) methods for sampling from target distributions, rather than supervised machine learning tasks involving explicit training, validation, and test dataset splits. The 'datasets' mentioned (e.g., Kilpisjärvi, mRNA) serve as the target distributions for sampling, not as data to be partitioned for model training and evaluation in the traditional sense.
Hardware Specification No The Acknowledgements section mentions 'use of the ARC Sockeye computing platform from the University of British Columbia'. However, this is a general platform name and does not provide specific hardware details such as GPU/CPU models, processors, or memory specifications.
Software Dependencies No The paper mentions software packages like 'Pigeons.jl' and 'Octofitter.jl' and refers to the code repository 'Julia-Tempering/Auto Step'. However, it does not provide specific version numbers for these tools or for any other key software components, such as programming languages or core libraries, which are necessary for reproducible software dependencies.
Experiment Setup Yes For each of the same three synthetic targets as in Section 5.1, we initialize the state x ~ N(0, 20^2), θ0 ∈ {10^-7, . . . , 10^7}, and run the Markov chain for R = 20 doubling rounds (≈ 2×10^6 steps total), tuning θ0 per Algorithm 3 after each round. Adaptive MALA: step size θ0 = 0.1. Delayed Rejection HMC: step size θ0 = 0.1, number of subsequent proposals k = 2, and step size divisor a = 5.0. NUTS: maximum tree depth Lmax = 5 and target acceptance ratio αtarget = 0.65.
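The round-based schedule (Algorithm 3) can be sketched as follows, where round r runs 2^r steps and the step size is retuned between rounds. `kernel_step` and `tune` are hypothetical placeholders for one AutoStep iteration and the tuning rule, not the paper's actual functions.

```python
def run_rounds(kernel_step, tune, x0, theta0, R=20):
    """Round-based tuning sketch: round r performs 2**r MCMC steps,
    then retunes the step size from statistics gathered that round.

    kernel_step(x, theta) -> (x_new, stat)   # one MCMC iteration
    tune(stats, theta)    -> theta_new       # between-round tuning rule
    """
    x, theta = x0, theta0
    for r in range(1, R + 1):
        stats = []
        for _ in range(2 ** r):
            x, stat = kernel_step(x, theta)
            stats.append(stat)
        theta = tune(stats, theta)
    return x, theta
```

With R = 20 this schedule totals 2 + 4 + ... + 2^20 = 2^21 − 2 ≈ 2.1×10^6 steps, consistent with the "≈ 2×10^6 steps total" figure above.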