Targeted Separation and Convergence with Kernel Discrepancies

Authors: Alessandro Barp, Carl-Johann Simon-Gabriel, Mark Girolami, Lester Mackey

JMLR 2024

Reproducibility assessment (Variable: Result. LLM Response):

Research Type: Theoretical. In this article we derive new sufficient and necessary conditions to ensure (i) and (ii). For MMDs on separable metric spaces, we characterize those kernels that separate Bochner-embeddable measures and introduce simple conditions for separating all measures with unbounded kernels and for controlling convergence with bounded kernels. We use these results on R^d to substantially broaden the known conditions for KSD separation and convergence control and to develop the first KSDs known to exactly metrize weak convergence to P. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.

Researcher Affiliation: Collaboration. Alessandro Barp (EMAIL), University College London & The Alan Turing Institute, GB; Carl-Johann Simon-Gabriel (EMAIL), Mirelo AI; Mark Girolami (EMAIL), University of Cambridge & The Alan Turing Institute, GB; Lester Mackey (EMAIL), Microsoft Research, New England, US.

Pseudocode: No. The paper contains mathematical definitions, theorems, lemmas, and proofs, but no structured pseudocode or algorithm blocks.

Open Source Code: No. The paper contains no explicit statement about releasing source code, no link to a code repository, and no mention of code in supplementary materials.

Open Datasets: No. This theoretical paper studies mathematical conditions and properties of kernel discrepancies; it describes no experiments on specific datasets, so no dataset access information is provided.

Dataset Splits: No. The paper performs no empirical evaluations on datasets, so no training/validation/test splits are mentioned.

Hardware Specification: No. The paper describes no experimental setup or empirical evaluation, so no hardware specifications are given.

Software Dependencies: No. The paper focuses on mathematical derivations and describes no algorithm implementations or experimental procedures, so no software dependencies with specific version numbers are provided.

Experiment Setup: No. The paper describes no experimental setup, hyperparameters, or system-level training settings.
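For readers unfamiliar with the kernel discrepancies the paper analyzes, the following is a minimal illustrative sketch (not from the paper, which contains no code) of the biased V-statistic estimate of the squared maximum mean discrepancy (MMD) between two samples, using a Gaussian kernel. The function names and the bandwidth parameter `sigma` are choices made here for illustration.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between rows of X and rows of Y.
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    # Biased (V-statistic) estimate of the squared MMD between the empirical
    # distributions of X and Y: mean(Kxx) + mean(Kyy) - 2 * mean(Kxy).
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))
print(mmd2_biased(X, X))        # identical samples: exactly 0 up to round-off
print(mmd2_biased(X, X + 5.0))  # shifted sample: strictly positive
```

Whether such a discrepancy separates measures (is zero only when the distributions coincide) and metrizes weak convergence is exactly the kernel-dependent question the paper characterizes.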