Posterior Contraction for Deep Gaussian Process Priors

Authors: Gianluca Finocchio, Johannes Schmidt-Hieber

JMLR 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We study posterior contraction rates for a class of deep Gaussian process priors in the nonparametric regression setting under a general composition assumption on the regression function. It is shown that the contraction rates can achieve the minimax convergence rate (up to log n factors), while being adaptive to the underlying structure and smoothness of the target function. The proposed framework extends the Bayesian nonparametric theory for Gaussian process priors.
Researcher Affiliation | Academia | Gianluca Finocchio (EMAIL), Faculty of Business, Economics and Statistics, University of Vienna, 1090 Vienna, Austria; Johannes Schmidt-Hieber (EMAIL), Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, 7522 NB Enschede, The Netherlands
Pseudocode | No | The paper describes theoretical constructions in several steps (e.g., 'Step 0. Choice of Gaussian processes.'), but these are descriptive text, not formally structured pseudocode or algorithm blocks.
Open Source Code | No | We view the proposed Bayesian analysis rather as a proof of concept than something that is straightforward implementable or computationally efficient.
Open Datasets | No | In the multivariate nonparametric regression model with random design distribution µ supported on [-1, 1]^d, we observe n i.i.d. pairs (X_i, Y_i) ∈ [-1, 1]^d × R, i = 1, ..., n, with X_i ∼ µ, Y_i = f(X_i) + ε_i, i = 1, ..., n (1), and ε_i independent standard normal random variables that are independent of the design vectors (X_1, ..., X_n). We aim to recover the true regression function f : [-1, 1]^d → R from the sample.
Dataset Splits | No | The paper describes a theoretical model for nonparametric regression with a random design distribution and n i.i.d. pairs. It does not involve specific datasets or their splits for empirical evaluation.
Hardware Specification | No | The paper is theoretical and does not describe any experiments that would require specific hardware specifications.
Software Dependencies | No | The paper is theoretical and does not mention specific software or library versions used for implementation.
Experiment Setup | No | The paper is theoretical and focuses on mathematical results and properties of Deep Gaussian Process priors. There are no empirical experiments described, and therefore no experimental setup details like hyperparameters or training schedules.
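The regression model (1) quoted in the "Open Datasets" row can be sketched with a small simulation. This is a minimal illustration only: the composition-structured regression function `f` below (an inner `tanh` feature composed with an outer sine) is a hypothetical example chosen to mirror the paper's composition assumption, not a function taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimension of the design space [-1, 1]^d and sample size n.
d, n = 2, 500

def f(x):
    # Hypothetical composition g(h(x)): inner map h into [-1, 1],
    # outer univariate map g; illustrative only.
    h = np.tanh(x[:, 0] + x[:, 1])
    return np.sin(np.pi * h)

# Random design: X_i drawn i.i.d. from mu (here taken uniform on [-1, 1]^d).
X = rng.uniform(-1.0, 1.0, size=(n, d))

# Model (1): Y_i = f(X_i) + eps_i with standard normal noise,
# independent of the design vectors.
eps = rng.standard_normal(n)
Y = f(X) + eps

print(X.shape, Y.shape)
```

The posterior contraction results concern how fast a deep Gaussian process posterior built from such data (X, Y) concentrates around the true f as n grows; the simulation above only generates the observations, since the paper itself provides no implementation.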