Recursive Estimation of Conditional Kernel Mean Embeddings

Authors: Ambrus Tamás, Balázs Csanád Csáji

JMLR 2024

Reproducibility Variable Result LLM Response
Research Type: Theoretical. Our main contribution is a weakly and strongly consistent recursive algorithm for estimating the conditional kernel mean map under general structural assumptions. We present and prove a generalized version of Stone's theorem (Stone, 1977), motivated by the recursive form in (Györfi et al., 2002, Theorem 25.1). Our generalization is twofold: we deal with general locally compact Polish input spaces and allow the output space to be Hilbertian. ... In Section 4 we apply our results to three archetypal cases. We deduce universal consistency for Euclidean spaces, Riemannian manifolds and locally compact subsets of Hilbert spaces.
Researcher Affiliation: Academia. 1) Institute for Computer Science and Control (SZTAKI), Hungarian Research Network (HUN-REN), Kende utca 13-17, Budapest, H-1111, Hungary. 2) Department of Probability Theory and Statistics, Institute of Mathematics, Eötvös Loránd University (ELTE), Pázmány Péter sétány 1/C, Budapest, H-1117, Hungary.
Pseudocode: Yes. Now, we introduce a recursive local averaging estimator. Let (k_n)_{n∈ℕ} be a smoother sequence and (a_n)_{n∈ℕ} be a stepsize (learning rate or gain) sequence of positive numbers. Let us consider the following recursive algorithm: µ_1(x) := ℓ(·, Y_1), and

µ_{n+1}(x) := (1 − a_{n+1} k_{n+1}(x, X_{n+1})) µ_n(x) + a_{n+1} k_{n+1}(x, X_{n+1}) ℓ(·, Y_{n+1}),   (43)

for n ≥ 1.
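The recursive update (43) can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: it assumes a Gaussian smoother k_n with a fixed bandwidth h, the stepsize a_n = 1/n, and a Gaussian output kernel ℓ, and it tracks the embedding µ_n(x) pointwise on a finite grid of output values. All function names and parameter choices (`h`, `ell_bw`) are illustrative assumptions.

```python
import numpy as np

def gaussian(u):
    """Gaussian smoothing kernel, used here as k_n(x, X) = K((x - X) / h)."""
    return np.exp(-0.5 * u ** 2)

def recursive_cme(X, Y, x, y_grid, h=0.3, ell_bw=0.3):
    """Recursive conditional kernel mean embedding estimate at query point x.

    Implements the recursion
        mu_1(x)     = ell(., Y_1),
        mu_{n+1}(x) = (1 - a k) mu_n(x) + a k ell(., Y_{n+1}),
    with a = a_{n+1} = 1/(n+1) and k = k_{n+1}(x, X_{n+1}).
    The function mu_n(x) is represented by its values on y_grid.
    """
    # Gaussian output kernel ell(., y), evaluated on the grid (an assumption).
    ell = lambda grid, y: np.exp(-0.5 * ((grid - y) / ell_bw) ** 2)
    mu = ell(y_grid, Y[0])                 # mu_1(x) = ell(., Y_1)
    for n in range(1, len(X)):
        a = 1.0 / (n + 1)                  # stepsize a_{n+1}
        k = gaussian((x - X[n]) / h)       # smoother weight k_{n+1}(x, X_{n+1})
        mu = (1.0 - a * k) * mu + a * k * ell(y_grid, Y[n])
    return mu

# Toy usage: Y = X + noise, so mu_n(0.5) should concentrate near y = 0.5.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 2000)
Y = X + 0.1 * rng.normal(size=2000)
y_grid = np.linspace(-2.0, 2.0, 201)
mu = recursive_cme(X, Y, x=0.5, y_grid=y_grid)
```

Because each observation enters only through the single update step, the estimate can be refreshed online in O(|y_grid|) time per new sample, which is the practical appeal of the recursive form over batch kernel-ridge-type CME estimators.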
Open Source Code: No. The paper does not contain any explicit statements about releasing code, links to repositories, or mentions of code being included in supplementary materials.
Open Datasets: No. The paper does not conduct experiments on specific datasets. It discusses theoretical applications to 'Euclidean spaces, Riemannian manifolds and locally compact subsets of function spaces' but does not use or provide access information for any open datasets for empirical evaluation.
Dataset Splits: No. The paper is theoretical and does not conduct experiments on specific datasets; therefore, no dataset splits are provided.
Hardware Specification: No. The paper is theoretical and does not conduct experiments, so no hardware specifications for running experiments are mentioned.
Software Dependencies: No. The paper is theoretical and does not conduct experiments, so no specific software dependencies with version numbers are mentioned.
Experiment Setup: No. The paper is theoretical and focuses on algorithm development and consistency proofs rather than empirical evaluation. Therefore, no experimental setup details, hyperparameters, or training configurations are provided.