Sobolev Norm Learning Rates for Regularized Least-Squares Algorithms
Authors: Simon Fischer, Ingo Steinwart
JMLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Learning rates for least-squares regression are typically expressed in terms of L2-norms. In this paper we extend these rates to norms stronger than the L2-norm without requiring the regression function to be contained in the hypothesis space. ... Finally, we prove the asymptotic optimality of our results in many cases. Keywords: statistical learning theory, regularized kernel methods, least-squares regression, interpolation norms, uniform convergence, learning rates |
| Researcher Affiliation | Academia | Simon Fischer (EMAIL), Ingo Steinwart (EMAIL), Institute for Stochastics and Applications, Faculty 8: Mathematics and Physics, University of Stuttgart, 70569 Stuttgart, Germany |
| Pseudocode | No | The paper describes mathematical proofs and theorems without providing any structured pseudocode or algorithm blocks. It primarily focuses on theoretical derivations. |
| Open Source Code | No | The paper does not contain any explicit statements about releasing source code, nor does it provide links to a code repository. The arXiv reference is for a previous version of the paper itself, not for source code. |
| Open Datasets | No | The paper develops theoretical learning rates for regularized least-squares algorithms and does not run experiments on any public or open dataset. The 'data set D' mentioned in the introduction is an abstract object in the theoretical setup, not an actual dataset. |
| Dataset Splits | No | The paper is theoretical and does not involve empirical experiments with datasets; consequently, it does not discuss dataset splits. |
| Hardware Specification | No | This paper presents theoretical results and mathematical proofs; it does not describe any experimental setup or specify hardware used for computations. |
| Software Dependencies | No | The paper is theoretical and does not describe any implementation details or software dependencies with specific version numbers. |
| Experiment Setup | No | The paper focuses on theoretical derivations and proofs of learning rates; it does not include an experimental section with details on hyperparameter values or system-level training settings. |
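Although the paper itself contains no code, the regularized least-squares (kernel ridge regression) estimator it analyzes has a standard closed form, $\alpha = (K + \lambda n I)^{-1} y$. A minimal numpy sketch follows; the Gaussian kernel choice and all parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def krr_fit(X, y, lam=1e-5, gamma=1.0):
    """Regularized least-squares: solve (K + lam * n * I) alpha = y."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(X_train, alpha, X_test, gamma=1.0):
    """Evaluate the fitted function f(x) = sum_i alpha_i k(x, x_i)."""
    return gaussian_kernel(X_test, X_train, gamma) @ alpha

# Illustrative usage on synthetic data (not an experiment from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0])
alpha = krr_fit(X, y, lam=1e-5, gamma=5.0)
pred = krr_predict(X, alpha, X, gamma=5.0)
```

The paper's learning rates describe how the error of exactly this kind of estimator decays with the sample size, measured in norms stronger than the L2-norm.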