Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Geometric Learning with Positively Decomposable Kernels

Authors: Nathael Da Costa, Cyrus Mostajeran, Juan-Pablo Ortega, Salem Said

JMLR 2024 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This work provides theoretical foundations for kernel methods with non-PD kernels and for their application to non-Euclidean data spaces. We then investigate the conditions under which a kernel is positively decomposable. We show that invariant kernels admit a positive decomposition on homogeneous spaces under tractable regularity assumptions. This makes them much easier to construct than positive definite kernels, providing a route for learning with kernels for non-Euclidean data. By the same token, this provides theoretical foundations for RKKS-based methods in general.
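For context, the "positive decomposition" referred to in the excerpt above is, in standard RKKS (reproducing kernel Krein space) terminology, a splitting of an indefinite kernel into a difference of two positive definite kernels. This definition is a standard fact of the field, not a quotation from the paper:

```latex
% A kernel k on a space X is positively decomposable if there exist
% positive definite kernels k_+ and k_- on X such that
k(x, y) = k_+(x, y) - k_-(x, y), \qquad x, y \in X.
```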
Researcher Affiliation | Academia | Division of Mathematical Sciences, School of Physical and Mathematical Sciences, Nanyang Technological University, 21 Nanyang Link, 637371, Singapore; Laboratoire Jean Kuntzmann, Université Grenoble-Alpes, Grenoble, 38400, France
Pseudocode | No | No pseudocode or algorithm blocks are present in the paper.
Open Source Code | No | The paper does not provide explicit statements or links indicating the release of source code for the described methodology.
Open Datasets | No | Figure 1: The Krein SVM algorithm (Loosli et al., 2016) applied on the hyperbolic plane H², with the geodesic Gaussian kernel k = exp(−λ d(·, ·)²). The data is sampled from a Riemannian Gaussian distribution (Said et al., 2018, 2022, 2023) centered at the origin of the Poincaré disc and is split into two classes according to geodesic decision boundaries (dotted curves in the figure).
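As a minimal sketch of the geodesic Gaussian kernel named in the figure caption: the snippet below builds the Gram matrix of k(x, y) = exp(−λ d(x, y)²) on the Poincaré disc, using the standard closed-form hyperbolic distance for that model. The Krein SVM step itself (Loosli et al., 2016) is not reproduced here, and the point coordinates are illustrative, not from the paper.

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance between two points in the open unit disc
    (Poincaré disc model of the hyperbolic plane H^2).
    Uses the standard formula d(u, v) = arccosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2)))."""
    diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * diff / denom)

def geodesic_gaussian_kernel(X, lam=1.0):
    """Gram matrix of k(x, y) = exp(-lam * d(x, y)^2).
    Note: this kernel is in general *not* positive definite on H^2,
    which is why the paper turns to positively decomposable kernels
    and Krein-space (RKKS) methods instead of classical SVMs."""
    n = len(X)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.exp(-lam * poincare_distance(X[i], X[j]) ** 2)
    return K

# Illustrative points inside the unit disc (hypothetical, not the paper's data)
X = np.array([[0.0, 0.0], [0.3, 0.1], [-0.5, 0.4]])
K = geodesic_gaussian_kernel(X, lam=0.5)
print(K.shape)               # (3, 3)
print(np.allclose(K, K.T))   # True: the Gram matrix is symmetric
```

The diagonal of K is exactly 1 since d(x, x) = 0; symmetry follows from d being a metric.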
Dataset Splits | No | The paper does not describe any dataset splits, as it focuses on theoretical contributions rather than empirical experiments with specific datasets.
Hardware Specification | No | The paper does not provide any specific hardware details used for running experiments.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers.
Experiment Setup | No | The paper does not contain specific experimental setup details, hyperparameters, or training configurations.