Implicit Riemannian Optimism with Applications to Min-Max Problems

Authors: Christophe Roux, David Martínez-Rubio, Sebastian Pokutta

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We validate our theoretical results by running experiments on the robust Karcher mean problem in the symmetric positive definite manifold and the hyperbolic space, see Appendix E. ... We run RIODAPRGD on (60) with a fixed number of 3 PRGD steps per subroutine... For the experiments, RIODAPRGD is run for 1k iterations... The following two figures show the convergence behavior of RIODAPRGD in terms of the duality gap for experiments run in both the hyperbolic space H^5000 and the SPD manifold S_+^100, each with n = 50 points.
Researcher Affiliation | Academia | ¹Zuse Institute Berlin, Germany; ²Technische Universität Berlin, Germany; ³Carlos III University of Madrid, Spain. Correspondence to: Christophe Roux <EMAIL>, David Martínez-Rubio <martinez-rubio@zib.de>.
Pseudocode | Yes | Algorithm 1: Riemannian Implicit Optimistic Online Gradient Descent (RIOD) ... Algorithm 2: Riemannian Implicit Optimistic Gradient Descent-Ascent (RIODA)
Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. It mentions: "We implement the experiments using the Pymanopt Library (Townsend et al., 2016)", which refers to a third-party library used, not the authors' own implementation code.
Open Datasets | No | The paper does not provide concrete access information for a publicly available or open dataset. Instead, it describes how data instances were generated: "First, we generate a random point y on the manifold using the manifold.random_point() function. Then, we generate the centers y_i = Exp_y(v_i/‖v_i‖_y) by sampling random tangent vectors v_i ∈ T_yM using the manifold.random_tangent_vector() function."
Dataset Splits | No | The paper describes how instances of the robust Karcher mean problem are generated, but it does not specify any training, validation, or test splits. The relevant text is: "First, we generate a random point y on the manifold using the manifold.random_point() function. Then, we generate the centers y_i = Exp_y(v_i/‖v_i‖_y) by sampling random tangent vectors v_i ∈ T_yM using the manifold.random_tangent_vector() function."
Hardware Specification | No | The paper does not provide specific hardware details. It describes the experimental setting only as "hyperbolic space H^5000 and the SPD manifold S_+^100, each with n = 50 points", which refers to problem parameters rather than computational hardware.
Software Dependencies | No | The paper mentions: "We implement the experiments using the Pymanopt Library (Townsend et al., 2016)". While Pymanopt is a specific software library, no version number is given for it, nor are any other software dependencies listed with version numbers.
Experiment Setup | Yes | We run RIODAPRGD on (60) with a fixed number of 3 PRGD steps per subroutine, which means that each iteration of RIODAPRGD requires 12 PRGD steps. Setting R = 0.01 and γ = ζD ensures that the problem is strongly g-concave in (y_1, ..., y_n), as our bound on D is loose. For the experiments, RIODAPRGD is run for 1k iterations, corresponding to 12k gradient oracle calls. The step size λ ∈ {10^-1, 10^-2, 10^-3} of PRGD and the proximal parameter η ∈ {10^-1, 10^-2} are tuned via grid search to find the best hyperparameters.
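The instance generation quoted in the Open Datasets row can be sketched end to end. The following is a minimal numpy-only sketch, not the authors' code: the function names mirror Pymanopt's `random_point`/`random_tangent_vector` naming but are simplified stand-ins, the affine-invariant metric on the SPD manifold is assumed, sampled tangent vectors are mapped back to the manifold via the exponential map, and the small `dim = 10` (the paper uses S_+^100 with n = 50) is chosen only to keep the example fast.

```python
import numpy as np

def _sym_expm(S):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, Q = np.linalg.eigh(S)
    return (Q * np.exp(w)) @ Q.T

def _sym_sqrtm(S):
    """Square root and inverse square root of an SPD matrix."""
    w, Q = np.linalg.eigh(S)
    return (Q * np.sqrt(w)) @ Q.T, (Q / np.sqrt(w)) @ Q.T

def random_point(dim, rng):
    """Stand-in for manifold.random_point(): a random SPD matrix."""
    A = rng.standard_normal((dim, dim))
    return _sym_expm((A + A.T) / 2)

def random_tangent_vector(dim, rng):
    """Stand-in for manifold.random_tangent_vector(y): tangent vectors to
    the SPD manifold are symmetric matrices, for any base point y."""
    V = rng.standard_normal((dim, dim))
    return (V + V.T) / 2

def norm(y, v):
    """Affine-invariant norm ||v||_y = ||y^{-1/2} v y^{-1/2}||_F."""
    _, y_isqrt = _sym_sqrtm(y)
    return np.linalg.norm(y_isqrt @ v @ y_isqrt)

def exp(y, v):
    """Exponential map Exp_y(v) = y^{1/2} expm(y^{-1/2} v y^{-1/2}) y^{1/2}."""
    y_sqrt, y_isqrt = _sym_sqrtm(y)
    return y_sqrt @ _sym_expm(y_isqrt @ v @ y_isqrt) @ y_sqrt

rng = np.random.default_rng(0)
n, dim = 50, 10  # the paper uses dim = 100; kept small here for speed

# Random base point y, then n unit-norm tangent directions mapped to centers.
y = random_point(dim, rng)
centers = [exp(y, v / norm(y, v))
           for v in (random_tangent_vector(dim, rng) for _ in range(n))]
```

In the actual experiments these operations are delegated to Pymanopt's manifold objects rather than implemented by hand.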
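The hyperparameter tuning described in the Experiment Setup row can be sketched as a plain exhaustive grid search. Here `run_riodaprgd` is a hypothetical placeholder for the authors' solver (whose code is not released); it is assumed to return the final duality gap, and the grids match the quoted values of λ and η.

```python
import itertools

step_sizes = [1e-1, 1e-2, 1e-3]  # PRGD step sizes (lambda) from the quoted grid
prox_params = [1e-1, 1e-2]       # proximal parameters (eta) from the quoted grid

def run_riodaprgd(lam, eta, iterations=1000):
    """Hypothetical stand-in for the authors' RIODAPRGD solver.

    A real implementation would run 1k RIODA iterations with 3 PRGD steps per
    subroutine (12 PRGD steps per iteration) and return the final duality gap;
    here we return a dummy score just to make the sketch runnable.
    """
    return lam + eta  # dummy value for illustration only

# Exhaustive grid search: keep the (lambda, eta) pair with the smallest gap.
best_lam, best_eta = min(itertools.product(step_sizes, prox_params),
                         key=lambda pair: run_riodaprgd(*pair))
```

With 6 grid points and 12k gradient oracle calls per run, the full search costs 72k oracle calls, which is consistent with treating the grid search as a cheap outer loop around the solver.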