Online Optimization over Riemannian Manifolds

Authors: Xi Wang, Zhipeng Tu, Yiguang Hong, Yingyi Wu, Guodong Shi

JMLR 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In addition, numerical studies on problems defined on symmetric positive definite matrix manifold, hyperbolic spaces, and Grassmann manifolds are provided to validate our theoretical findings, using synthetic and real-world data."
Researcher Affiliation | Academia | Xi Wang (EMAIL) and Zhipeng Tu (EMAIL): Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, P. R. China, and Australian Center for Robotics, School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006, Australia; Yiguang Hong (EMAIL): Shanghai Research Institute for Intelligent Autonomous Systems, Tongji University, Shanghai 201210, P. R. China; Yingyi Wu (EMAIL): Department of Mathematics, University of Chinese Academy of Sciences, Beijing 100040, P. R. China; Guodong Shi (EMAIL): Australian Center for Robotics, School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006, Australia
Pseudocode | Yes | Algorithm 1: Riemannian Online Gradient Descent Algorithm (R-OGD). Input: manifold M, time horizon T, step sizes (or schedule) {α_t}. Output: {x_t}, t = 1, ..., T. For t = 1 to T: play x_t and observe the function f_t; update x̃_{t+1} = exp_{x_t}(−α_t ∇f_t(x_t)) and x_{t+1} = P_K(x̃_{t+1}), where P_K is the Riemannian projection onto K, i.e., P_K(x) := argmin_{y∈K} d(x, y); return x_{t+1} and suffer the loss f_t(x_t).
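As a hedged illustration of the R-OGD update, here is a minimal sketch on the unit sphere (a hypothetical example for clarity; the paper's experiments use SPD, hyperbolic, and Grassmann manifolds, and the projection step P_K is omitted by taking K to be the whole sphere):

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def sphere_grad(x, egrad):
    """Riemannian gradient: project the Euclidean gradient onto the tangent space at x."""
    return egrad - np.dot(egrad, x) * x

def r_ogd(egrad_fns, x0, step):
    """R-OGD sketch: at round t, play x_t, observe f_t, step along -grad f_t(x_t)."""
    x = x0 / np.linalg.norm(x0)
    iterates = [x]
    for t, egrad_fn in enumerate(egrad_fns, start=1):
        g = sphere_grad(x, egrad_fn(x))          # Riemannian gradient of f_t at x_t
        x = sphere_exp(x, -step(t) * g)          # x_{t+1} = exp_{x_t}(-alpha_t * grad)
        x /= np.linalg.norm(x)                   # numerical guard against drift
        iterates.append(x)
    return iterates
```

For example, with linear losses f_t(x) = −⟨a, x⟩ pulling toward a fixed direction a and a 1/√t step schedule, the iterates converge to a, as one expects for this g-convex problem.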
Open Source Code | Yes | "For reproduction of the results, all the source codes are accessible online." (https://github.com/RiemannianOCO/experiments)
Open Datasets | Yes | "We examine Algorithms 1, 2 and 3 and the R-OZO on three real-world data sets from the openml database, including iris, eeg-eye-state and waveform-5000." (https://www.openml.org/)
Dataset Splits | No | The paper describes an online PCA problem where data arrive in batches of N points over T learning rounds. It gives the total number of samples (T for iris, eeg-eye-state and waveform-5000) but does not partition the datasets into explicit training, validation, or test splits in the conventional sense. For instance, for iris it states T = 150 learning rounds, implying sequential processing rather than a fixed split.
Hardware Specification | Yes | "The code is built with the help of the Pymanopt package (Townsend et al., 2016) and all experiments are performed in Python 3.8 on a 3.4 GHz AMD Ryzen5 machine with 16GB RAM."
Software Dependencies | No | The paper mentions Python 3.8 and the Pymanopt package. While Python is pinned to version 3.8, Pymanopt, a key software component, is cited without a version number.
Experiment Setup | Yes | "We consider the online Fréchet mean problem where [n, N, T] = [100, 10, 10000]... We examine Algorithms 1, 2 and 3 for strongly g-convex cases with µ = 1... We take C0 = 2125 for the R-BAN algorithm and C0 = 170 for the R-2-BAN algorithm... In the experiment, we set δ = (7/3)·√(n²C²(1 + log T)/T)... we opted to use τ = δ/r in our experiment. For the R-OZO, we take σ = 0.001... Table 2: Descriptions and settings of testing data sets for online PCA... experimental parameters for iris: [n, N, T, d, µ, θ] = [4, 1, 150, 2, 2, 1]"
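To make the online Fréchet mean setting concrete, here is a minimal sketch of R-OGD on the 1-D SPD manifold (positive reals under the affine-invariant metric d(x, y) = |log x − log y|); the 1/(t + 1) step schedule is an assumption chosen for illustration, not necessarily the paper's schedule:

```python
import numpy as np

def r_ogd_frechet_mean_1d(samples, step):
    """R-OGD for the online Fréchet mean of positive scalars.

    Loss at round t: f_t(x) = 0.5 * (log x - log p_t)^2, whose Riemannian
    gradient under the affine-invariant metric is x * (log x - log p_t).
    The exponential-map update exp_x(-alpha * grad) then reads
        x <- x * exp(-alpha_t * (log x - log p_t)),
    i.e. a convex combination of log x and log p_t in log coordinates.
    """
    x = float(samples[0])  # initialize at the first observed point
    for t, p in enumerate(samples[1:], start=1):
        alpha = step(t)
        x = x * np.exp(-alpha * (np.log(x) - np.log(p)))
    return x
```

With the schedule α_t = 1/(t + 1), the update reduces to a running average in log coordinates, so the iterate recovers the geometric mean of the stream, which is exactly the Fréchet mean for this metric.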