Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Riemannian Bilevel Optimization
Authors: Jiaxiang Li, Shiqian Ma
JMLR 2025 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Numerical experiments on robust optimization on Riemannian manifolds are presented to show the applicability and efficiency of the proposed methods. |
| Researcher Affiliation | Academia | Jiaxiang Li EMAIL Department of Electrical and Computer Engineering University of Minnesota, Twin Cities Minneapolis, MN 55455, USA Shiqian Ma EMAIL Department of Computational Applied Math and Operations Research Rice University Houston, TX 77005, USA |
| Pseudocode | Yes | Algorithm 1: Algorithm for Riemannian (deterministic) Bilevel Optimization (RieBO). Input: K, T, N (steps for conjugate gradient), stepsizes {αk, βk}, initializations x0 ∈ M, y0 ∈ N. for k = 0, 1, 2, ..., K − 1 do |
| Open Source Code | Yes | Our code is publicly available at https://github.com/JasonJiaxiangLi/Manifold_bilevel. |
| Open Datasets | Yes | Following Li et al. (2020a); Han et al. (2024), we consider 5-ways 5-shots meta learning over the MiniImageNet dataset with four-layer CNN and with the kernels setting to be on the Grassmannian manifold, and test the accuracy over 200 tasks. |
| Dataset Splits | Yes | Following Li et al. (2020a); Han et al. (2024), we consider 5-ways 5-shots meta learning over the MiniImageNet dataset with four-layer CNN and with the kernels setting to be on the Grassmannian manifold, and test the accuracy over 200 tasks. |
| Hardware Specification | No | No specific hardware details (like GPU/CPU models, cloud instances) were mentioned in the paper. |
| Software Dependencies | No | No specific software versions (e.g., Python 3.8, PyTorch 1.9) are provided in the paper. |
| Experiment Setup | Yes | The algorithm is terminated with K = 200 rounds of outer iterations, and the inner iteration is also taken to be T = 200 (the value at which we observe good inner-iteration convergence). We take αk = 10^-2 and βk = 10^-1. |
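To make the quoted loop structure and hyperparameters concrete, here is a minimal sketch of a double-loop Riemannian bilevel iteration on a toy problem. It is an assumption-laden illustration, not the paper's Algorithm 1: both manifolds are taken to be unit spheres, the lower-level objective is a hypothetical g(x, y) = ||y − x||²/2, and the paper's conjugate-gradient hypergradient estimator is replaced by a user-supplied outer gradient. Only the loop layout (K outer rounds, T inner Riemannian gradient steps, stepsizes αk = 10⁻², βk = 10⁻¹) follows the reported setup.

```python
import numpy as np

def retract(p, v):
    """Retraction on the unit sphere: take an ambient step, then renormalize."""
    q = p + v
    return q / np.linalg.norm(q)

def proj(p, g):
    """Project an ambient-space gradient onto the tangent space at p."""
    return g - np.dot(p, g) * p

def rie_bo_sketch(f_grad_x, g_grad_y, x0, y0, K=200, T=200,
                  alpha=1e-2, beta=1e-1):
    """Double-loop sketch: T inner Riemannian GD steps on y per outer step on x.

    f_grad_x, g_grad_y are hypothetical callables returning ambient gradients
    of the upper- and lower-level objectives; this is not the paper's
    hypergradient computation.
    """
    x, y = x0.copy(), y0.copy()
    for _ in range(K):
        # Inner loop: approximately solve the lower-level problem in y.
        for _ in range(T):
            y = retract(y, -beta * proj(y, g_grad_y(x, y)))
        # Outer step: Riemannian descent in x using the supplied gradient.
        x = retract(x, -alpha * proj(x, f_grad_x(x, y)))
    return x, y
```

With g(x, y) = ||y − x||²/2 the inner loop pulls y toward x on the sphere, and an outer gradient x − y pulls x toward the inner solution, so both iterates meet at a common unit-norm point.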