Spectro-Riemannian Graph Neural Networks

Authors: Karish Grover, Haiyang Yu, Xiang Song, Qi Zhu, Han Xie, Vassilis Ioannidis, Christos Faloutsos

ICLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Empirical evaluation across eight homophilic and heterophilic datasets demonstrates the superiority of CUSP in node classification and link prediction tasks, with a gain of up to 5.3% over state-of-the-art models.
Researcher Affiliation | Collaboration | ¹Carnegie Mellon University, ²Texas A&M University, ³Amazon; EMAIL, {haiyang}@tamu.edu, EMAIL
Pseudocode | Yes | Algorithm 1: Product manifold signature estimation and curvature initialisation
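To make the idea behind a product-manifold signature estimation concrete, here is an illustrative sketch: given scalar edge-curvature estimates (e.g. from Ollivier-Ricci curvature), allocate the total manifold dimension across hyperbolic, Euclidean, and spherical factors according to the fraction of negatively, near-zero, and positively curved edges. The function name, thresholds, and allocation rule are our own assumptions, not the paper's exact Algorithm 1.

```python
def estimate_signature(edge_curvatures, total_dim=48, tol=0.1):
    """Sketch: split `total_dim` dimensions among hyperbolic / euclidean /
    spherical factors in proportion to the sign of edge curvatures.
    NOT the paper's exact Algorithm 1 -- an illustrative assumption."""
    neg = sum(1 for k in edge_curvatures if k < -tol)   # hyperbolic-like edges
    pos = sum(1 for k in edge_curvatures if k > tol)    # spherical-like edges
    total = max(len(edge_curvatures), 1)
    # Allocate dimensions proportionally (rounding down);
    # the remainder goes to the Euclidean factor.
    d_hyp = int(total_dim * neg / total)
    d_sph = int(total_dim * pos / total)
    d_euc = total_dim - d_hyp - d_sph
    return {"hyperbolic": d_hyp, "euclidean": d_euc, "spherical": d_sph}
```

For example, a graph whose edges are mostly negatively curved would receive a predominantly hyperbolic signature under this rule.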
Open Source Code | Yes | The code is available at: https://github.com/amazon-science/cusp.
Open Datasets | Yes | We evaluate CUSP on the Node Classification (NC) and Link Prediction (LP) tasks using eight benchmark datasets. These include (a) Homophilic datasets: (i) the citation networks Cora, Citeseer, and PubMed (Sen et al., 2008; Yang et al., 2016); and (b) Heterophilic datasets: (i) the Wikipedia graphs Chameleon and Squirrel (Rozemberczki et al., 2021), (ii) the Actor co-occurrence network (Tang et al., 2009), and (iii) the WebKB webpage graphs Texas and Cornell.
Dataset Splits | Yes | For the transductive LP task, we randomly split edges into 85%/5%/10% for the training, validation, and test sets, while for the transductive NC task, we use a 60%/20%/20% split.
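The 85/5/10 edge-level split described above can be reproduced with a simple randomized partition. This is a minimal sketch assuming a plain edge list; the function name and seeding convention are ours, not taken from the CUSP codebase.

```python
import random

def split_edges(edges, train_frac=0.85, val_frac=0.05, seed=0):
    """Randomly partition an edge list into train/val/test subsets
    (default 85%/5%/10%, matching the transductive LP split above)."""
    rng = random.Random(seed)       # fixed seed for reproducibility
    edges = list(edges)
    rng.shuffle(edges)
    n = len(edges)
    n_train = int(train_frac * n)
    n_val = int(val_frac * n)
    return (edges[:n_train],
            edges[n_train:n_train + n_val],
            edges[n_train + n_val:])
```

In practice, link-prediction pipelines also sample an equal number of negative (non-existent) edges per split, which this sketch omits.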
Hardware Specification | No | The paper does not explicitly mention specific hardware details such as GPU/CPU models, processor types, or memory amounts used for its experiments. The 'MORE EXPERIMENTAL SETTINGS' section (Appendix 7.6.5) focuses primarily on hyperparameters.
Software Dependencies | No | The paper mentions using the Geoopt library but does not specify a version number. It also refers to algorithms such as the Sinkhorn algorithm and the Hungarian algorithm, but not to specific software libraries with version numbers.
Experiment Setup | Yes | For all experiments, we choose the total manifold dimension as d_M = 48 and the learning rate as 4e-3. We use the filter bank Ω_{P^{d_M}} = {Z^I_{P^{d_M}}, Z^{(1)}_{P^{d_M}}, Z^{(2)}_{P^{d_M}}, ..., Z^{(L)}_{P^{d_M}}} with L = 10. For the GPR weights, we experiment with different initializations, α ∈ {0.1, 0.3, 0.5, 0.9}. Hyperparameter settings are listed in Appendix 7.6.5.
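The α values above suggest a personalized-PageRank-style initialization of the GPR weights, as introduced by GPRGNN (Chien et al.): γ_k = α(1 − α)^k for k < L and γ_L = (1 − α)^L, so the weights sum to 1. Whether CUSP uses exactly this scheme is an assumption on our part; the sketch below shows the standard form.

```python
def gpr_weights(alpha, L=10):
    """PPR-style GPR weight initialization (GPRGNN convention):
    gamma_k = alpha * (1 - alpha)**k for k < L, gamma_L = (1 - alpha)**L.
    Assumed, not confirmed, to match CUSP's initialization."""
    gammas = [alpha * (1 - alpha) ** k for k in range(L)]
    gammas.append((1 - alpha) ** L)  # residual mass on the last hop
    return gammas
```

By construction the weights telescope to 1: α Σ_{k=0}^{L−1} (1 − α)^k + (1 − α)^L = 1 − (1 − α)^L + (1 − α)^L = 1, for any α in (0, 1].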