GNNs Getting ComFy: Community and Feature Similarity Guided Rewiring

Authors: Celia Rubio-Madrigal, Adarsh Jamadandi, Rebekka Burkholz

ICLR 2025

Reproducibility Variable Result LLM Response
Research Type: Experimental. "Extensive experiments confirm the effectiveness of these strategies and support our theoretical insights. ... We conduct a comprehensive set of experiments for all proposed algorithms on various benchmark datasets. ... Table 1: Accuracy on node classification comparing different rewiring schemes. ... Table 4 reports the computational efficiency compared to baselines, in seconds, when adding or deleting 50 edges."
Researcher Affiliation: Academia. "Celia Rubio-Madrigal 1, Adarsh Jamadandi 1,2, Rebekka Burkholz 1. 1 CISPA Helmholtz Center for Information Security, 2 Universität des Saarlandes. Equal contribution. Corresponding email: EMAIL"
Pseudocode: Yes. "Algorithm 1 Proxy Spectral Gap based Greedy Graph Addition (PROXYADDMIN) ... Algorithm 2 Proxy Spectral Gap based Greedy Graph Sparsification (PROXYDELMIN) ... Algorithm 3 HigherComMa: Increasing community structure. ... Algorithm 4 LowerComMa: Decreasing community structure. ... Algorithm 5 FeaSt: Maximizing feature similarity. ... Algorithm 6 ComFy: Maximizing feature similarity across communities."
Open Source Code: Yes. "Our code is available here: https://github.com/RelationalML/ComFy."
Open Datasets: Yes. "We test our algorithms on a variety of homophilic and heterophilic graphs: Cora (McCallum et al., 2000), Citeseer (Sen et al., 2008), Pubmed (Namata et al., 2012), Cornell, Texas, Wisconsin, Chameleon, Squirrel, and Actor (Platonov et al., 2023c). ... We present results on CS, Physics, and Photo (Shchur et al., 2019), available as PyTorch Geometric datasets."
Dataset Splits: Yes. "For datasets Cora, Citeseer, Pubmed, Cornell, Texas, Wisconsin, Chameleon, Squirrel and Actor we use a 60/20/20 split for train/test/validation respectively. ... For datasets Roman-empire, Amazon-ratings and Minesweeper we use the code base of the authors Platonov et al. (2023c), where the datasets are split 50/25/25 for train/test/validation respectively."
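The 60/20/20 train/test/validation split quoted above can be sketched in plain Python; the function name `make_split` and the fixed seed are illustrative, not taken from the paper's code base.

```python
import random

def make_split(num_nodes, frac_train=0.6, frac_test=0.2, seed=0):
    """Shuffle node indices and cut them into train/test/validation
    portions (60/20/20 by default, matching the quoted split)."""
    idx = list(range(num_nodes))
    random.Random(seed).shuffle(idx)
    n_train = int(frac_train * num_nodes)
    n_test = int(frac_test * num_nodes)
    train = idx[:n_train]
    test = idx[n_train:n_train + n_test]
    val = idx[n_train + n_test:]          # remaining ~20% for validation
    return train, test, val

# Cora has 2708 nodes, so this yields 1624/541/543 disjoint index sets.
train, test, val = make_split(2708)
```

In practice the split would be stored as boolean masks over the node set, as PyTorch Geometric does, but the index arithmetic is the same.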
Hardware Specification: No. "The authors gratefully acknowledge the Gauss Centre for Supercomputing e.V. for funding this project by providing computing time on the GCS Supercomputer JUWELS at Jülich Supercomputing Centre (JSC)."
Software Dependencies: No. "We use PyTorch Geometric (Fey & Lenssen, 2019) and Deep Graph Library (DGL) (Wang et al., 2019) for all our experiments."
Experiment Setup: Yes. "The hyperparameters are tuned on the validation set. Our backbone model is a 2-layered GCN (Kipf & Welling, 2017). ... Our backbone model here is a 5-layered GCN ... We report the hyperparameters such as the Normalized Mutual Information (NMI) between the cluster labels and the ground truth labels after community detection (Blondel et al., 2008) before and after rewiring the graph to understand how it affects the community structure-node label alignment. ... The hyperparameters used in the experiments are given in Table 15. Table 15: GCN hyperparameters used in the experiments. Dataset LR Dropout Hidden Dimension Cora 0.01 0.41 128 ..."
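The 2-layered GCN backbone (Kipf & Welling, 2017) cited above amounts to two rounds of normalized-adjacency propagation. The following NumPy sketch shows the forward pass only; it is illustrative and omits training, dropout, and the paper's tuned hyperparameters, though the hidden dimension of 128 matches the Cora row of Table 15.

```python
import numpy as np

def gcn_forward(A, X, W1, W2):
    """Two-layer GCN forward pass: Z = A_norm * relu(A_norm * X * W1) * W2,
    where A_norm = D^{-1/2} (A + I) D^{-1/2} (Kipf & Welling, 2017)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # degrees of A + I
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    H = np.maximum(A_norm @ X @ W1, 0.0)      # layer 1 + ReLU
    return A_norm @ H @ W2                    # layer 2 (class logits)

# Toy example: 4 nodes on a path graph, 3 input features, 2 classes,
# hidden dimension 128 as in the Cora configuration.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
logits = gcn_forward(A, X,
                     rng.normal(size=(3, 128)) * 0.1,
                     rng.normal(size=(128, 2)) * 0.1)
```

A real run would use PyTorch Geometric's `GCNConv` layers with dropout and a cross-entropy loss; the sketch only makes the propagation structure of the backbone explicit.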