Pruning for GNNs: Lower Complexity with Comparable Expressiveness
Authors: Dun Ma, Jianguo Chen, Wenguo Yang, Suixiang Gao, Shengminjie Chen
ICML 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct both synthetic and real experiments, which demonstrate the superior performance of our proposed pruned framework and validate our theoretical findings. Experimental results validate our refinements, demonstrating competitive performance across benchmark datasets with improved efficiency. We conduct extensive experiments to evaluate the performance of the pruned frameworks. |
| Researcher Affiliation | Academia | 1 School of Advanced Interdisciplinary Sciences, University of Chinese Academy of Sciences 2 Academy of Mathematics and Systems Science, Chinese Academy of Sciences 3 School of Mathematical Sciences, University of Chinese Academy of Sciences 4 Zhongguancun Laboratory, Beijing, China 5 State Key Lab of Processors, Institute of Computing Technology, Chinese Academy of Sciences. Correspondence to: Wenguo Yang <EMAIL>. |
| Pseudocode | Yes | Algorithm 1: The 1-WL Algorithm; Algorithm 2: The k-WL Algorithm; Algorithm 3: The K-Path WL Algorithm; Algorithm 4: The K-Hop WL Algorithm; Algorithm 5: The PR WL Algorithm; Algorithm 6: The Pruned K-Path WL Algorithm; Algorithm 7: The Pruned K-Hop WL Algorithm; Algorithm 8: HCGCR: Hashed CGCR (WL Test Algorithm); Algorithm 9: Pseudocode for the Pruned WL Test Algorithm; Algorithm 10: Pseudocode for the K-Hop(Path) WL Test Algorithm; Algorithm 11: Pseudocode for the Pruned K-Hop(Path) WL Test Algorithm |
| Open Source Code | Yes | All the experimental materials are provided at https://anonymous.4open.science/r/PrunedGNN-AC61/README.md |
| Open Datasets | Yes | We conduct extensive experiments to evaluate the performance of the pruned frameworks. Q1: Do the pruned frameworks have the same expressive power as the original frameworks? Q2: Does the pruning improve the frameworks' performance? To verify the expressive power of the pruned frameworks, we empirically evaluate them on three simulation datasets: (1) EXP (Abboud et al., 2021), (2) SR25 (Balcilar et al., 2021), and (3) CSL, comparing them with their original frameworks (Murphy et al., 2019). We use node properties (such as single-source shortest path, eccentricity, and Laplacian feature) and graph property regression (connectivity, diameter, radius), as well as graph substructure counting (triangle, tailed triangle, star, and 4-cycle) to demonstrate expressive power. To verify whether pruning improves the frameworks' performance, we evaluate the pruned frameworks' performance on 8 real-world datasets: MUTAG (Debnath et al., 1991), DD (Dobson & Doig, 2003), PROTEINS (Dobson & Doig, 2003), PTC-MR (Toivonen et al., 2003), and IMDB-B (Yanardag & Vishwanathan, 2015) from the TU database, as well as QM9 (Ramakrishnan et al., 2014; Wu et al., 2018) and ZINC (Dwivedi et al., 2020) for molecular property prediction. |
| Dataset Splits | No | The paper mentions evaluating models on various datasets and notes 'training time' in Table 11, which implies the use of data splits for training and evaluation. However, it does not explicitly state specific percentages (e.g., 80/10/10), sample counts, or refer to a particular predefined splitting methodology in the main text required for reproduction. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper does not explicitly mention specific software dependencies with version numbers (e.g., Python 3.8, PyTorch 1.9) required to reproduce the experiments. |
| Experiment Setup | Yes | The parameter settings for the number of layers are configured as follows: (1) WL (MP-GNN): 3 layers, 7 layers, and 10 layers. (2) Pruned WL: [1,2] layers, [1,2,4] layers, [1,2,3,4] layers. (3) K-Hop: K=2, 3 layers; K=2, 5 layers. (4) Pruned K-Hop: K=2, 3 layers; K=2, 5 layers. (5) K-Path: K=2, 3 layers; K=2, 5 layers. (6) Pruned K-Path: K=2, 3 layers; K=2, 5 layers. |
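The Pseudocode row above lists several variants of the Weisfeiler-Lehman (WL) test. For readers unfamiliar with the base procedure, the following is a minimal, generic sketch of 1-WL color refinement (Algorithm 1 in the paper's numbering). It is not the authors' implementation; the function and variable names (`wl_colors`, `adj`) are illustrative only.

```python
from collections import Counter

def wl_colors(adj, num_iters=3):
    """Run 1-WL color refinement on a graph given as an adjacency list.

    adj: dict mapping each node to a list of its neighbors.
    Returns the multiset (Counter) of final node colors; two graphs
    are 1-WL-indistinguishable only if these multisets coincide.
    """
    # Initialize every node with the same color.
    colors = {v: 0 for v in adj}
    for _ in range(num_iters):
        # Refine: a node's new color combines its current color with
        # the sorted multiset of its neighbors' colors.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Relabel distinct signatures to compact integer colors.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())

# Example: a 4-cycle and a 4-node path are distinguished by 1-WL,
# since the path's endpoints receive a different color than its
# interior nodes after one refinement round.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

The K-Hop, K-Path, and pruned variants evaluated in the paper modify the neighborhood aggregated in the `signatures` step; this sketch shows only the common refinement skeleton they share.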