GBGC: Efficient and Adaptive Graph Coarsening via Granular-ball Computing
Authors: Shuyin Xia, Guan Wang, Gaojie Xu, Sen Zhao, Guoyin Wang
IJCAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we design experiments to evaluate our GBGC and answer the following research questions (RQs): RQ1: Does our method achieve better performance compared to the state-of-the-art methods? RQ2: How does GBGC perform with respect to time consumption? RQ3: What impact do the key components have on GBGC's performance? RQ4: How do hyperparameters affect GBGC's performance? RQ5: How does GBGC perform in visualization? |
| Researcher Affiliation | Academia | Shuyin Xia1, Guan Wang1, Gaojie Xu1, Sen Zhao2 and Guoyin Wang3 1 Chongqing Key Laboratory of Computational Intelligence, Key Laboratory of Big Data Intelligent Computing, Key Laboratory of Cyberspace Big Data Intelligent Security, Ministry of Education, School of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing, China 2 Chongqing Key Laboratory of Computational Intelligence, Key Laboratory of Big Data Intelligent Computing, Chongqing University of Posts and Telecommunications, Chongqing, China 3 National Center for Applied Mathematics in Chongqing, Chongqing Normal University, Chongqing 401331, China |
| Pseudocode | No | The paper mentions 'a non-adaptive GBGC (see Algorithm 4 in Appendix for algorithm design)' in Section 4.5, but no pseudocode or algorithm block is present in the main body of the paper. |
| Open Source Code | Yes | Code is available from https://anonymous.4open.science/r/GBGC. Supplementary materials are available from https://github.com/Wangwangguanguan/Supplementary-materials.git. |
| Open Datasets | Yes | Datasets: We use several standard graph classification datasets, which are shown in Table 1. ... MUTAG [Debnath et al., 1991] ... PROTEINS [Borgwardt et al., 2005] ... IMDB-BINARY [Cai and Wang, 2018] ... NCI109 [Morris et al., 2020b] ... DHFR [Sutherland et al., 2003] ... BZR [Vincent-Cuaz et al., 2021] ... Tox21 AR-LBD testing [Cooper and Schürer, 2019] ... OVCAR-8H [Morris et al., 2020a] ... P388H [Morris et al., 2020a] ... SF-295H [Morris et al., 2020a] ... DD [Morris et al., 2020a] |
| Dataset Splits | Yes | To evaluate comparison models thoroughly, we used multiple random seeds and applied cross-validation with repeated runs. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory) used for running the experiments. The experimental setup focuses on datasets, baselines, and evaluation metrics. |
| Software Dependencies | No | The paper does not provide specific software dependency details (e.g., library or solver names with version numbers) needed to replicate the experiment. |
| Experiment Setup | No | The paper mentions 'How do hyperparameters affect GBGC's performance?' and 'the identical coarsening parameter ra, as determined by the adaptive coarsening process of GBGC, is utilized across the VNGC, VEGC, MGC, SGC, and KGC.' It also states 'To evaluate comparison models thoroughly, we used multiple random seeds and applied cross-validation with repeated runs.' However, it does not provide specific values for other hyperparameters (e.g., learning rate, batch size, optimizer settings) used in the experiments. |