SOLA-GCL: Subgraph-Oriented Learnable Augmentation Method for Graph Contrastive Learning
Authors: Tianhao Peng, Xuhong Li, Haitao Yuan, Yuchen Li, Haoyi Xiong
AAAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments have been conducted on various graph learning applications, ranging from social networks to molecules, under semi-supervised learning, unsupervised learning, and transfer learning settings to demonstrate the superiority of our proposed approach. ... Extensive experiments are conducted on a variety of graph classification datasets with semi-supervised, unsupervised, and transfer learning settings, showcasing the robustness and effectiveness of our SOLA-GCL framework on graph classification tasks. |
| Researcher Affiliation | Collaboration | 1Beihang University 2Baidu Inc. 3Nanyang Technological University 4Shanghai Jiao Tong University |
| Pseudocode | No | The paper describes the proposed method using mathematical formulations and descriptive text for each component, such as the subgraph augmentation selector and subgraph view generator (Equations 2-10). However, it does not include a dedicated section or figure explicitly labeled as 'Pseudocode' or 'Algorithm' with structured steps. |
| Open Source Code | No | The paper does not contain an explicit statement about releasing the source code for SOLA-GCL, nor does it provide a link to a code repository in the main text or references. It mentions RDKit as a software suite, but this is a third-party tool, not the authors' implementation. |
| Open Datasets | Yes | Extensive experiments are conducted on a variety of graph classification datasets with semi-supervised, unsupervised, and transfer learning settings... We trained a view generator on the MUTAG dataset (Debnath et al. 1991; Kriege and Mutzel 2012)... For semi-supervised learning, we perform the semi-supervised graph classification experiments on TUDataset (Morris et al. 2020). |
| Dataset Splits | No | The paper mentions using '10% Data' in Table 3 for semi-supervised learning, indicating the proportion of labeled data. However, it does not provide specific percentages or counts for training, validation, and test splits for the overall datasets used in experiments. It refers to 'various datasets' without detailing their splits. |
| Hardware Specification | No | The paper does not specify any particular hardware (e.g., GPU models, CPU types, or memory) used for conducting the experiments. |
| Software Dependencies | No | The paper mentions using GIN and ResGCN as backbone GNNs and RDKit for molecular graphs, but it does not specify version numbers for these or any other software libraries or programming languages used. |
| Experiment Setup | No | The paper specifies the GNN backbone models (GIN and ResGCN) and mentions 'L' layers for the GNN and 'τ' for the temperature parameter in the contrastive loss. However, it does not provide concrete hyperparameter values such as learning rates, batch sizes, number of epochs, or details about the specific optimizers used for training SOLA-GCL. |
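For context on the unreported temperature parameter τ: graph contrastive learning frameworks in this line of work typically optimize an NT-Xent objective over two augmented views. The sketch below shows that standard loss so readers can see exactly where τ enters; the function name, the `tau=0.5` default, and the use of cosine similarity are illustrative assumptions, not details confirmed by the SOLA-GCL paper.

```python
import numpy as np

def nt_xent_loss(z1, z2, tau=0.5):
    """Standard NT-Xent contrastive loss over two views of a batch.

    z1, z2: (N, d) embeddings of the two augmented views of N graphs.
    tau: temperature (assumed default; the paper does not report a value).
         Smaller tau sharpens the softmax over similarities.
    """
    # L2-normalize so dot products become cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)       # (2N, d) stacked views
    sim = z @ z.T / tau                        # (2N, 2N) scaled similarities
    np.fill_diagonal(sim, -np.inf)             # exclude self-similarity
    # The positive for sample i is its counterpart in the other view.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

Reproducing results that depend on this loss requires the exact τ, batch size, and similarity function; the table's "No" for Experiment Setup reflects that none of these are reported.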