GraphBridge: Towards Arbitrary Transfer Learning in GNNs
Authors: Li Ju, Xingyi Yang, Qi Li, Xinchao Wang
ICLR 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical validation, conducted over 16 datasets representative of these scenarios, confirms the framework's capacity for task- and domain-agnostic transfer learning within graph-like data, marking a significant advancement in the field of GNNs. |
| Researcher Affiliation | Academia | Li Ju, Xingyi Yang, Qi Li, Xinchao Wang National University of Singapore EMAIL, EMAIL |
| Pseudocode | No | The paper describes methods textually and mathematically (e.g., equations for GSST and GMST), but it does not contain a clearly labeled 'Pseudocode' or 'Algorithm' block. |
| Open Source Code | Yes | Code is available at https://github.com/jujulili888/GraphBridge. |
| Open Datasets | Yes | The datasets employed in our experiments can be categorized based on task levels: graph-level tasks consist of ZINC-full, BACE, BBBP, ClinTox, HIV, SIDER, Tox21, MUV, ToxCast, a series of molecular graph datasets; node-level tasks include ogbn-arxiv, Cora, CiteSeer, PubMed, Amazon-Computers, Flickr, encompassing node classification datasets related to citation networks, product ranking networks, and social networks; point cloud tasks involve ModelNet10, a 10-class point cloud classification dataset. (Many of these are cited with proper attribution in Appendix A.1.) |
| Dataset Splits | Yes | For the task of point cloud classification, we adopt the ModelNet10 dataset (Wu et al., 2015)... Specifically, the ModelNet10 dataset contains 4,899 CAD models of 10 man-made object categories, of which 3,991 CAD models are used for training and 908 CAD models are for testing (ModelNet40, a 40-class classification dataset, comprises 9,843 training and 2,468 test CAD models). |
| Hardware Specification | No | The paper discusses 'significant training efforts and substantial memory resources' and 'memory-efficient attribute' of the proposed method, but does not provide specific details about the hardware (e.g., GPU/CPU models, memory amounts) used for running the experiments. |
| Software Dependencies | No | The paper mentions various GNN models and pre-training methods but does not provide specific software dependencies (e.g., libraries, frameworks) along with their version numbers that would be necessary to replicate the experiments. |
| Experiment Setup | Yes | In the Graph2Graph task, we employ a five-layer backbone architecture... In the Node2Node, Graph2Node, and Graph2PtCld tasks, we consistently utilize a standard graph neural network structure comprising two-layer graph convolutions. For the backbones of the aforementioned model, we configure the hidden layer dimension of the base to be 100, while the hidden layer dimension of the side network is set to 16. |
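The experiment-setup row above can be sketched as a small configuration helper. This is a minimal illustration of the reported dimensions only (five layers for Graph2Graph, two layers otherwise; backbone hidden size 100, side-network hidden size 16); the function name and structure are hypothetical and do not come from the released GraphBridge code.

```python
def layer_dims(task: str, in_dim: int):
    """Return (backbone_dims, side_dims) matching the paper's reported setup.

    Graph2Graph uses a five-layer backbone; Node2Node, Graph2Node, and
    Graph2PtCld use two graph-convolution layers. Backbone hidden layers
    are 100-dimensional; the lightweight side network uses 16-dimensional
    hidden layers. (Illustrative helper, not the authors' implementation.)
    """
    n_layers = 5 if task == "Graph2Graph" else 2
    backbone = [in_dim] + [100] * n_layers  # base network widths per layer
    side = [in_dim] + [16] * n_layers       # side network widths per layer
    return backbone, side


# Example: node-level transfer uses the two-layer configuration.
print(layer_dims("Node2Node", 128))  # ([128, 100, 100], [128, 16, 16])
```

Listing the per-layer widths this way makes the memory asymmetry explicit: the side network trained during transfer is far narrower (16) than the frozen base (100), which is the source of the method's memory-efficient attribute mentioned in the Hardware Specification row.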