Subgraph Federated Learning for Local Generalization
Authors: Sungwon Kim, Yoonho Lee, Yunhak Oh, Namkyeong Lee, Sukwon Yun, Junseok Lee, Sein Kim, Carl Yang, Chanyoung Park
ICLR 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our model outperforms baselines in our proposed experimental settings, which are designed to measure generalization power to unseen data in practical scenarios. Our code is available at https://github.com/sung-won-kim/FedLoG |
| Researcher Affiliation | Academia | 1KAIST, 2UNC Chapel Hill, 3Emory University |
| Pseudocode | Yes | Algorithm 1 FedLoG: The Overall Algorithm |
| Open Source Code | Yes | Our code is available at https://github.com/sung-won-kim/FedLoG |
| Open Datasets | Yes | We conduct experiments on five real-world graph datasets. ... The datasets used are Cora (Sen et al., 2008), CiteSeer (Sen et al., 2008), PubMed (Sen et al., 2008), Amazon Computer (McAuley et al., 2015), and Amazon Photo (Shchur et al., 2018). |
| Dataset Splits | Yes | Table 12: Cora Dataset Statistics (Closed Set) — per-class Train / Valid / Test split counts |
| Hardware Specification | Yes | All experiments are conducted using four 24GB NVIDIA GeForce RTX 4090 GPUs. |
| Software Dependencies | Yes | Our method is implemented on Python 3.10, PyTorch 2.0.1, and torch-geometric 2.4.0. |
| Experiment Setup | Yes | In our experiments, we use a 2-layer GraphSAGE (Hamilton et al., 2017) implementation (φE) with a dropout rate of 0.5, a hidden dimension of 128, and an output dimension of 64. The model parameters with learnable features XVk,head and XVk,tail are optimized with Adam (Kingma & Ba, 2014) using a learning rate of 0.001. ... For all experiments, we set the number of rounds (R) to 100 and the number of local epochs to 1. ... We set the number of learnable nodes s to 20, the tail-degree threshold γ to 3, and select the regularization parameter β from {0.01, 0.1, 1}. |
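As a rough illustration of the encoder family the setup describes (GraphSAGE with mean aggregation, per Hamilton et al., 2017), the sketch below implements a single layer in dependency-free Python. This is not the paper's code: the authors use a 2-layer GraphSAGE built on PyTorch Geometric (hidden dim 128, output dim 64, dropout 0.5), and the function name `sage_layer` and the toy graph here are purely illustrative.

```python
def sage_layer(features, adj, w_self, w_neigh):
    """One GraphSAGE layer with mean aggregation (illustrative sketch).

    features: {node: list of d_in floats}        -- input node features
    adj:      {node: [neighbor, ...]}            -- adjacency lists
    w_self, w_neigh: d_out x d_in weight matrices (lists of rows)
    Returns {node: list of d_out floats} after a ReLU.
    """
    def matvec(w, x):
        # Dense matrix-vector product: one dot product per output row.
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

    out = {}
    for v, h in features.items():
        neigh = adj.get(v, [])
        if neigh:
            d = len(h)
            # Mean-aggregate neighbor features, dimension by dimension.
            mean = [sum(features[u][i] for u in neigh) / len(neigh)
                    for i in range(d)]
        else:
            mean = [0.0] * len(h)
        # Combine self and aggregated-neighbor transforms, then ReLU.
        z = [a + b for a, b in zip(matvec(w_self, h), matvec(w_neigh, mean))]
        out[v] = [max(0.0, zi) for zi in z]
    return out
```

In the paper's configuration, two such layers would be stacked (with dropout between them), mapping raw features to 128-dimensional hidden representations and then to 64-dimensional output embeddings.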