Decoupled Subgraph Federated Learning

Authors: Javad Aliakbari, Johan Östman, Alexandre Graell i Amat

ICLR 2025

Reproducibility checklist (variable: result, followed by the LLM response excerpt):
Research Type: Experimental. "We validate the effectiveness of FEDSTRUCT through experimental results conducted on six datasets for semi-supervised node classification, showcasing performance close to the centralized approach across various scenarios, including different data partitioning methods, varying levels of label availability, and number of clients."
Researcher Affiliation: Collaboration. Javad Aliakbari (Chalmers University of Technology), Johan Östman (AI Sweden), Alexandre Graell i Amat (Chalmers University of Technology).
Pseudocode: Yes. "The FEDSTRUCT framework is illustrated in Figure 2 and described in Alg. 1 in App. B." The paper provides Algorithm 1 (FEDSTRUCT), Algorithm 2 (FEDSTRUCT using HOP2VEC), and Algorithm 3 (private acquisition of A[i]).
Open Source Code: Yes. "The source code is publicly available in the Github Link."
Open Datasets: Yes. "The datasets considered are: Cora (Sen et al., 2008), Citeseer (Sen et al., 2008), Pubmed (Namata et al., 2012), Chameleon (Pei et al., 2020), Amazon Photo (Shchur et al., 2018), and Amazon Ratings (Platonov et al., 2023)."
Dataset Splits: Yes. "We focus on a strongly semi-supervised setting where data is split into training, validation, and test sets containing 10%, 10%, and 80% of the nodes, respectively."
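The 10%/10%/80% node split quoted above can be sketched as a simple random partition of node indices. This is a minimal illustration, not the paper's code; the function name and seed handling are our own assumptions.

```python
import random

def split_nodes(num_nodes, train_frac=0.10, val_frac=0.10, seed=0):
    """Randomly partition node indices into train/val/test sets
    (10%/10%/80% as described in the paper). Illustrative sketch only."""
    rng = random.Random(seed)
    idx = list(range(num_nodes))
    rng.shuffle(idx)
    n_train = int(train_frac * num_nodes)
    n_val = int(val_frac * num_nodes)
    train = idx[:n_train]
    val = idx[n_train:n_train + n_val]
    test = idx[n_train + n_val:]
    return train, val, test

# Example: Cora has 2708 nodes
train, val, test = split_nodes(2708)
print(len(train), len(val), len(test))  # 270 270 2168
```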
Hardware Specification: Yes. "All the experiments are obtained using an Nvidia A30 with 24GB of memory."
Software Dependencies: No. The paper mentions GNNs such as GRAPHSAGE (Hamilton et al., 2017) but does not list version numbers for any software dependencies (programming languages, libraries, or frameworks).
Experiment Setup: Yes. "In Table 5, we provide the step sizes λ and λs for the gradient descent step during the training, the weight decay in the L2 regularization, the number of training iterations (epochs), the number of layers L in the node feature embedding, the number of layers Ls in the DECOUPLED GCN, the dimensionality of the NSFs, ds, the pruning parameter p, and the model architecture of the node feature and node structure feature predictors, fθf and gθs, respectively."
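The hyperparameters enumerated in that excerpt could be collected in a single config object when reproducing the experiments. The sketch below only mirrors the fields the paper names in Table 5; the values shown are illustrative placeholders, not the paper's actual settings.

```python
from dataclasses import dataclass

@dataclass
class FedStructConfig:
    """Fields named in Table 5 of the paper. All values assigned below
    are placeholders for illustration, NOT the reported settings."""
    lr: float               # step size lambda (feature predictor f_theta_f)
    lr_struct: float        # step size lambda_s (structure predictor g_theta_s)
    weight_decay: float     # L2 regularization weight
    epochs: int             # number of training iterations
    num_layers: int         # L: layers in the node feature embedding
    num_layers_struct: int  # L_s: layers in the decoupled GCN
    nsf_dim: int            # d_s: dimensionality of the NSFs
    prune_p: int            # pruning parameter p

# Placeholder instantiation; consult Table 5 for the real per-dataset values.
cfg = FedStructConfig(lr=1e-2, lr_struct=1e-2, weight_decay=5e-4,
                      epochs=200, num_layers=2, num_layers_struct=2,
                      nsf_dim=64, prune_p=10)
```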