FedDAG: Federated DAG Structure Learning
Authors: Erdun Gao, Junjia Chen, Li Shen, Tongliang Liu, Mingming Gong, Howard Bondell
TMLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on both synthetic and real-world datasets verify the efficacy of the proposed method. |
| Researcher Affiliation | Collaboration | Erdun Gao EMAIL School of Mathematics and Statistics, The University of Melbourne; Junjia Chen EMAIL Faculty of Electronic and Information Engineering, Xi'an Jiaotong University; Li Shen EMAIL JD Explore Academy; Tongliang Liu EMAIL TML Lab, Sydney AI Centre, The University of Sydney; Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence; Mingming Gong EMAIL School of Mathematics and Statistics, The University of Melbourne; Howard Bondell EMAIL School of Mathematics and Statistics, The University of Melbourne |
| Pseudocode | Yes | Algorithm 1 FedDAG; Algorithm 2 Sub-Problem Solver (SPS) for FedDAG |
| Open Source Code | No | The paper lists code availability for baseline methods such as MCSL and NOTEARS, and states 'Our implementation is highly based on the existing tool-chain named gCastle (Zhang et al., 2021)', but it does not provide a link or availability statement for the FedDAG implementation itself. |
| Open Datasets | Yes | We consider a real public dataset named fMRI Hippocampus (Poldrack et al., 2015) |
| Dataset Splits | Yes | In the following experiments, we take 10 clients, each with 600 observations (unless otherwise specified in some ablation studies), throughout this paper.; We consider a real public dataset named fMRI Hippocampus (Poldrack et al., 2015) to discover the underlying relationships among six brain regions. This dataset records signals from six separate brain regions in the resting state of one person over 84 successive days, and the anatomical structure provides 7 edges as the ground-truth graph... Herein, we separately select 500 records in each of the 10 days, which can be regarded as different local data. |
| Hardware Specification | No | The paper mentions 'This research was undertaken using the LIEF HPC-GPGPU Facility hosted at the University of Melbourne. This Facility was established with the assistance of LIEF Grant LE170100200.' which is a general facility name but does not specify particular GPU/CPU models or other detailed hardware specifications. |
| Software Dependencies | No | The paper mentions 'Our implementation is highly based on the existing tool-chain named gCastle (Zhang et al., 2021)' and 'We take ADAM (Kingma & Ba, 2015)' but does not provide specific version numbers for these or any other software dependencies. |
| Experiment Setup | Yes | Our GS-FedDAG and AS-FedDAG reach this point and are implemented with the following hyper-parameters. We take ADAM (Kingma & Ba, 2015) with learning rate 3e-2, and all the observational data D_{c_k} on each client are used for computing the gradient. The detailed parameters used in Algorithms 1 and 2 are listed in Table 7. Table 8: The combinations of ρ_init and β on simulated data in our method. |
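The Experiment Setup row reports that local client parameters are optimized with ADAM (Kingma & Ba, 2015) at learning rate 3e-2 using full-batch gradients. The following is a minimal illustrative sketch of a single ADAM update at that learning rate, not the authors' implementation (which builds on gCastle); the function and variable names are assumptions made for illustration.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=3e-2, b1=0.9, b2=0.999, eps=1e-8):
    """One ADAM update (Kingma & Ba, 2015) with bias correction.

    theta : current parameters; grad : full-batch gradient on a client;
    m, v  : running first/second moment estimates; t : step count (1-based).
    Learning rate defaults to the paper's reported 3e-2.
    """
    m = b1 * m + (1 - b1) * grad           # update first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2      # update second-moment estimate
    m_hat = m / (1 - b1 ** t)              # bias-corrected moments
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# One step from zero-initialized parameters and moments.
theta = np.zeros(3)
m, v = np.zeros(3), np.zeros(3)
grad = np.array([1.0, -2.0, 0.5])
theta, m, v = adam_step(theta, grad, m, v, t=1)
```

After the first bias-corrected step, the update magnitude is approximately the learning rate in each coordinate (since m_hat / sqrt(v_hat) reduces to the gradient's sign), which is why ADAM's early steps are insensitive to the gradient's scale.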