Differentially Private Clustered Federated Learning
Authors: Saber Malekmohammadi, Afaf Taik, Golnoosh Farnadi
TMLR 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experimental results (codes: here) show the effectiveness of the approach in addressing high structured data heterogeneity in DPFL. We extensively evaluate the proposed algorithm across diverse datasets and scenarios and demonstrate the effectiveness of our robust DP clustered FL algorithm in detecting the underlying cluster structure of clients, which leads to an overall utility improvement for the system (Section 6). |
| Researcher Affiliation | Academia | Saber Malekmohammadi (Mila – Quebec AI Institute, Montreal, Canada; School of Computer Science, University of Waterloo, Waterloo, Canada); Afaf Taik (Mila – Quebec AI Institute, Montreal, Canada); Golnoosh Farnadi (Mila – Quebec AI Institute, Montreal, Canada; School of Computer Science, McGill University, Montreal, Canada; Université de Montréal, Montreal, Canada) |
| Pseudocode | Yes | Algorithm 1: R-DPCFL |
| Open Source Code | No | The paper mentions "codes: here" but does not provide a concrete URL or specific repository link in the text itself. The word "here" without a hyperlink or explicit URL is not sufficient to meet the requirement for concrete access to source code. |
| Open Datasets | Yes | We evaluate our proposed method on three benchmark datasets, including: MNIST (Deng, 2012), FMNIST (Xiao et al., 2017) and CIFAR10 (Krizhevsky, 2009) |
| Dataset Splits | Yes | We evaluate our proposed method on three benchmark datasets, including: MNIST (Deng, 2012), FMNIST (Xiao et al., 2017) and CIFAR10 (Krizhevsky, 2009), with heterogeneous data distributions from covariate shift (rotation; P_i(x) varies across clusters) (Kairouz et al., 2021; Werner et al., 2023) and concept shift (label flip; P_i(y\|x) varies across clusters) (Werner et al., 2023), which are the commonly used data splits for clustered FL (see Appendix C). ... This is done for clients in the training set and test set separately. |
| Hardware Specification | No | The paper mentions "Compute Canada" as providing facilities but does not specify any particular GPU/CPU models, memory, or detailed computer specifications used for the experiments. |
| Software Dependencies | No | The paper discusses algorithms and mechanisms like DPSGD, GMM, Rényi-DP accountant, and exponential mechanism but does not list any specific software libraries with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | There are E communication rounds indexed by e and K local epochs with learning rate η_l during each round. ... c is a clipping threshold ... δ is fixed to 10^-4 in this work ... The results are obtained on CIFAR10 from the Rényi-DP accountant (Mironov et al., 2019) in a setting with N_i = 6600, ε = 5, δ = 10^-4, c = 3, K = 1, E = 200, p = 11,181,642, η_l = 5×10^-4. ... For the results reported so far, we used b_i^{>1} = 32 for all experiments with R-DPCFL. ... E_c = (1 − MPO) · E / 2, which is used in this work |
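The experiment-setup row above describes a standard DP-SGD configuration (per-sample gradient clipping with threshold c, Gaussian noise, privacy budget tracked by a Rényi-DP accountant). As context for those hyperparameters, here is a minimal sketch of the generic clip-and-noise aggregation step that such a setup implies; the function name and the NumPy-vector gradients are illustrative assumptions, not the paper's actual implementation of R-DPCFL.

```python
import numpy as np

def dpsgd_update(per_sample_grads, c, noise_multiplier, rng):
    """Generic DP-SGD aggregation sketch (hypothetical helper):
    clip each per-sample gradient to L2 norm c, average the clipped
    gradients, then add Gaussian noise with std noise_multiplier*c/n.
    The paper's CIFAR10 setting uses c = 3; the noise multiplier would
    come from an RDP accountant for the target (epsilon, delta)."""
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down only when the norm exceeds the threshold c.
        clipped.append(g * min(1.0, c / norm))
    n = len(clipped)
    mean = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * c / n, size=mean.shape)
    return mean + noise
```

This is the per-round primitive; in the federated setting each client would run such noisy updates locally before the server aggregates them.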