Co-Dream: Collaborative Dream Synthesis over Decentralized Models
Authors: Abhishek Singh, Gauri Gupta, Yichuan Shi, Alex Dang, Ritvik Kapila, Sheshank Shankar, Mohammed Ehab, Ramesh Raskar
AAAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically demonstrate the effectiveness of Co-Dream and compare its performance with existing techniques. ... We empirically validate our framework by benchmarking with existing algorithms and conducting ablation studies across various design choices. |
| Researcher Affiliation | Collaboration | 1Massachusetts Institute of Technology 2Amazon 3Tesla AI |
| Pseudocode | Yes | Algorithm 1: Co-Dream Algorithm |
| Open Source Code | Yes | Code https://mitmedialab.github.io/codream.github.io/ |
| Open Datasets | Yes | We conduct our experiments on 3 real-world datasets, including MNIST (LeCun et al. 1998), SVHN (Netzer et al. 2011), and CIFAR10 (Krizhevsky, Hinton et al. 2009). |
| Dataset Splits | Yes | To validate the effect of collaboration, we train clients with 50 samples per client for MNIST and 1000 samples per client for CIFAR10 and SVHN datasets. ... We use Dirichlet distribution Dir(α) to generate non-IID data partition among labels for a fixed number of total samples at each client. |
| Hardware Specification | No | The paper does not explicitly mention specific hardware details such as GPU models, CPU types, or cloud computing platforms used for experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | Unless stated otherwise, we used ResNet18 (He et al. 2015) for training the client and server models and set the total number of clients K = 4. ... Instead of 2000 global aggregation rounds (R) in Co-Dream, CoDream-fast performs only a single global aggregation round with 5 local rounds. |
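The Dataset Splits row notes that the paper uses a Dirichlet distribution Dir(α) to generate a non-IID label partition across clients. A minimal sketch of that standard partitioning scheme is below; the function name, α value, and client count are illustrative assumptions, not the authors' code.

```python
import numpy as np

def dirichlet_partition(labels, num_clients=4, alpha=0.5, seed=0):
    """Split sample indices among clients using per-class Dir(alpha) proportions.

    Smaller alpha -> more skewed (non-IID) label distribution per client.
    This is a generic sketch, not the paper's implementation.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = rng.permutation(np.flatnonzero(labels == c))
        # Draw this class's share for each client from Dir(alpha)
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client_id, part in enumerate(np.split(idx, cuts)):
            client_indices[client_id].extend(part.tolist())
    return [np.array(ci) for ci in client_indices]

# Toy example: 10 classes, 100 samples each, 4 clients (K = 4 as in the paper)
labels = np.repeat(np.arange(10), 100)
parts = dirichlet_partition(labels, num_clients=4, alpha=0.5)
```

Every sample is assigned to exactly one client, and lowering α concentrates each class on fewer clients, which is how the degree of non-IID-ness is controlled.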