Federated t-SNE and UMAP for Distributed Data Visualization

Authors: Dong Qiao, Xinxian Ma, Jicong Fan

AAAI 2025

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Experiments on multiple datasets demonstrate that, compared to the original algorithms, the accuracy drops of our federated algorithms are tiny." |
| Researcher Affiliation | Academia | School of Data Science, The Chinese University of Hong Kong, Shenzhen, China |
| Pseudocode | Yes | Algorithm 1: Federated Distribution Learning |
| Open Source Code | No | The paper provides no statement of, or link to, open-source code for the described methodology. |
| Open Datasets | Yes | "We applied the proposed Fed-tSNE and Fed-UMAP methods to the MNIST and Fashion-MNIST datasets, with m_X = 40,000, and set n_Y = 500." Additionally, "We utilized three datasets MNIST, COIL-20, and Mice Protein (detailed in Appendix) to evaluate the effectiveness of our Fed-SpeClust." |
| Dataset Splits | Yes | "We designed the experiment with 10 clients, where IID (independent and identically distributed) refers to each client's data being randomly sampled from the MNIST dataset, thus including all classes. In contrast, non-IID means that each client's data contains only a single class." |
| Hardware Specification | No | The paper does not specify the hardware used to run its experiments. |
| Software Dependencies | No | The paper does not list the ancillary software, such as library names with version numbers, needed to replicate the experiments. |
| Experiment Setup | Yes | "We applied the proposed Fed-tSNE and Fed-UMAP methods to the MNIST and Fashion-MNIST datasets, with m_X = 40,000, and set n_Y = 500. We designed the experiment with 10 clients, where IID (independent and identically distributed) refers to each client's data being randomly sampled from the MNIST dataset, thus including all classes. In contrast, non-IID means that each client's data contains only a single class. In Figure 2, the relevant metrics reached convergence after approximately 50 epochs. The noise level β controls the scale of noise, with each element of the noise E being drawn from N(0, β²·sd²(f_p(Y_p)))." |
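The IID/non-IID client split quoted above (10 clients; either random samples covering all classes, or one class per client) can be sketched as follows. This is an illustrative reconstruction, not code from the paper; the function name and interface are assumptions.

```python
import numpy as np

def partition_clients(labels, n_clients=10, iid=True, seed=0):
    """Split sample indices across federated clients.

    IID: each client receives a random subset, so every client sees
    all classes. Non-IID: each client holds samples from exactly one
    class, matching the split described in the experiments.
    Hypothetical helper, not the authors' implementation.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(labels))
    if iid:
        return np.array_split(idx, n_clients)
    classes = np.unique(labels)
    assert len(classes) == n_clients, "non-IID split assumes one class per client"
    return [np.where(labels == c)[0] for c in classes]

# toy labels standing in for MNIST's 10 digit classes
labels = np.repeat(np.arange(10), 100)
iid_parts = partition_clients(labels, iid=True)
noniid_parts = partition_clients(labels, iid=False)
```

Under the IID split each client's index set mixes classes; under the non-IID split each client's labels are constant, which is the harder regime the paper evaluates.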
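The noise mechanism in the quoted setup, with each element of E drawn from N(0, β²·sd²(f_p(Y_p))), can be sketched as below: the noise standard deviation is β times the standard deviation of the client's mapped data f_p(Y_p), and the noise is added before sharing. The function name and the argument standing in for f_p(Y_p) are illustrative assumptions, not the paper's API.

```python
import numpy as np

def add_protection_noise(Yp_mapped, beta, rng=None):
    """Add elementwise Gaussian noise E_ij ~ N(0, beta^2 * sd^2(f_p(Y_p))).

    `Yp_mapped` stands in for f_p(Y_p), the quantity a client would
    share with the server; `beta` controls the noise scale as in the
    quoted setup. Illustrative sketch, not the authors' code.
    """
    rng = rng or np.random.default_rng(0)
    scale = beta * Yp_mapped.std()          # sd(f_p(Y_p)) scaled by beta
    E = rng.normal(0.0, scale, size=Yp_mapped.shape)
    return Yp_mapped + E

rng = np.random.default_rng(1)
Y = rng.normal(size=(100, 2))               # stand-in for one client's f_p(Y_p)
noisy = add_protection_noise(Y, beta=0.1)
clean = add_protection_noise(Y, beta=0.0)   # beta = 0 recovers the input
```

Setting β = 0 recovers the noiseless case, while larger β trades visualization fidelity for stronger protection of the shared statistics.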