ALMA: Alternating Minimization Algorithm for Clustering Mixture Multilayer Network

Authors: Xing Fan, Marianna Pensky, Feng Yu, Teng Zhang

JMLR 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Compared to TWIST, ALMA achieves higher accuracy, both theoretically and numerically. (...) Also, our numerical studies show that ALMA leads to smaller between-layer and within-layer clustering errors than TWIST. (...) Section 5.3 produces numerical comparisons between ALMA and TWIST via simulations, and also compares both of them with a simple baseline algorithm. Finally, Section 5.4 extends these comparisons to real data examples."
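The between-layer and within-layer clustering errors quoted above are misclassification rates. The paper's exact definitions are not reproduced in this report, but a standard clustering-error metric (misclassification rate minimized over cluster-label permutations) can be sketched as follows; the function name and the choice of metric are illustrative assumptions, not the paper's code:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_error(true_labels, est_labels, n_clusters):
    """Misclassification rate minimized over label permutations.

    A common clustering-error definition; the paper's exact metric
    may differ in normalization or notation.
    """
    true_labels = np.asarray(true_labels)
    est_labels = np.asarray(est_labels)
    n = len(true_labels)
    # Confusion matrix: counts of (true cluster, estimated cluster) pairs.
    confusion = np.zeros((n_clusters, n_clusters), dtype=int)
    for t, e in zip(true_labels, est_labels):
        confusion[t, e] += 1
    # The Hungarian algorithm finds the label permutation maximizing agreement.
    row, col = linear_sum_assignment(-confusion)
    return 1.0 - confusion[row, col].sum() / n

# A labeling that differs from the truth only by a permutation has error 0.
print(clustering_error([0, 0, 1, 1, 2, 2], [2, 2, 0, 0, 1, 1], 3))  # 0.0
```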
Researcher Affiliation | Academia | All four authors (Xing Fan, Marianna Pensky, Feng Yu, Teng Zhang) list the Department of Mathematics, University of Central Florida, Orlando, FL 32816, USA; email addresses appear only as EMAIL placeholders in the extracted text.
Pseudocode | Yes | "Algorithm 1: Alternating Minimization Algorithm (ALMA) (...) Algorithm 2: Initialization of Algorithm 1"
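Algorithm 1's tensor updates are not reproduced in this report. As a hedged illustration of the alternating-minimization pattern it follows, here is a generic alternating least-squares sketch on a toy rank-r matrix factorization, using the same kind of iterate-difference stopping rule as the paper; the function name, the toy objective, and all parameters are assumptions, not ALMA itself:

```python
import numpy as np

def alternating_minimization(A, r, tol=1e-4, max_iter=500, seed=0):
    """Generic alternating-minimization skeleton (illustrative only;
    ALMA's actual updates operate on tensor factors, see Algorithm 1).
    Here: rank-r factorization A ~ U @ V by alternating least squares."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    V = rng.standard_normal((r, m))
    U = np.zeros((n, r))
    for _ in range(max_iter):
        U_prev = U.copy()
        # Minimize over U with V fixed, then over V with U fixed.
        U = np.linalg.lstsq(V.T, A.T, rcond=None)[0].T
        V = np.linalg.lstsq(U, A, rcond=None)[0]
        # Iterate-difference stopping rule, analogous to the paper's
        # criterion of stopping once successive iterates differ by <= 1e-4.
        if np.linalg.norm(U - U_prev) <= tol:
            break
    return U, V

A = np.arange(12.0).reshape(3, 4)   # exactly rank-2 toy data
U, V = alternating_minimization(A, r=2)
print(np.linalg.norm(U @ V - A))    # small reconstruction error
```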
Open Source Code | No | The paper does not contain any explicit statement about making the source code available, nor does it provide a link to a code repository. The license mentioned is for the paper itself, not for any accompanying software.
Open Datasets | Yes | "The Worldwide food trading networks data have been described in De Domenico et al. (2015), and is available at https://www.fao.org/faostat/en/#data/TM. (...) We also analyze the airline-airport network data set, available at https://openflights.org/data.html#route"
Dataset Splits | No | The paper describes simulation scenarios with varying parameters (n, L, pmax) and mentions averaging results over 100 independent simulation runs. For real data, it describes pre-processing steps. However, it does not specify explicit training, validation, or test dataset splits for the models evaluated.
Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., CPU or GPU models, memory) used for running the simulations or real data experiments.
Software Dependencies | No | The paper does not mention any specific software dependencies, libraries, or frameworks with version numbers that were used to implement or run the experiments.
Experiment Setup | Yes | "In our simulations, we set M = 3, K = 3 and r = 7 (...) we use identical connectivity matrices B_m ≡ B, where the diagonal values are set to p = pmax while the off-diagonal entries are equal to q = α·pmax with α < 1. (...) We choose the stopping criterion ‖W^(iter) − W^(iter−1)‖ ≤ 10^(−4) for both ALMA and TWIST (...) In Simulation 1, we investigate the effect of the network sparsity on the precision of the algorithms. For this purpose, we choose the number of vertices n = 100, the number of layers L = 40, the number of network clusters M = 3, the number of communities in each cluster of layers K = 3, and α = 0.9. The variable pmax, which controls the overall network sparsity, varies from 0.3 to 1."
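The quoted Simulation 1 setup can be sketched in a few lines. The sketch below builds the shared connectivity matrix B (p = pmax on the diagonal, q = α·pmax off it) and samples SBM layers; the community-assignment step is an illustrative assumption, since the paper draws layers from a mixture of M cluster-specific stochastic block models rather than a single labeling:

```python
import numpy as np

# Parameter values from the paper's Simulation 1 (Section 5.3).
n, L, M, K, alpha = 100, 40, 3, 3, 0.9
pmax = 0.5  # varies from 0.3 to 1 in the paper

def connectivity_matrix(K, pmax, alpha):
    """Identical B_m = B: p = pmax on the diagonal, q = alpha * pmax off it."""
    B = np.full((K, K), alpha * pmax)
    np.fill_diagonal(B, pmax)
    return B

def sample_sbm_layer(B, communities, rng):
    """Draw one undirected SBM adjacency matrix given community labels."""
    n = len(communities)
    P = B[np.ix_(communities, communities)]   # per-pair edge probabilities
    upper = rng.random((n, n)) < P            # Bernoulli draws
    A = np.triu(upper, 1)                     # keep strict upper triangle
    return (A + A.T).astype(int)              # symmetrize, no self-loops

rng = np.random.default_rng(0)
communities = rng.integers(0, K, size=n)      # illustrative single labeling
B = connectivity_matrix(K, pmax, alpha)
layers = [sample_sbm_layer(B, communities, rng) for _ in range(L)]
print(len(layers), layers[0].shape)           # 40 (100, 100)
```

The paper averages its error metrics over 100 independent repetitions of such draws.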