Change Point Detection in Dynamic Graphs with Decoder-only Latent Space Model

Authors: Yik Lun Kei, Jialiang Li, Hangjian Li, Yanzhen Chen, Oscar Hernan Madrid Padilla

TMLR 2025

Reproducibility Assessment (Variable | Result | LLM Response)
Research Type | Experimental | Simulation studies show good performance of the latent space model in supporting change point detection, and real data experiments yield change points that align with significant events.
Researcher Affiliation | Collaboration | Yik Lun Kei, Department of Statistics, University of California, Santa Cruz; Jialiang Li, Department of Computer Science, New Jersey Institute of Technology; Hangjian Li, Walmart Global Tech; Yanzhen Chen, Department of Information Systems, Business Statistics and Operations Management, Hong Kong University of Science and Technology; Oscar Hernan Madrid Padilla, Department of Statistics and Data Science, University of California, Los Angeles
Pseudocode | Yes | Algorithm 1: Latent Space Group Fused Lasso
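The paper's Algorithm 1 is not reproduced in this summary. As a hedged sketch only: the workhorse update inside most group (fused) lasso solvers is the group soft-thresholding (proximal) step, shown below with illustrative values; the function name and inputs are not taken from the paper.

```python
import numpy as np

def group_soft_threshold(v, lam):
    """Proximal operator of the group-lasso penalty lam * ||v||_2.

    Shrinks the whole vector toward zero, and returns exactly zero when
    ||v||_2 <= lam -- this is what makes entire difference vectors vanish
    between detected change points in fused-lasso-style solvers.
    """
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v

a = group_soft_threshold(np.array([3.0, 4.0]), 2.5)  # -> [1.5, 2.0]
b = group_soft_threshold(np.array([3.0, 4.0]), 6.0)  # -> [0.0, 0.0]
```

Vectors with norm below the threshold are zeroed as a group, rather than coordinate-wise as in the plain lasso.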
Open Source Code | Yes | The code is available at https://github.com/allenkei/CPD_generative.
Open Datasets | Yes | The Massachusetts Institute of Technology (MIT) cellphone data (Eagle & Pentland, 2006) depicts human interactions via phone-call activities among n = 96 participants over T = 232 days. The Enron email data, analyzed by Priebe et al. (2005), Park et al. (2012), and Peel & Clauset (2015), portrays the communication patterns among employees before the collapse of a giant energy company.
Dataset Splits | Yes | The original time series of graphs is split into training and testing sets: the training set consists of graphs at odd-indexed time points and the testing set of graphs at even-indexed time points. For the MIT cellphone data (T = 232), graphs are removed at times t equal to multiples of {15, 20, 25, 30}, respectively; for the Enron email data (T = 100), at times t equal to multiples of {3, 6, 9, 12}, respectively.
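The odd/even train/test split described above can be sketched as follows; the array of zeros is a placeholder for the actual graph sequence, and 1-based time indices map to 0-based array positions:

```python
import numpy as np

# Placeholder sequence of T adjacency matrices on n nodes
# (shapes match the MIT cellphone data: T = 232, n = 96)
T, n = 232, 96
graphs = np.zeros((T, n, n))

# Train on odd time points (t = 1, 3, ...), test on even (t = 2, 4, ...);
# with 0-based arrays, odd t corresponds to indices 0, 2, 4, ...
train = graphs[0::2]
test = graphs[1::2]

print(train.shape, test.shape)  # (116, 96, 96) (116, 96, 96)
```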
Hardware Specification | Yes | Experiments were run on a Tesla T4 GPU.
Software Dependencies | No | The paper mentions the Adam optimizer and the nett package (Amini et al., 2013) in R, but does not provide version numbers for any software components or libraries.
Experiment Setup | Yes | For Langevin dynamics sampling, δ = 0.5 and s = 200 samples are drawn at each time point t. To detect change points using the data-driven threshold in (15), the tuning parameter λ ∈ {10, 20, 50, 100}; using the localizing method with the Gamma distribution in (13), λ ∈ {5, 10, 20, 50}. For each λ, A = 50 iterations of ADMM are run; within each ADMM iteration, B = 20 iterations of gradient descent with the Adam optimizer update the graph decoder and D = 20 iterations of block coordinate descent solve the Group Lasso. The tolerance is ε_tol = 10^-5 and a = 5; the penalty parameter is initialized at κ = 10.
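As a sketch of the sampling step only: the following runs unadjusted Langevin dynamics with the step size δ = 0.5 and s = 200 draws from the setup above, but on a toy standard-Gaussian target; the gradient of the paper's decoder log-posterior is an assumption replaced here by grad log p(z) = −z.

```python
import numpy as np

def langevin_chain(grad_log_p, z0, delta=0.5, n_samples=200, seed=0):
    """Unadjusted Langevin dynamics (sketch):

        z_{k+1} = z_k + (delta^2 / 2) * grad_log_p(z_k) + delta * eps_k,

    with eps_k ~ N(0, I). Returns the n_samples iterates after z0.
    """
    rng = np.random.default_rng(seed)
    z = np.asarray(z0, dtype=float)
    samples = []
    for _ in range(n_samples):
        z = z + 0.5 * delta**2 * grad_log_p(z) + delta * rng.standard_normal(z.shape)
        samples.append(z)
    return np.stack(samples)

# Toy target N(0, I), whose score is grad log p(z) = -z
samples = langevin_chain(lambda z: -z, z0=np.ones(2))
print(samples.shape)  # (200, 2)
```

In the paper's setting, `grad_log_p` would instead be the gradient of the decoder's log-posterior over the latent positions at each time point.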