SeqLink: A Robust Neural-ODE Architecture for Modelling Partially Observed Time Series

Authors: Futoon M. Abushaqra, Hao Xue, Yongli Ren, Flora D. Salim

TMLR 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Through extensive experiments on partially observed synthetic and real-world datasets, we demonstrate that SeqLink improves the modelling of intermittent time series, consistently outperforming state-of-the-art approaches." The paper includes a dedicated "Experiments" section (Section 5), with sub-sections for "Datasets," "Experiment Details," and "Model Performance," featuring tables of MSE and AUC values comparing SeqLink to various baselines, as well as an "Ablation Study" (Section 5.4).
Researcher Affiliation | Academia | Futoon M. Abushaqra EMAIL, School of Computing Technologies, RMIT University; Hao Xue EMAIL, School of Computer Science and Engineering, University of New South Wales; Yongli Ren EMAIL, School of Computing Technologies, RMIT University; Flora D. Salim EMAIL, School of Computer Science and Engineering, University of New South Wales. All authors are affiliated with universities (RMIT University, University of New South Wales) and use academic email domains (.edu.au).
Pseudocode | Yes | Algorithm 1: Pyramidal sorting; Algorithm 2: Link-ODE
Open Source Code | Yes | "Our code is available at https://github.com/FtoonAbushaqra/SeqLink.git"
Open Datasets | Yes | Electricity Consumption Load (ECL): a public dataset... https://archive.ics.uci.edu/ml/datasets/ElectricityLoadDiagrams20112014; Electricity Transformer Temperature (ETT): ... https://github.com/zhouhaoyi/ETDataset; Weather Data: ... https://www.ncei.noaa.gov/data/local-climatological-data; MIMIC-II (PhysioNet Challenge 2012 data) (Silva et al., 2012).
Dataset Splits | Yes | "For all the experiments, we applied a shuffled split to divide the data into a training set and a testing set. 80% of the samples were used for training, while the testing set held the remaining 20%."
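The reported split (shuffle all samples, then hold out 20% for testing) can be sketched with the standard library alone; the function name and the fixed seed below are illustrative, not taken from the paper:

```python
import random

def shuffled_split(samples, train_frac=0.8, seed=0):
    """Shuffle sample indices, then cut at the training fraction."""
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)
    cut = int(train_frac * len(idx))
    train = [samples[i] for i in idx[:cut]]
    test = [samples[i] for i in idx[cut:]]
    return train, test

# e.g. 100 series -> 80 for training, 20 for testing
train, test = shuffled_split(list(range(100)))
```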
Hardware Specification | Yes | "The experiments were run on a desktop with an NVIDIA GeForce MX230."
Software Dependencies | No | The paper mentions the "Adam optimizer", "the torchdiffeq python package", and "PyTorch", but does not specify version numbers for these software components. For example, it does not state "PyTorch 1.9" or "torchdiffeq 0.2".
Experiment Setup | Yes | "For model hyperparameters, to make the experiments fair and consistent, we followed (Rubanova et al., 2019) and chose the hyperparameters that yield the best performance for the original ODE-RNN. We ran both the baselines and SeqLink with the exact same size of the hidden state and the same number of layers and units. ... The latent dimension used was 10 for all datasets. The fifth-order dopri5 solver from the torchdiffeq python package was used as the ODE solver. ... We ran 200 epochs with a batch size of 200. The same settings are used for all models (SeqLink and the baselines). Finally, we used the Adam optimizer and a learning rate of 0.01."
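Collected in one place, the reported settings amount to the following configuration fragment; the key names are illustrative, and only the values come from the quoted experiment details:

```python
# Hyperparameters reported in the paper's experiment setup.
# Key names are illustrative; values are the ones quoted above.
config = {
    "latent_dim": 10,        # latent dimension, same for all datasets
    "ode_solver": "dopri5",  # Dormand-Prince solver from torchdiffeq
    "epochs": 200,
    "batch_size": 200,
    "optimizer": "Adam",
    "learning_rate": 0.01,
}
```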