Information Bottleneck-guided MLPs for Robust Spatial-temporal Forecasting

Authors: Min Chen, Guansong Pang, Wenjun Wang, Cheng Yan

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Comprehensive experimental results show that an excellent trade-off between robustness and efficiency can be achieved by RSTIB-MLP compared to state-of-the-art STGNNs and MLP models. Comprehensive experiments on STF benchmark datasets from various domains under both noisy and clean evaluations demonstrate that 1) RSTIB-MLP achieves better, or comparably good, robustness compared with state-of-the-art (SOTA) STGNNs, while being substantially more computationally efficient, and 2) RSTIB-MLP is much more robust than SOTA MLP-based models while being comparably efficient.
Researcher Affiliation | Academia | Min Chen*¹, Guansong Pang*², Wenjun Wang¹, Cheng Yan¹ ... ¹Tianjin University, ²Singapore Management University. Correspondence to: Cheng Yan <EMAIL>.
Pseudocode | Yes | Algorithm 1: RSTIB-MLP for Spatial-Temporal Forecasting
Open Source Code | Yes | Our code is publicly available at https://github.com/mchen644/RSTIB.
Open Datasets | Yes | Datasets. For demonstrating universality, we consider six datasets from different domains: PEMS04, PEMS07, PEMS08 (Fang et al., 2021; Guo et al., 2019; Song et al., 2020; Yu et al., 2018), LargeST (SD) (Liu et al., 2024a), Weather2K-R (Zhu et al., 2023b), and Electricity (Deng et al., 2021). ... Publicly accessible data can be found in (Guo et al., 2021b): https://github.com/guoshnBJTU/ASTGNN/tree/main/data ... LargeST (Liu et al., 2024a): publicly available at https://github.com/liuxu77/LargeST. Weather2K-R (Zhu et al., 2023b): publicly available at https://github.com/bycnfz/weather2k. Electricity (Deng et al., 2021): publicly available at https://github.com/JLDeng/ST-Norm.
Dataset Splits | Yes | For the Electricity dataset, we adopt the same training/validation/test split ratio as in (Deng et al., 2021); for all other datasets, we adopt a 6:2:2 split to ensure consistency.
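The 6:2:2 ratio quoted above can be sketched as a chronological split, which is the usual convention for spatial-temporal forecasting benchmarks. This is an illustrative stand-in; the authors' actual data pipeline is in their repository, and the function name here is hypothetical.

```python
import numpy as np

def split_622(data: np.ndarray):
    """Chronological 6:2:2 train/val/test split over the time axis.

    `data` is assumed to be a (T, ...) array ordered by time; this is a
    sketch of the quoted ratio, not the authors' loader.
    """
    t = len(data)
    train_end = int(t * 0.6)
    val_end = int(t * 0.8)
    return data[:train_end], data[train_end:val_end], data[val_end:]

# Toy series of 10 timesteps: 6 train, 2 validation, 2 test points.
train, val, test = split_622(np.arange(10))
```

A chronological (rather than shuffled) split avoids leaking future observations into training, which matters for forecasting evaluation.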
Hardware Specification | Yes | All evaluations are conducted on an NVIDIA RTX 3090Ti GPU. ... The models are trained on NVIDIA GeForce RTX 3090Ti GPUs.
Software Dependencies | Yes | We adopt PyTorch 1.13.1 on NVIDIA RTX 3090Ti GPUs, utilizing the PyTorch framework (Paszke et al., 2019).
Experiment Setup | Yes | Implementation Details. For the basic settings, we employ a hidden dimension d = 64 and an MLP architecture with L = 3 layers. For the PEMS and LargeST (SD) benchmark datasets, we use historical traffic flow data with window length P = 12 to forecast future traffic flow with window length F = 12; for the Electricity dataset, we follow the default settings in (Deng et al., 2021), i.e., P = 16 and F = 3, and calculate the average predictive accuracy over 1-, 2-, and 3-hour horizons. ... The learning rate is initialized as η = 0.002 with a decay factor r = 0.5. ... A summary of the default hyperparameter settings is in Table 9.
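The backbone implied by the quoted settings (an L = 3-layer MLP with hidden dimension d = 64 mapping a P = 12-step history to an F = 12-step forecast) can be sketched as follows. This is a minimal illustrative stand-in in numpy, not the authors' RSTIB-MLP, which additionally incorporates the information-bottleneck objective; all variable names are assumptions.

```python
import numpy as np

# Quoted defaults: hidden dim d=64, L=3 layers, input window P=12,
# forecast window F=12 (PEMS / LargeST (SD) settings).
P, F, d, L = 12, 12, 64, 3

rng = np.random.default_rng(0)
dims = [P] + [d] * (L - 1) + [F]  # 12 -> 64 -> 64 -> 12
weights = [rng.standard_normal((i, o)) * 0.01 for i, o in zip(dims, dims[1:])]
biases = [np.zeros(o) for o in dims[1:]]

def forecast(x: np.ndarray) -> np.ndarray:
    """Forward pass: ReLU between hidden layers, linear output head."""
    h = x
    for k, (W, b) in enumerate(zip(weights, biases)):
        h = h @ W + b
        if k < L - 1:  # no activation on the final (output) layer
            h = np.maximum(h, 0.0)
    return h

# A batch of 8 univariate series, each with a P-step history,
# mapped to F-step forecasts.
y = forecast(rng.standard_normal((8, P)))
```

Per the quoted row, training would use an initial learning rate of 0.002 decayed by a factor of 0.5; the Electricity dataset swaps in P = 16 and F = 3.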