Slimming the Fat-Tail: Morphing-Flow for Adaptive Time Series Modeling
Authors: Tianyu Liu, Kai Sun, Fuchun Sun, Yu Luo, Yuanlong Zhang
ICML 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments across eight datasets show that MoF achieves state-of-the-art performance: with a simple linear backbone architecture, it matches the performance of state-of-the-art models on datasets such as Electricity and ETTh2. When paired with a patch-based Mamba architecture, MoF outperforms its closest competitor by 6.3% on average and reduces forecasting errors in fat-tailed datasets such as Exchange by 21.7%. |
| Researcher Affiliation | Collaboration | 1 Department of Computer Science and Technology, Institute for Artificial Intelligence, BNRist Center, Tsinghua University, Beijing, China; 2 School of Biomedical Engineering, Tsinghua University; 3 Huawei Noah's Ark Lab; 4 Department of Automation, Tsinghua University. |
| Pseudocode | Yes | Algorithm 1 (MoF Module Pseudocode). Input: sequence x of shape [C, T]; Output: transformed sequence x of shape [C, T]. 1) Apply the flow transformation to the input sequence: x_flow = F(x \| W_flow). 2) Feed x_flow into the temporal morph module: x_mod = Morph(x_flow), where x_mod has shape [B, 2]. 3) Transform the sequence using the modified parameters: x = F(x \| W_flow modulated by x_mod). Return x. |
| Open Source Code | No | The text does not provide an explicit statement about the release of source code for the methodology described in this paper, nor does it provide a link to a code repository. |
| Open Datasets | Yes | Datasets: We conduct experiments on eight widely used multivariate time series forecasting datasets, including Electricity Transformer Temperature (ETTh1, ETTh2, ETTm1, and ETTm2) (Zhou et al., 2021), Electricity, Traffic, Weather (Wu et al., 2021), and Exchange Rate... Appendix C.1 provides specific links and citations for each dataset: [1] https://arxiv.org/abs/2012.01655 [2] https://www.kaggle.com/datasets/jessemostipak/traffic [3] https://archive.ics.uci.edu/ml/datasets/Electricity_Load_Diagrams2011_2014 [4] https://www.kaggle.com/datasets/loyolacoding/exchange-rate-data [5] https://www.kaggle.com/datasets/muthuj7/weather-dataset [6] https://www.cdc.gov/flu/weekly/index.htm |
| Dataset Splits | Yes | To ensure consistency, we follow the standard protocol (Liu et al., 2023) and split the datasets into training, validation, and test sets with a 6:2:2 ratio for ETT datasets and a 7:1:2 ratio for the remaining datasets. |
| Hardware Specification | Yes | All experiments were conducted using a single NVIDIA Titan RTX GPU paired with an Intel i9-10900KB CPU. |
| Software Dependencies | No | The paper mentions the use of 'fvcore package' for FLOPs measurement, but does not provide specific version numbers for any key software components or libraries used for implementing the methodology. |
| Experiment Setup | Yes | Evaluation Protocol: Following the standard Autoformer protocol (Wu et al., 2021)... The historical horizon length is set to T = 336, with prediction lengths F ∈ {96, 192, 336, 720}... Detailed hyperparameters for MoF-Linear and MoF-Mamba are included in Appendix D. D.2. Model Hyperparameters: ...the number of bins for the spline (spline_num_bins) is set to 24, the tail length of the spline (spline_tail) is set to 6.0, and the size of the b-matrix for Test-Time Training (TTT) is set to 92 for all datasets. Additionally, the masking ratio p for self-supervised Test-Time Training (TTT) is set to 0.5. |
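The control flow of Algorithm 1 quoted in the Pseudocode row can be sketched as follows. This is a minimal, hypothetical stand-in: the paper's F is a spline-based normalizing flow and Morph is a learned temporal module, but here a toy element-wise affine map and a statistics-based modulation take their places so the flow-morph-flow structure is runnable; the additive combination rule is an assumption.

```python
import numpy as np

def flow(x, params):
    """Toy stand-in for the flow F(x | W): element-wise affine map."""
    scale, shift = params
    return x * scale + shift

def morph(x_flow):
    """Toy stand-in for the temporal morph module: derives a length-2
    modulation vector [d_scale, d_shift] from sequence statistics."""
    return np.array([x_flow.std(), x_flow.mean()])

def mof_forward(x, w_flow):
    # Step 1: apply the flow transformation with base parameters.
    x_flow = flow(x, w_flow)
    # Step 2: the morph module produces a 2-vector of modulations.
    x_mod = morph(x_flow)
    # Step 3: re-apply the flow with the modulated parameters
    # (additive modulation is an assumption, not the paper's rule).
    w_adapted = (w_flow[0] + x_mod[0], w_flow[1] + x_mod[1])
    return flow(x, w_adapted)

x = np.random.default_rng(0).normal(size=(7, 336))  # [C, T]
y = mof_forward(x, w_flow=(1.0, 0.0))
print(y.shape)  # output keeps the input's [C, T] shape
```

The key point the sketch preserves is that the second pass through F reuses the same input x but with parameters adapted by the morph module's output.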
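The Dataset Splits row states a chronological 6:2:2 split for the ETT datasets and 7:1:2 for the rest. A minimal sketch of such a split, assuming integer-floor boundaries (the paper does not specify rounding):

```python
def chrono_split(series_len, ratios=(7, 1, 2)):
    """Split indices [0, series_len) chronologically by integer ratios."""
    total = sum(ratios)
    n_train = series_len * ratios[0] // total
    n_val = series_len * ratios[1] // total
    train = range(0, n_train)
    val = range(n_train, n_train + n_val)
    test = range(n_train + n_val, series_len)  # remainder goes to test
    return train, val, test

# ETT-style 6:2:2 split over a hypothetical 10,000-step series.
train, val, test = chrono_split(10_000, ratios=(6, 2, 2))
print(len(train), len(val), len(test))  # 6000 2000 2000
```

Chronological (rather than shuffled) splitting matters for forecasting: the test set must lie strictly after the training data in time.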
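The Experiment Setup row reports a masking ratio p = 0.5 for self-supervised test-time training. A hypothetical sketch of that masking step; the granularity (whole timesteps) and fill value (zero) are assumptions not stated in the quoted text:

```python
import numpy as np

def mask_for_ttt(x, p=0.5, seed=0):
    """Zero out a random fraction p of timesteps in x of shape [C, T]."""
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape[-1]) < p   # True marks a masked timestep
    x_masked = x.copy()
    x_masked[:, mask] = 0.0              # mask the same steps in every channel
    return x_masked, mask

x = np.ones((3, 336))
x_masked, mask = mask_for_ttt(x, p=0.5)
print(x_masked.shape)  # masking preserves the [C, T] shape
```

In a self-supervised TTT loop, the model would be asked to reconstruct the masked timesteps from the unmasked ones before producing its forecast.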