Perceptually Constrained Precipitation Nowcasting Model

Authors: Wenzhi Feng, Xutao Li, Zhe Wu, Kenghong Lin, Demin Yu, Yunming Ye, Yaowei Wang

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We theoretically demonstrate the reliability of our solution, and experimental results on two publicly available radar datasets demonstrate that our model is effective and outperforms current state-of-the-art models."
Researcher Affiliation | Academia | "¹School of Computer Science and Technology, Harbin Institute of Technology, Shenzhen, China; ²Pengcheng Laboratory, Shenzhen, China; ³Shenzhen Key Laboratory of Internet Information Collaboration, School of Computer Science and Technology, Harbin Institute of Technology, Shenzhen, China. Correspondence to: Xutao Li <EMAIL>."
Pseudocode | Yes | "Algorithm 1 Training Process. Input: data X = [x_i]_{i=1}^{L_in}, x_i ∈ R^{H×W×C}; initialize: k, τ; repeat ..."
Open Source Code | No | The paper does not contain any explicit statement about releasing source code or a link to a code repository.
Open Datasets | Yes | "To validate the effectiveness of the proposed model, we conduct experiments on two real precipitation datasets... SEVIR: The SEVIR dataset (Veillette et al., 2020) includes satellite images, NEXRAD VIL radar echograms, and lightning data... MeteoNet: The MeteoNet dataset (Larvor et al., 2020) consists of radar and satellite images, ground-based observations, and meteorological data."
Dataset Splits | Yes | "Both datasets are divided into training, validation, and test sets. SEVIR: The data is split into training (October 2018 – August 2019), validation (September – October 2019), and test (October – November 2019) sets. MeteoNet: The data is divided into training (October 2016 – August 2019), validation (September – October 2019), and test (September – October 2019) sets."
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts) used for running its experiments.
Software Dependencies | No | The paper mentions the AdamW optimizer but does not specify version numbers for any programming languages or libraries (e.g., Python, PyTorch, TensorFlow, CUDA).
Experiment Setup | Yes | "We employ a cosine learning rate schedule to train the model, with a maximum learning rate of 1e-4 and a minimum learning rate of 1e-7. The warm-up ratio is set to 20%, with the warm-up learning rate set to 3e-4. The model is trained for 100 epochs using the AdamW optimizer."
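The schedule quoted above can be sketched as a small Python function. The constants (max LR 1e-4, min LR 1e-7, 20% warm-up ratio, warm-up LR 3e-4) are taken from the paper's stated setup; the linear warm-up shape, the per-step interpolation, and the function/parameter names are assumptions made here for illustration, not the authors' implementation.

```python
import math

def lr_at(step, total_steps, max_lr=1e-4, min_lr=1e-7,
          warmup_ratio=0.2, warmup_lr=3e-4):
    """Learning rate at `step`: linear warm-up, then cosine decay.

    Constants match the paper's quoted setup; the warm-up shape is an
    assumption (the paper does not describe it beyond the ratio and LR).
    """
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Linearly interpolate from the warm-up LR to the maximum LR.
        frac = step / max(warmup_steps, 1)
        return warmup_lr + frac * (max_lr - warmup_lr)
    # Cosine decay from max_lr down to min_lr over the remaining steps.
    frac = (step - warmup_steps) / max(total_steps - warmup_steps, 1)
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * frac))
```

For example, with 100 total steps the schedule starts at 3e-4, reaches 1e-4 at step 20, and decays to 1e-7 by the final step.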