Risk-Sensitive Diffusion: Robustly Optimizing Diffusion Models with Noisy Samples

Authors: Yangming Li, Max Ruiz Luyten, Mihaela van der Schaar

ICLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "To verify the effectiveness of our method, we have conducted extensive experiments on multiple tabular and time-series datasets, showing that risk-sensitive SDE permits a robust optimization of diffusion models with noisy samples and significantly outperforms previous baselines."
Researcher Affiliation | Academia | "Yangming Li, Max Ruiz Luyten, Mihaela van der Schaar; Department of Applied Mathematics and Theoretical Physics, University of Cambridge; EMAIL"
Pseudocode | Yes | Algorithm 1 (Training) and Algorithm 2 (Sampling) appear in Section 3.3 of the paper, detailing the optimization and sampling procedures.
Open Source Code | Yes | "We have publicly released the code at https://github.com/LeePleased/rdm."
Open Datasets | Yes | "We adopt 2 medical time series datasets: MIMIC-III (Johnson et al., 2016) and WARDS (Alaa et al., 2017). [...] We adopt 3 UCI datasets (Asuncion & Newman, 2007): Abalone, Telemonitoring, and Mushroom."
Dataset Splits | No | The paper mentions training on the datasets but does not explicitly give training/validation/test splits, their percentages, or sample counts needed for reproducibility.
Hardware Specification | No | The paper does not explicitly state the hardware (e.g., GPU models, CPU types, memory) used to run the experiments.
Software Dependencies | No | The paper does not explicitly list software dependencies with version numbers used to implement or run the experiments.
Experiment Setup | No | The main text does not explicitly provide concrete hyperparameter values (e.g., learning rates, batch sizes, number of epochs) or detailed optimizer settings.