Physics-Informed Diffusion Models

Authors: Jan-Hendrik Bastek, WaiChing Sun, Dennis Kochmann

ICLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our approach reduces the residual error by up to two orders of magnitude compared to previous work in a fluid flow case study and outperforms task-specific frameworks in relevant metrics for structural topology optimization. We also present numerical evidence that our extended training objective acts as a natural regularization mechanism against overfitting.
Researcher Affiliation | Academia | Jan-Hendrik Bastek, Dept. of Mechanical and Process Eng., ETH Zurich, Zurich, Switzerland, EMAIL; Wai Ching Sun, Dept. of Civil Eng. and Eng. Mechanics, Columbia University, New York, NY, USA, EMAIL; Dennis M. Kochmann, Dept. of Mechanical and Process Eng., ETH Zurich, Zurich, Switzerland, EMAIL
Pseudocode | Yes | Algorithm 1: Physics-informed diffusion model training
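Algorithm 1 itself is not reproduced in this report. As a generic illustration only (not the paper's algorithm), the idea of augmenting the standard denoising loss with a penalty on the governing-equation residual of the denoised sample can be sketched as below; the names `pde_residual`, `physics_informed_loss`, and `residual_weight`, and the use of a discrete Laplacian as the residual, are all assumptions for this sketch.

```python
import numpy as np

def pde_residual(field):
    # Hypothetical stand-in for the governing-equation residual:
    # a periodic 5-point discrete Laplacian of the predicted field.
    return (np.roll(field, 1, axis=0) + np.roll(field, -1, axis=0)
            + np.roll(field, 1, axis=1) + np.roll(field, -1, axis=1)
            - 4.0 * field)

def physics_informed_loss(eps_pred, eps_true, x0_pred, residual_weight=0.1):
    # Standard DDPM noise-matching term ...
    denoising = np.mean((eps_pred - eps_true) ** 2)
    # ... plus a penalty on the PDE residual of the denoised sample.
    physics = np.mean(pde_residual(x0_pred) ** 2)
    return denoising + residual_weight * physics
```

The second term is what distinguishes this objective from plain denoising-diffusion training; the report's regularization claim refers to this extra term.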
Open Source Code | Yes | Code is available at https://github.com/jhbastek/PhysicsInformedDiffusionModels.
Open Datasets | Yes | We again consider a square domain Ω = [0, 1]² and benchmark our proposed PIDM against state-of-the-art frameworks (Mazé & Ahmed, 2023; Giannone et al., 2023), which also provide a dataset consisting of 30,000 optimized structures with various boundary conditions and volume constraints and two proposed test scenarios with in- and out-of-distribution boundary conditions.
Dataset Splits | Yes | We create a training and a validation dataset of 10,000 and 1,000 datapoints, respectively, by solving the governing equations (see equation 29) for a sampled permeability field on a 64 × 64 grid.
Hardware Specification | Yes | All models were trained on a single Nvidia Quadro RTX 6000 GPU equipped with 24 GB of GDDR6 memory.
Software Dependencies | No | The model is implemented and trained using PyTorch (Paszke et al., 2019). Finite-difference stencils are implemented via torch.nn.Conv2d (Paszke et al., 2019) with a custom kernel, which we can precompute for stencils up to arbitrary order via findiff (Baer, 2018). We solve for the over-determined pressure field using the scipy.linalg.lstsq (Virtanen et al., 2020) solver with default settings.
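The two numerical ingredients named in this excerpt can be sketched as follows. The paper applies findiff-generated stencils through torch.nn.Conv2d; this self-contained sketch instead uses SciPy, with a hard-coded second-order 5-point Laplacian (what findiff would generate at lowest order) and a small toy over-determined system for scipy.linalg.lstsq. None of this is the paper's code.

```python
import numpy as np
from scipy.signal import convolve2d
from scipy.linalg import lstsq

# Precomputed 5-point stencil for the 2-D Laplacian d²/dx² + d²/dy²
# (second-order accurate; findiff can generate higher-order variants).
stencil = np.array([[0.0,  1.0, 0.0],
                    [1.0, -4.0, 1.0],
                    [0.0,  1.0, 0.0]])

x = np.linspace(0.0, 1.0, 64)
X, Y = np.meshgrid(x, x, indexing="ij")
u = X**2 + Y**2            # analytic Laplacian of u is 4 everywhere
h = x[1] - x[0]

# Apply the stencil by 2-D convolution; "valid" keeps interior points only.
lap_u = convolve2d(u, stencil, mode="valid") / h**2

# Over-determined least-squares solve (4 equations, 3 unknowns),
# analogous to solving for the over-determined pressure field.
A = np.vstack([np.eye(3), np.ones((1, 3))])
b = A @ np.array([1.0, 2.0, 3.0])
p, *_ = lstsq(A, b)
```

The stencil is exact for quadratic fields, so `lap_u` equals 4 at every interior point; in the paper's setting the same convolution evaluates PDE residuals on the 64 × 64 grid.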
Experiment Setup | Yes | We train the model for 400 epochs on 10,000 randomly sampled points of the unit circle, using the Adam optimizer (Kingma & Ba, 2014) with a learning rate of 5 × 10⁻⁴. We use a batch size of 128 and 100 diffusion timesteps with a cosine scheduler (Dhariwal & Nichol, 2021).
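The cosine scheduler referenced in this setup can be sketched as below, using the standard cosine noise schedule of Nichol & Dhariwal with the reported 100 timesteps; this is a textbook implementation of that schedule, not code from the paper, and the offset `s=0.008` is the value from the original schedule, assumed rather than confirmed here.

```python
import numpy as np

def cosine_alpha_bar(T=100, s=0.008):
    # Cosine schedule: cumulative signal level alpha_bar(t) decays
    # smoothly from ~1 at t=0 to 0 at t=T.
    t = np.arange(T + 1)
    f = np.cos((t / T + s) / (1 + s) * np.pi / 2) ** 2
    return f / f[0]

alpha_bar = cosine_alpha_bar()
# Per-step noise levels, clipped as in the original implementation.
betas = np.clip(1.0 - alpha_bar[1:] / alpha_bar[:-1], 0.0, 0.999)
```

With T = 100 this yields one `beta` per diffusion timestep, matching the 100 timesteps quoted above.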