Arbitrarily-Conditioned Multi-Functional Diffusion for Multi-Physics Emulation

Authors: Da Long, Zhitong Xu, Guang Yang, Akil Narayan, Shandian Zhe

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate the effectiveness of ACM-FD across several fundamental multi-physics systems. The code is released at https://github.com/BayesianAIGroup/ACM-FD. (...) Experiments: We evaluated ACM-FD on four fundamental multi-physics systems. (...) Finally, a series of ablation studies confirmed the effectiveness of the individual components of our method.
Researcher Affiliation | Academia | 1) Kahlert School of Computing, University of Utah; 2) Department of Mathematics, University of Utah; 3) Scientific Computing and Imaging Institute, University of Utah. Correspondence to: Shandian Zhe <EMAIL>.
Pseudocode | Yes | Algorithm 1 Training(Z1, ..., ZM, p(H)) (...) Algorithm 2 Generation (conditioned: Fc, target: Fs, target locations: Zs, all locations: Z = {Zk})
Open Source Code | Yes | The code is released at https://github.com/BayesianAIGroup/ACM-FD.
Open Datasets | Yes | We used the 2D diffusion-reaction dataset provided by PDEBench (Takamoto et al., 2022).
Dataset Splits | Yes | We utilized 1,000 instances for training, 100 instances for validation, and 200 instances for testing.
Hardware Specification | Yes | All runtime experiments were conducted on a Linux cluster node equipped with an NVIDIA A100 GPU (40GB memory).
Software Dependencies | No | The paper mentions implementing parts in PyTorch ("reimplemented it using PyTorch. Our method, ACM-FD, was implemented with PyTorch as well.") and using tools such as the MATLAB PDE solver pdepe and scipy.interpolate.griddata, but it does not provide version numbers for any of these software components.
Experiment Setup | Yes | Hyperparameter tuning was performed using the validation set. The details are provided in Appendix Section B. (...) In Appendix Section B: ACM-FD: the hyperparameters include the number of modes, which varies over {12, 16, 18, 20, 24}; the number of channels for channel lifting, which varies over {64, 128, 256}; the number of Fourier layers, which varies over {3, 4, 5}; and the length-scale of the SE kernel, which varies over {1e-2, 5e-3, 1e-3, 5e-4, 1e-4}. We used GELU activation.
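The reported 1,000 / 100 / 200 split over the PDEBench diffusion-reaction instances can be sketched as below. This is an illustrative reconstruction only; the paper does not state whether the split was random or sequential, and the seed and variable names here are assumptions, not from the authors' code.

```python
import random

# Illustrative train/val/test split matching the counts reported in the review:
# 1,000 training, 100 validation, 200 test instances out of 1,300 total.
indices = list(range(1300))
random.Random(0).shuffle(indices)  # fixed seed, chosen here for reproducibility

train_idx = indices[:1000]
val_idx = indices[1000:1100]
test_idx = indices[1100:]

print(len(train_idx), len(val_idx), len(test_idx))  # 1000 100 200
```

A fixed-seed shuffle before slicing keeps the three subsets disjoint while making the split reproducible across runs.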
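The hyperparameter search space quoted from Appendix B can be enumerated as a simple grid. The sketch below is a minimal illustration, assuming each hyperparameter is tuned independently over the listed values; the dictionary keys are invented names, not identifiers from the authors' code.

```python
from itertools import product

# Search space reported for ACM-FD (values quoted from the paper's Appendix B).
grid = {
    "num_modes": [12, 16, 18, 20, 24],
    "lift_channels": [64, 128, 256],
    "fourier_layers": [3, 4, 5],
    "se_lengthscale": [1e-2, 5e-3, 1e-3, 5e-4, 1e-4],
}

def grid_configs(grid):
    """Yield every hyperparameter combination as a dict."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

configs = list(grid_configs(grid))
print(len(configs))  # 5 * 3 * 3 * 5 = 225 candidate configurations
```

Each configuration would then be trained and scored on the validation split, with the best-scoring one evaluated on the test set.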