Transolver++: An Accurate Neural Solver for PDEs on Million-Scale Geometries

Authors: Huakun Luo, Haixu Wu, Hang Zhou, Lanxiang Xing, Yichen Di, Jianmin Wang, Mingsheng Long

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimentally, Transolver++ yields 13% relative promotion across six standard PDE benchmarks and achieves over 20% performance gain in million-scale high-fidelity industrial simulations..." "We extensively evaluate Transolver++ on six standard benchmarks and two industrial-level datasets..."
Researcher Affiliation | Academia | "School of Software, BNRist, Tsinghua University. Huakun Luo <EMAIL>. Correspondence to: Mingsheng Long <EMAIL>."
Pseudocode | Yes | "Algorithm 1: Parallel Physics-Attention with Eidetic States"
Open Source Code | Yes | "Code is available at https://github.com/thuml/Transolver plus."
Open Datasets | Yes | "To further evaluate the model's efficacy in real applications, we also perform experiments on industrial design tasks, where we utilized DrivAerNet++ (Elrefaie et al., 2024) for car design and a newly simulated AirCraft dataset for 3D aircraft design."
Dataset Splits | Yes | "Table 9. Details of different benchmarks, including geometric type, number of mesh points, as well as the type of input and output, etc. The split of the dataset is also provided to ensure reproducibility, which is listed in the order of (training samples, test samples)." ELASTICITY (1000, 200); PLASTICITY (900, 80); AIRFOIL (1000, 200); PIPE (1000, 200); NAVIER-STOKES 2D (1000, 200); DARCY (1000, 200); DRIVAERNET++ (190, 10); AIRCRAFT (140, 10)
Hardware Specification | Yes | "Figure 1. (a) Comparison of model capability in handling large geometries. We plot the GPU memory change of each model when increasing input mesh points. The upper bound on a single A100 40GB GPU is depicted in the dotted line." "...our model is capable of handling 2.5 million meshes within 4 A100 GPUs..."
Software Dependencies | No | The paper mentions optimizers such as AdamW (2019) and Adam (2015) but does not provide specific version numbers for programming languages or libraries (e.g., Python, PyTorch, TensorFlow, CUDA).
Experiment Setup | Yes | "Table 10. Implementation details of Transolver++ including training and model configuration. Training configurations are identical to previous methods (Wu et al., 2024; Hao et al., 2023; Deng et al., 2024; Elrefaie et al., 2024) and shared across all baselines." Specific values are listed for LOSS, EPOCHS, INITIAL LR, OPTIMIZER, BATCH SIZE, LAYERS, HEADS, CHANNELS, and SLICES.
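To make the pseudocode row concrete: the algorithm named above builds on Transolver's slice-based physics attention, in which mesh points are softly assigned to a small number of learnable slices, attention is computed among the resulting slice tokens, and the tokens are broadcast back to the points. Below is a minimal NumPy sketch of that base mechanism only; the parallel execution and eidetic states of Algorithm 1 are omitted, and all weights, shapes, and the function name are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def physics_attention(x, n_slices=4, rng=None):
    """Sketch of slice-based physics attention on (N, C) mesh-point features."""
    rng = np.random.default_rng(0) if rng is None else rng
    N, C = x.shape
    W = rng.standard_normal((C, n_slices))

    # Soft assignment of each point to the learnable slices (row-wise softmax).
    logits = x @ W
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                       # (N, S)

    # Aggregate points into slice tokens via the weighted mean.
    tokens = (w.T @ x) / (w.sum(axis=0)[:, None] + 1e-9)    # (S, C)

    # Attention among the S slice tokens (single head, projections omitted),
    # so the cost is O(S^2) instead of O(N^2) over mesh points.
    att = tokens @ tokens.T / np.sqrt(C)
    att = np.exp(att - att.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)
    tokens = att @ tokens                                   # (S, C)

    # Broadcast updated slice tokens back to the mesh points.
    return w @ tokens                                       # (N, C)
```

Because attention runs over S slice tokens rather than N mesh points, memory scales linearly in N, which is what allows scaling toward million-point geometries.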
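The (train, test) splits quoted in the Dataset Splits row can be collected into a small Python mapping for quick sanity checks; this is a hypothetical convenience snippet, not taken from the paper's code.

```python
# (train samples, test samples) per benchmark, as quoted from Table 9.
SPLITS = {
    "ELASTICITY": (1000, 200),
    "PLASTICITY": (900, 80),
    "AIRFOIL": (1000, 200),
    "PIPE": (1000, 200),
    "NAVIER-STOKES 2D": (1000, 200),
    "DARCY": (1000, 200),
    "DRIVAERNET++": (190, 10),
    "AIRCRAFT": (140, 10),
}

total_train = sum(train for train, _ in SPLITS.values())  # 6230
total_test = sum(test for _, test in SPLITS.values())     # 1100
```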