Closed-form Solutions: A New Perspective on Solving Differential Equations
Authors: Shu Wei, Yanjie Li, Lina Yu, Weijun Li, Min Wu, Linjun Sun, Jingyi Liu, Hong Qin, Yusong Deng, Jufeng Han, Yan Pang
ICML 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Evaluations across a diverse set of ordinary and partial differential equations demonstrate that SSDE outperforms existing machine learning methods, delivering superior accuracy and efficiency in obtaining analytical solutions. |
| Researcher Affiliation | Academia | 1Ann Lab, Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China 2College of Materials Science and Opto-Electronic Technology, University of Chinese Academy of Sciences, Beijing, China 3School of Integrated Circuits, University of Chinese Academy of Sciences, Beijing, China 4School of Industry-Education Integration, University of Chinese Academy of Sciences, Beijing, China. Correspondence to: Lina Yu <EMAIL>, Min Wu <EMAIL>. |
| Pseudocode | Yes | Pseudocode for the proposed algorithms is included in Appendix A. |
| Open Source Code | Yes | To facilitate reproducibility and further research, we provide the complete implementation of SSDE, including training scripts, benchmark configurations, and symbolic expression evaluation tools. The source code is publicly available at: https://github.com/Hintonein/SSDE |
| Open Datasets | Yes | In addition, all benchmark datasets constructed for this study, including the high-dimensional PDEs and Poisson problems derived from Nguyen expressions, are included in the repository. The code is implemented in Python and relies on PyTorch and standard scientific computing libraries. |
| Dataset Splits | No | The paper does not explicitly provide training/validation/test dataset splits. Algorithm 1 mentions "Randomly sample the domain to build the dataset D = {x_f^i}_{i=1}^{N_F} ∪ {x_b^i, u_b^i}_{i=1}^{N_B} ∪ {x_0^i, u_0^i}_{i=1}^{N_I}" for evaluating candidate solutions, which describes sampling collocation, boundary, and initial points from a domain, but not splitting a global dataset of problems or specific percentages for data partitioning. |
| Hardware Specification | Yes | All experiments reported in this work were conducted on an Intel(R) Xeon(R) Gold 6138 CPU @ 2.00GHz. The algorithm implementation can also leverage GPU acceleration for improved computational efficiency. |
| Software Dependencies | No | The code is implemented in Python and relies on PyTorch and standard scientific computing libraries. It does not provide specific version numbers for Python, PyTorch, or other libraries. |
| Experiment Setup | Yes | The optimal hyperparameters identified through this tuning process are summarized in Table 5, and these configurations were consistently used across all benchmark experiments. Table 5 includes: learning rate η = 0.001, entropy weight λ_H = 0.07, entropy gamma γ = 0.7, RNN cell size 32, RNN cell layers 1, risk factor ϵ = 0.05, max epoch M = 200, batch size N = 1000, PDE constraint weight λ_0 = 1, BC constraint weight λ_1 = 1, IC constraint weight λ_2 = 1. |
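For reference, the Table 5 hyperparameters and the domain-sampling step quoted from Algorithm 1 can be sketched together in a few lines of Python. This is a minimal illustration only: the dictionary keys, function names, 1-D domain, and zero-valued boundary/initial placeholders are assumptions for this sketch, not taken from the released SSDE code.

```python
import random

# Hyperparameters as reported in Table 5 (key names are illustrative).
HPARAMS = {
    "learning_rate": 1e-3,   # η
    "entropy_weight": 0.07,  # λ_H
    "entropy_gamma": 0.7,    # γ
    "rnn_cell_size": 32,
    "rnn_cell_layers": 1,
    "risk_factor": 0.05,     # ϵ
    "max_epoch": 200,
    "batch_size": 1000,
    "pde_weight": 1.0,       # λ_0
    "bc_weight": 1.0,        # λ_1
    "ic_weight": 1.0,        # λ_2
}

def sample_dataset(n_f=100, n_b=20, n_i=20, lo=0.0, hi=1.0, seed=0):
    """Randomly sample a 1-D domain [lo, hi] to build an evaluation set
    D = interior (PDE residual) points ∪ boundary points ∪ initial points,
    mirroring the sampling step quoted from Algorithm 1. In practice the
    boundary/initial values u_b, u_0 come from the problem's known
    conditions; zeros are placeholders here."""
    rng = random.Random(seed)
    interior = [rng.uniform(lo, hi) for _ in range(n_f)]
    boundary = [(rng.choice([lo, hi]), 0.0) for _ in range(n_b)]
    initial = [(rng.uniform(lo, hi), 0.0) for _ in range(n_i)]
    return {"interior": interior, "boundary": boundary, "initial": initial}

D = sample_dataset()
```

Candidate closed-form expressions would then be scored on D with the weighted residual λ_0 (PDE) + λ_1 (BC) + λ_2 (IC) loss described in the paper; this sketch covers only the sampling and configuration, not the symbolic search itself.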