Finite Expression Method for Solving High-Dimensional Partial Differential Equations
Authors: Senwei Liang, Haizhao Yang
JMLR 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Numerical results will be provided to demonstrate the effectiveness of our FEX implementation introduced in Section 3.2 using two classical PDE problems: high-dimensional PDEs with constraints (such as Dirichlet boundary conditions and integration constraints) and eigenvalue problems. The computational tools for high-dimensional problems are very limited and NNs are probably the most popular ones. Therefore, FEX will be compared with NN-based solvers. |
| Researcher Affiliation | Academia | Senwei Liang EMAIL Department of Mathematics and Statistics Texas Tech University Lubbock, TX 79409, USA; Haizhao Yang EMAIL Department of Mathematics and Department of Computer Science University of Maryland, College Park College Park, MD 20742, USA |
| Pseudocode | Yes | In Appendix B, we provide pseudo-code for the FEX algorithm, which employs expanding trees to search for a solution. For the reader's convenience, we have summarized the key notations used in this section in Table 1. |
| Open Source Code | Yes | Source codes for reproducing the results in this paper are available online at: https://github.com/LeungSamWai/Finite-expression-method. The source codes are released under MIT license. |
| Open Datasets | No | No data is generated in this work. |
| Dataset Splits | No | The paper does not use external datasets in the traditional sense; it solves partial differential equations (PDEs) numerically, so the concept of training/validation/test splits for a fixed dataset does not apply. The 'data' for the experiments are points sampled from the domain to evaluate the PDE; the paper mentions a 'batch size for the interior and boundary' for this sampling, not for dataset splits. |
| Hardware Specification | No | The paper does not provide specific hardware details such as CPU/GPU models, memory, or specific cloud resources used for running the experiments. It only mentions general computational tools. |
| Software Dependencies | No | The paper mentions several software components and algorithms like Adam, BFGS, and the Python package GPlearn, but it does not specify version numbers for any of these to ensure reproducibility. For example, it mentions 'Python package GPlearn (Stephens, 2017)' but without a version number for gplearn itself. |
| Experiment Setup | Yes | This part provides the settings of FEX and the NN-based solvers. The depth-3 binary tree (Figure 2b) with 3 unary operators and 1 binary operator is used to generate mathematical expressions. The binary set is B = {+, -, *} and the unary set is U = {0, 1, Id, (·)^2, (·)^3, (·)^4, exp, sin, cos}. A fully connected NN is used as a controller χΦ with constant input (see Appendix C for more details). ... Score computation. The score is updated first by Adam with a learning rate of 0.001 for T1 = 20 iterations and then by BFGS with a learning rate of 1 for a maximum of T2 = 20 iterations. ... Controller update. The batch size for the policy gradient update is N = 10 and the controller is trained for 1000 iterations using Adam with a fixed learning rate of 0.002. ... Candidate optimization. The candidate pool capacity is set to K = 10. For any e ∈ P, the parameter θ is optimized using Adam with an initial learning rate of 0.01 for T3 = 20,000 iterations. The learning rate decays according to the cosine decay schedule (He et al., 2019). ... Implementation of NN-based solvers. Residual networks (ResNets) ... consist of seven fully connected layers with three skip connections. Each hidden layer contains 50 neurons. The neural network is optimized using the Adam optimizer with an initial learning rate of 0.001 for 15,000 iterations. The learning rate is decayed following a cosine decay schedule. |
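The operator sets quoted in the experiment-setup row can be made concrete with a minimal sketch. This is not the authors' implementation: the tree wiring below (one binary node fed by two unary leaves, followed by one unary root) is a simplified stand-in for the paper's depth-3 tree, chosen only to show how choices from B = {+, -, *} and U = {0, 1, Id, (·)^2, (·)^3, (·)^4, exp, sin, cos} compose into a finite expression.

```python
import math

# Operator sets from the paper's experiment setup.
BINARY_OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
}
UNARY_OPS = {
    "0":     lambda v: 0.0,
    "1":     lambda v: 1.0,
    "Id":    lambda v: v,
    "(.)^2": lambda v: v ** 2,
    "(.)^3": lambda v: v ** 3,
    "(.)^4": lambda v: v ** 4,
    "exp":   math.exp,
    "sin":   math.sin,
    "cos":   math.cos,
}

def eval_tree(x, u1, u2, b, u3):
    """Evaluate u3(b(u1(x), u2(x))) -- an assumed, simplified tree shape
    used here only to illustrate how operator choices compose."""
    return UNARY_OPS[u3](BINARY_OPS[b](UNARY_OPS[u1](x), UNARY_OPS[u2](x)))

# Example: the operator choices ((.)^2, sin, +, cos) realize cos(x^2 + sin(x)).
value = eval_tree(0.5, "(.)^2", "sin", "+", "cos")
```

In the actual FEX search, the controller χΦ samples one operator per tree node and the resulting expression's trainable parameters θ are then optimized against the PDE loss.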
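The candidate-optimization row cites a cosine decay schedule (He et al., 2019) for the learning rate. A common form of that schedule, assumed here since the paper's exact variant is not quoted, decays the initial rate to zero over half a cosine period:

```python
import math

def cosine_decay_lr(lr0: float, step: int, total_steps: int) -> float:
    """Assumed cosine decay: lr(t) = lr0 * 0.5 * (1 + cos(pi * t / T)).
    Starts at lr0, passes lr0/2 at the midpoint, and reaches 0 at step T."""
    return 0.5 * lr0 * (1.0 + math.cos(math.pi * step / total_steps))

# Paper's candidate-optimization settings: initial lr 0.01 over T3 = 20,000 steps.
lr0, T3 = 0.01, 20_000
schedule = [cosine_decay_lr(lr0, t, T3) for t in (0, T3 // 2, T3)]
```

The same schedule shape would also cover the NN baseline (initial lr 0.001 over 15,000 iterations), though again the repository, not this sketch, is authoritative.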