Semi-Implicit Neural Ordinary Differential Equations

Authors: Hong Zhang, Ying Liu, Romit Maulik

AAAI 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We validate and evaluate the performance of our methods when learning stiff ODE systems. Throughout this section we compare our methods with a variety of baseline methods including explicit and implicit methods. The neural network architectures we use follow the best settings identified in previous works (Linot et al. 2022; Chamberlain et al. 2021). The only modification necessary for using our method is to split the ODE right-hand side. Code is available at https://github.com/caidao22/pnode. 5.1 Graph Classification with GRAND ... 5.2 Learning Dynamics for the Kuramoto-Sivashinsky (KS) Equation ... 5.3 Learning Dynamics for the Viscous Burgers Equation
Researcher Affiliation | Academia | Hong Zhang (1), Ying Liu (2), Romit Maulik (1,3); (1) Argonne National Laboratory, (2) University of Iowa, (3) Pennsylvania State University. EMAIL, EMAIL, EMAIL
Pseudocode | No | The paper describes the forward and backward passes using equations and detailed explanations, but does not present a structured pseudocode or algorithm block.
Open Source Code | Yes | Code is available at https://github.com/caidao22/pnode.
Open Datasets | Yes | For assessment, we choose three benchmark datasets: Cora, Coauthor CS, and Photo.
Dataset Splits | No | The paper mentions using benchmark datasets like Cora, Coauthor CS, and Photo, and refers to a 'testing dataset' for the viscous Burgers equation, but does not specify the training/validation/test splits used for these experiments.
Hardware Specification | No | The paper mentions using computing resources provided by the Joint Laboratory for System Evaluation (JLSE) at Argonne National Laboratory, but does not provide specific hardware details such as GPU/CPU models or memory amounts.
Software Dependencies | No | SINODE is implemented in the PNODE framework (Zhang and Zhao 2022) that integrates PyTorch and PETSc seamlessly. We have implemented these algorithms as off-the-shelf solvers in PETSc (Balay et al. 2023).
Experiment Setup | Yes | Because of the stability constraints, we have to utilize a step size of 0.005 for explicit methods, while the implicit methods and the IMEX methods allow us to use a step size of 1 thanks to their superior stability properties. A time step size 0.2 is used for the IMEX methods and the fully implicit method. ... we choose a time step size of 0.05 as a conservative choice for the four IMEXRK methods and the fully implicit method.
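The step-size gap quoted above comes from the IMEX splitting the paper uses: the stiff part of the right-hand side is treated implicitly, the rest explicitly. The following is a minimal toy sketch of that idea (not the paper's code, and not PNODE/PETSc): a scalar test ODE with an assumed stiff coefficient `LAMBDA = -1000`, stepped with first-order IMEX Euler versus fully explicit Euler at the same step size.

```python
import math

LAMBDA = -1000.0  # assumed stiff coefficient for this toy problem (not from the paper)

def f_stiff(y):
    """Stiff linear part of the split right-hand side, handled implicitly."""
    return LAMBDA * y

def f_nonstiff(t, y):
    """Nonstiff forcing part, handled explicitly."""
    return math.sin(t)

def imex_euler_step(t, y, h):
    # Implicit in the stiff term, explicit in the rest:
    #   y_new = y + h * (LAMBDA * y_new + f_nonstiff(t, y))
    # which solves in closed form for this linear stiff part.
    return (y + h * f_nonstiff(t, y)) / (1.0 - h * LAMBDA)

def explicit_euler_step(t, y, h):
    # Fully explicit Euler: stable only for |1 + h*LAMBDA| <= 1.
    return y + h * (f_stiff(y) + f_nonstiff(t, y))

def integrate(step, h, t_end=1.0, y0=1.0):
    t, y = 0.0, y0
    while t < t_end - 1e-12:
        y = step(t, y, h)
        t += h
    return y

# With h = 0.05 the explicit stability limit h < 2/|LAMBDA| = 0.002 is violated,
# so explicit Euler diverges while the IMEX step stays bounded.
y_imex = integrate(imex_euler_step, h=0.05)
y_expl = integrate(explicit_euler_step, h=0.05)
```

This mirrors the trade-off reported in the table: the implicit treatment of the stiff split removes the severe explicit stability restriction, which is why the paper can use step sizes orders of magnitude larger for the IMEX and fully implicit schemes.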