GIT-Net: Generalized Integral Transform for Operator Learning

Authors: Chao Wang, Alexandre H. Thiery

TMLR 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting both small test errors and low evaluation costs across a range of PDE problems. This stands in contrast to existing neural network operators, which typically excel in only one of these areas.
Researcher Affiliation | Academia | Chao Wang, Department of Statistics and Data Science, National University of Singapore; Alexandre Hoang Thiery, Department of Statistics and Data Science, National University of Singapore.
Pseudocode | No | The paper describes procedures and architectures using prose and diagrams (e.g., Section 2.1 describes the GIT mapping in numbered steps, and Figure 1 illustrates it), but it does not contain a formally labeled pseudocode or algorithm block with structured steps.
Open Source Code | Yes | "Codes and datasets are publicly available" (GitHub: https://github.com/chaow-mat/General_Integral_Transform_Neural_Network).
Open Datasets | Yes | "Codes and datasets are publicly available" (GitHub: https://github.com/chaow-mat/General_Integral_Transform_Neural_Network).
Dataset Splits | Yes | To evaluate the performance of the different methods, four training datasets of respective sizes Ntrain ∈ {2500, 5000, 10000, 20000} were generated for training; the methods were evaluated on independent test sets.
Hardware Specification | No | The paper does not provide specific hardware details such as GPU models, CPU types, or memory specifications used for running the experiments. It discusses computational costs but not the hardware on which these computations were performed.
Software Dependencies | No | The paper mentions the ADAM optimizer (Kingma & Ba, 2015), the GELU nonlinearity (Hendrycks & Gimpel, 2016), and automatic differentiation frameworks (Bradbury et al., 2018; Paszke et al., 2017), i.e., JAX and PyTorch, but it does not specify version numbers for any of these software components.
Experiment Setup | Yes | "In the numerical experiments presented in Section 5, we used the ADAM optimizer (Kingma & Ba, 2015). The number L ≥ 1 of GIT layers is fixed at L = 3, and we used C ∈ {2, 4, 8, 16, 32} and K ∈ {16, 64, 128, 256, 512}. The GELU nonlinear activation function (Hendrycks & Gimpel, 2016) was used to implement GIT-Net. Following the setup of the Fourier Neural Operator (FNO) in de Hoop et al. (2022), we used twelve Fourier modes and three Fourier neural layers (L = 3 in (20)). For PCA-Net, as in de Hoop et al. (2022), we use four internal layers (a 4-layer MLP in (17))."
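The hyperparameter sweep quoted above (L = 3 GIT layers with C and K varied jointly) can be sketched as a configuration grid. This is a minimal illustration, not code from the paper's repository; the dictionary keys (`layers`, `channels`, `transform_size`) are hypothetical names chosen for clarity.

```python
from itertools import product

# Values reported in the paper's experiment setup (Section 5):
# L = 3 GIT layers, channel counts C, and transform sizes K.
L = 3
C_values = [2, 4, 8, 16, 32]
K_values = [16, 64, 128, 256, 512]

# Enumerate every (C, K) pair; per the paper, each configuration would be
# trained with the ADAM optimizer and GELU activations.
configs = [
    {"layers": L, "channels": c, "transform_size": k}
    for c, k in product(C_values, K_values)
]

print(len(configs))  # 25 configurations in the sweep
```

Enumerating the grid this way makes the size of the sweep explicit: 5 channel counts times 5 transform sizes gives 25 (C, K) configurations, each sharing the fixed depth L = 3.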