Lie Algebra Canonicalization: Equivariant Neural Operators under arbitrary Lie Groups

Authors: Zakhar Shumaylov, Peter Zaika, James Rowbottom, Ferdia Sherry, Melanie Weber, Carola-Bibiane Schönlieb

ICLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we showcase LieLAC's efficacy on tasks of invariant image classification and Lie point symmetry equivariant neural PDE solvers using pre-trained models." ... "Table 1: MNIST test accuracy for affine and homography groups..." "Table 2: ACE test error evaluated over 90 trajectories..." "Figure 2: Canonicalization pipeline for numerical PDE evolution..."
Researcher Affiliation | Academia | Zakhar Shumaylov*¹, Peter Zaika*¹, James Rowbottom¹, Ferdia Sherry¹, Melanie Weber², Carola-Bibiane Schönlieb¹ (¹University of Cambridge, ²Harvard University; *equal contribution)
Pseudocode | Yes | Algorithm 1: Canonicalization with a global retraction... Algorithm 2: Canonicalization function via Lie algebra descent... Algorithm 3: Canonicalization function via coordinate Lie algebra descent
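Algorithms 2-3 themselves are not reproduced in this summary, but the underlying idea of canonicalization via Lie algebra descent can be sketched for the simplest case, SO(2) acting on 2D point clouds: parametrize the group element through the exponential map of a Lie algebra coordinate θ and descend θ to minimize an energy whose minimizer fixes a canonical orientation. The generator, energy function, step size, and step count below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Illustrative SO(2) example: the group element is parametrized through the
# exponential map of a single Lie algebra coordinate theta.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # generator of so(2)

def rot(theta):
    """exp(theta * A): rotation by theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def energy(X):
    """Illustrative energy: minimized when the centroid points along +x."""
    return -X.mean(axis=0)[0]

def canonicalize(X, steps=200, lr=0.1):
    """Gradient descent on theta to minimize energy(g(theta) . X)."""
    m0 = X.mean(axis=0)              # energy depends on X only through its mean
    theta = 0.0
    for _ in range(steps):
        # dE/dtheta = -(A g(theta) m0)[0], using dg/dtheta = A g
        grad = -(A @ rot(theta) @ m0)[0]
        theta -= lr * grad
    return X @ rot(theta).T, theta
```

Canonicalizing any rotated copy of X then yields (numerically) the same canonical point cloud, which is what lets a pre-trained, non-equivariant downstream model behave invariantly.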
Open Source Code | No | The paper does not explicitly state that the authors' code is open-source, nor does it link to a repository for their implementation. It mentions third-party libraries and models such as DeepXDE (Lu et al., 2021) and POSEIDON (Herde et al., 2024), but not the authors' own code.
Open Datasets | Yes | "We showcase the effectiveness of our method on image classification tasks... on image classification with respect to the groups of affine and homography transformations... on the affine-perturbed MNIST (affNIST) (Gu & Tresp, 2020) and the homography-perturbed MNIST (homNIST) (MacDonald et al., 2022)." ... "For Burgers' equation we take the code and pretrained Physics-Informed DeepONet (PI-DeepONet) model weights provided by Wang et al. (2021b). This includes a model trained on Nf,train = 1000 initial conditions sampled from a Gaussian random field (GRF)..."
Dataset Splits | Yes | "Training data consists of Nf,train initial condition functions, evaluated at Nic = 200 points at t = 0 to evaluate u0, Nbc = 100 points on the boundaries x = 0 or x = 2π to enforce the periodic boundary conditions, and finally Ndom = 500 points in the interior of the domain where the residual physics loss is enforced." ... "Table 2: ACE test error evaluated over 90 trajectories for in-distribution (ID) and out-of-distribution (OOD)."
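The quoted split can be made concrete with a small collocation sampler. A minimal sketch: the uniform sampling distributions and the time horizon T are assumptions, since the excerpt only fixes the point counts and the spatial domain x ∈ [0, 2π]:

```python
import numpy as np

def sample_pinn_points(rng, n_ic=200, n_bc=100, n_dom=500, L=2 * np.pi, T=1.0):
    """Sample the three kinds of PINN collocation points from the quoted split."""
    # initial-condition points (x, 0), where u0 is evaluated
    ic = np.column_stack([rng.uniform(0.0, L, n_ic), np.zeros(n_ic)])
    # periodic boundary points: matched pairs at x = 0 and x = 2*pi
    t_bc = rng.uniform(0.0, T, n_bc)
    bc_left = np.column_stack([np.zeros(n_bc), t_bc])
    bc_right = np.column_stack([np.full(n_bc, L), t_bc])
    # interior points where the residual physics loss is enforced
    dom = np.column_stack([rng.uniform(0.0, L, n_dom),
                           rng.uniform(0.0, T, n_dom)])
    return ic, bc_left, bc_right, dom
```

The boundary points are sampled as matched (left, right) pairs at the same times, since the periodic condition u(0, t) = u(2π, t) compares the two boundaries at equal t.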
Hardware Specification | No | The paper does not specify the hardware used to run the experiments, such as GPU or CPU models.
Software Dependencies | No | The paper mentions using DeepXDE (Lu et al., 2021) and POSEIDON (Herde et al., 2024) but does not specify version numbers for these or other software dependencies.
Experiment Setup | Yes | "The model is then trained using the physics loss with sampling weights of αPINN = 150 and γdata = 20; we use a learning rate of 0.001 and a batch size of 8, and train for 100k epochs following Akhound-Sadegh et al. (2023)."
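For reference, the quoted hyperparameters can be collected into a single config. The excerpt does not spell out how αPINN and γdata enter the objective, so the weighted-sum combination below is an assumption:

```python
# Quoted hyperparameters collected in one place.
HPARAMS = {
    "alpha_pinn": 150.0,   # physics-loss weight (quoted)
    "gamma_data": 20.0,    # data-loss weight (quoted)
    "lr": 1e-3,            # learning rate (quoted)
    "batch_size": 8,       # quoted
    "epochs": 100_000,     # "100k epochs"
}

def total_loss(physics_loss, data_loss, hp=HPARAMS):
    # Assumed combination: a weighted sum of the two loss terms.
    return hp["alpha_pinn"] * physics_loss + hp["gamma_data"] * data_loss
```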