Graph Scattering beyond Wavelet Shackles

Authors: Christian Koke, Gitta Kutyniok

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Theoretical results are complemented by numerical investigations: Suitably chosen scattering networks conforming to the developed theory perform better than traditional graph-wavelet-based scattering approaches in social network graph classification tasks and significantly outperform other graph-based learning approaches to regression of quantum-chemical energies on QM7.
Researcher Affiliation | Academia | Christian Koke (Technical University of Munich & Ludwig Maximilian University Munich, EMAIL); Gitta Kutyniok (Ludwig Maximilian University Munich & University of Tromsø, EMAIL)
Pseudocode | No | The paper describes its generalized scattering transform iteratively but does not include any structured pseudocode or algorithm blocks with formal labels.
Open Source Code | Yes | Yes; please see the supplementary material.
Open Datasets | Yes | To aid visual clarity when comparing results, we colour-code the best-performing method in green, the second-best performing in yellow and the third-best performing method in orange respectively. Social Network Graph Classification: To facilitate contact between our generalized graph scattering networks and the wider literature, we combine a network conforming to our general theory, namely Architecture I in Fig. 2 (as discussed in Section 3, with depth N = 4, identity as connecting operators and |·|-non-linearities), with the low-pass aggregation scheme of Section 5 and a Euclidean support vector machine with RBF kernel (GGSN+EK). The choice N = 4 was made to keep computation time palatable, while aggregation scheme and non-linearities were chosen to facilitate comparison with standard wavelet-scattering approaches. For this hybrid architecture (GGSN+EK), classification accuracies under the standard choice of 10-fold cross validation on five common social network graph datasets are compared with performances of popular graph kernel approaches, leading deep learning methods as well as geometric wavelet scattering (GS-SVM) [12]. More details are provided in Appendix K.
Dataset Splits | Yes | For this hybrid architecture (GGSN+EK), classification accuracies under the standard choice of 10-fold cross validation on five common social network graph datasets are compared with performances of popular graph kernel approaches, leading deep learning methods as well as geometric wavelet scattering (GS-SVM) [12]. More details are provided in Appendix K. ... trained with ten-fold cross validation on node and (depending on the model) edge-level information.
Hardware Specification | Yes | All experiments were run on a single machine with a Ryzen 7 3700X processor, 64 GB of RAM, and a GeForce RTX 2080 Ti GPU.
Software Dependencies | No | The paper mentions the use of certain software components (e.g., in Appendix K, about using code from [12]), but it does not specify any software dependencies with version numbers.
Experiment Setup | Yes | In both cases the utilized shift-operator is L̃ := L/λmax(L), node weights satisfy µi ≡ 1, the branching ratio in each layer is chosen as 4 and the depth is set to N = 4 as well. The connecting operators are set to the identity and non-linearities are set to the modulus (|·|). The two architectures differ in the utilized filters, which are repeated in each layer and depicted in Fig. 2. ... Our normal operator is then chosen as L̃ = L/λmax(L) again. Connecting operators are set to the identity, while non-linearities are fixed to ρn≥1(·) = |·|. Filters are chosen as (sin(π/2 · L̃), cos(π/2 · L̃), sin(π · L̃), cos(π · L̃)), acting through matrix multiplication. Output generating functions are set to the identity and depth is N = 4.
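The experiment setup above (a shift-operator normalized to L/λmax(L), sin/cos spectral filters applied by matrix multiplication, and modulus non-linearities) can be sketched in a few lines. This is a minimal illustration assuming NumPy; the 4-node path graph and the node signal are toy stand-ins, not data from the paper:

```python
import numpy as np

# Toy 4-node path graph: adjacency matrix and combinatorial Laplacian L = D - A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Normalized shift-operator L~ = L / lambda_max(L); its spectrum lies in [0, 1].
lam_max = np.linalg.eigvalsh(L).max()
L_norm = L / lam_max

# Apply a scalar filter g to the operator via functional calculus:
# g(L~) = U g(Lambda) U^T, using the eigendecomposition of the symmetric L~.
def spectral_filter(op, g):
    lam, U = np.linalg.eigh(op)
    return U @ np.diag(g(lam)) @ U.T

# The four filters from the setup: sin/cos of (pi/2) L~ and of pi L~.
filters = [
    spectral_filter(L_norm, lambda x: np.sin(np.pi / 2 * x)),
    spectral_filter(L_norm, lambda x: np.cos(np.pi / 2 * x)),
    spectral_filter(L_norm, lambda x: np.sin(np.pi * x)),
    spectral_filter(L_norm, lambda x: np.cos(np.pi * x)),
]

# One scattering layer: filter the node signal (matrix multiplication),
# then apply the modulus non-linearity |.| entrywise.
x = np.array([1.0, -2.0, 0.5, 3.0])
layer_outputs = [np.abs(F @ x) for F in filters]
print(len(layer_outputs), layer_outputs[0].shape)
```

Note that the first filter pair satisfies sin²(π/2·L̃) + cos²(π/2·L̃) = I, i.e. it forms an exact spectral partition of unity, which is in the spirit of the energy-preservation conditions the paper's theory places on filter banks.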
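The GGSN+EK evaluation pipeline described in the "Open Datasets" row (aggregated scattering features fed into a Euclidean SVM with RBF kernel, scored by 10-fold cross validation) can be sketched as follows, assuming scikit-learn. The feature matrix and labels here are random placeholders standing in for the paper's low-pass-aggregated scattering features and graph-class labels:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder inputs: one row of (hypothetical) aggregated scattering
# features per graph, with binary class labels.
X = rng.normal(size=(100, 16))
y = np.array([0, 1] * 50)

# Euclidean SVM with RBF kernel, evaluated under 10-fold cross validation,
# mirroring the GGSN+EK evaluation protocol.
clf = SVC(kernel="rbf")
scores = cross_val_score(clf, X, y, cv=10)
print(scores.mean())
```

On random features the mean accuracy is naturally near chance; with real scattering features the same protocol produces the classification accuracies the paper reports.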