Graph Pooling via Ricci Flow
Authors: Amy Feng, Melanie Weber
TMLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section we present experiments to demonstrate the advantage of our proposed pooling layer ORC-Pool. We test our hypothesis that encoding local and global geometric information into the pooling layers can increase the accuracy of the GNN in downstream tasks. Experimental setup. We implement a simple GNN architecture, consisting of blocks of GCN base layers, followed by a pooling layer. ... Node Clustering. We compare the performance of ORC-Pool and other pooling layers for node clustering, where we evaluate the Normalized Mutual Information (short NMI, defined in sec. D.2) of the cluster assignments computed by the GNN, as well as the average runtime per epoch. ... Graph classification. We further compare the performance of ORC-Pool with that of other pooling layers for graph classification, where we report the accuracy of label assignments. |
| Researcher Affiliation | Academia | Amy Feng, EMAIL, Harvard College; Melanie Weber, EMAIL, Harvard University |
| Pseudocode | No | The paper describes methods using mathematical equations and prose but does not include any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. It mentions: "The authors thank Yu Tian for sharing code." which refers to code from another researcher, and lists licenses for third-party libraries like PyTorch Geometric and PyTorch in the appendix, but no specific repository or release statement for the authors' own implementation of ORC-Pool. |
| Open Datasets | Yes | We utilize the popular benchmarks Planetoid (64) for node clustering, and TUDataset (37) and LRGBDataset (17) for graph classification. ... For node clustering, we use OGBN-ARXIV, and for graph classification, we use OGBG-MOLHIV. |
| Dataset Splits | Yes | Table 2: Graph Classification. Average classification accuracy for ORC-Pool in comparison with state of the art pooling layers, averaged over 10 trials, using a 80/10/10 train/val/test split. |
| Hardware Specification | Yes | All experiments are run on a NVIDIA A100 GPU with one CPU. |
| Software Dependencies | No | Experiments are performed using PyTorch Geometric. ... G.3 Licenses: PyTorch Geometric (21), PyTorch (44). |
| Experiment Setup | Yes | Hyperparameters. Parameters for the experiments are as follows: For Planetoid, one GCN layer with output dimension 8 and an ELU activation function is used to embed the graph. The optimizer is an Adam optimizer with a learning rate of 1e-2. The models are trained for at most 10000 epochs, or until the best NMI is found under a patience constraint. ... For the Planetoid graphs, patience is set to 100, and for Amazon-ratings (from Heterophilous), patience is set to 250. When applying ORC-Pool to the Planetoid graphs, four Ricci flow iterations are used, and for Amazon-ratings, two Ricci flow iterations are used. |
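The node-clustering rows above score cluster assignments by Normalized Mutual Information (NMI). As a reference for what that metric computes, here is a minimal plain-Python sketch using arithmetic-mean normalization; the paper defines its exact variant in sec. D.2, which may differ, and the function name is ours:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a partition, in nats.
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def mutual_info(a, b):
    # Mutual information between two partitions of the same items.
    n = len(a)
    joint, ca, cb = Counter(zip(a, b)), Counter(a), Counter(b)
    return sum(
        (c / n) * math.log((c / n) * n * n / (ca[x] * cb[y]))
        for (x, y), c in joint.items()
    )

def nmi(a, b):
    # NMI with arithmetic-mean normalization (one common convention).
    h_a, h_b = entropy(a), entropy(b)
    if h_a == 0 and h_b == 0:
        return 1.0
    return 2 * mutual_info(a, b) / (h_a + h_b)
```

NMI is invariant to label permutation: a clustering that matches the ground truth up to renaming scores 1.0, and an independent clustering scores 0.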
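Table 2 of the paper reports graph classification under an 80/10/10 train/val/test split averaged over 10 trials. A generic way to produce such a split (the helper name and seeding scheme are our own, not from the paper) is:

```python
import random

def split_indices(n, seed=0, frac=(0.8, 0.1, 0.1)):
    # Shuffle indices reproducibly, then cut into train/val/test
    # partitions of the requested fractions (80/10/10 by default).
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    n_train, n_val = int(frac[0] * n), int(frac[1] * n)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]
```

Running 10 trials with seeds 0..9 and averaging the resulting accuracies would mirror the reported protocol.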
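The Experiment Setup row describes training for at most 10000 epochs "or until the best NMI is found under a patience constraint" (patience 100 for Planetoid, 250 for Amazon-ratings). The patience logic itself can be sketched generically; this is a standard early-stopping loop, not the authors' code, and here the per-epoch metric values are passed in as a list for illustration:

```python
def train_with_patience(scores, patience=100):
    # Track the best validation metric (the paper uses NMI) and stop
    # once `patience` epochs pass without improvement.
    best, best_epoch = float("-inf"), 0
    for epoch, score in enumerate(scores):
        if score > best:
            best, best_epoch = score, epoch
        elif epoch - best_epoch >= patience:
            break
    return best, best_epoch
```

In the actual setup each `score` would come from evaluating the GNN (one GCN layer, output dimension 8, ELU activation, Adam with learning rate 1e-2) after that epoch.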