DistillHGNN: A Knowledge Distillation Approach for High-Speed Hypergraph Neural Networks

Authors: Saman Forouzandeh, Parham Moradi Dowlatabadi, Mahdi Jalili

ICLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results on several real-world datasets demonstrate that our method significantly reduces inference time while maintaining accuracy comparable to HGNN, and it achieves higher accuracy than state-of-the-art techniques, like LightHGNN, with a similar inference time."
Researcher Affiliation | Academia | Saman Forouzandeh, Parham Moradi & Mahdi Jalili, School of Engineering, RMIT University, Melbourne, Australia. EMAIL
Pseudocode | Yes | "The algorithm of the proposed method is provided in Algorithm 1 in the Appendix A."
Open Source Code | No | "The baseline experiments and our methods are implemented using PyTorch and the DeepHypergraph library (https://github.com/iMoonLab/DeepHypergraph)." This reference points to a third-party library used, not the authors' specific implementation code for DistillHGNN.
Open Datasets | Yes | "The Cora dataset, introduced by Sen et al. (Sen et al., 2008), and the Citeseer dataset, developed by Giles et al. (Giles et al., 1998), have been transformed into hypergraph datasets, namely CC-Cora and CC-Citeseer, by Yadati et al. (Yadati et al., 2019). We include the complete IMDB dataset and its subset IMDBAW from Fu et al. (Fu et al., 2019)... Additionally, we incorporate the complete DBLP dataset and its three subsets: DBLP-Paper, DBLP-Term, and DBLP-Conf, as introduced by Sun et al. (Sun et al., 2011)."
Dataset Splits | Yes | "For each dataset, 20% of the data was used for validation, and 10% was used for testing."
Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, processor types, or memory amounts used for running its experiments; it only mentions that the methods are implemented using PyTorch.
Software Dependencies | No | "The baseline experiments and our methods are implemented using PyTorch and the DeepHypergraph library." While software names are mentioned, specific version numbers for PyTorch and the DeepHypergraph library are not provided.
Experiment Setup | Yes | "In the context of the proposed DistillHGNN method, several hyperparameters play key roles in controlling the trade-offs between different components, the learning process, and overall performance. Below is a brief explanation of the most important hyperparameters: 1. Temperature for contrastive learning (τ): ... 2. Contrastive loss weight (γ): ... 3. Distillation loss weight (λ): ... 4. Learning rate (lr): ... 5. Embedding dimension: ... Table 6 shows the results of evaluating the DistillHGNN framework based on different configurations of hyperparameters and their corresponding accuracy metrics."
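The Experiment Setup row names three loss-related hyperparameters: a contrastive temperature τ, a contrastive loss weight γ, and a distillation loss weight λ. As a minimal sketch of how such weights are typically combined in a distillation objective, the snippet below assumes the common form L = L_task + λ·L_KD + γ·L_CL and an InfoNCE-style temperature-scaled similarity; the paper's exact formulation is not quoted in the excerpt above, and all function and variable names here are hypothetical.

```python
import math

def contrastive_logit(z_a, z_b, tau):
    # Temperature-scaled cosine similarity between two embeddings,
    # as used in InfoNCE-style contrastive losses (tau = temperature).
    dot = sum(a * b for a, b in zip(z_a, z_b))
    norm = math.sqrt(sum(a * a for a in z_a)) * math.sqrt(sum(b * b for b in z_b))
    return dot / (norm * tau)

def total_loss(task_loss, distill_loss, contrastive_loss, lam=0.5, gamma=0.1):
    # Assumed weighted sum: lam (λ) scales the teacher-distillation term,
    # gamma (γ) scales the contrastive term, against the supervised task loss.
    return task_loss + lam * distill_loss + gamma * contrastive_loss
```

Under this assumed form, a lower τ sharpens the similarity distribution used by the contrastive term, while γ and λ trade off contrastive alignment and teacher imitation against the supervised objective, which matches the trade-off role the excerpt ascribes to these hyperparameters.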