DeepSN: A Sheaf Neural Framework for Influence Maximization

Authors: Asela Hevapathige, Qing Wang, Ahad N. Zehmakan

AAAI 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we conduct extensive experiments on both synthetic and real-world datasets to demonstrate the effectiveness of our framework."
Researcher Affiliation | Academia | Graph Research Lab, School of Computing, Australian National University
Pseudocode | No | The paper describes the proposed framework and its components using mathematical formulations (e.g., Equations 1-11) but does not include a clearly labeled pseudocode or algorithm block.
Open Source Code | Yes | Code: https://github.com/Aselahp/DeepSN
Open Datasets | Yes | "We evaluate DeepSN against other methods using a diverse set of datasets, including five real-world datasets (Jazz (Rossi and Ahmed 2015), Network Science (Rossi and Ahmed 2015), Cora-ML (McCallum et al. 2000), Power Grid (Rossi and Ahmed 2015), and Digg (Lerman and Galstyan 2008)) and one synthetic dataset (Random (Ling et al. 2023))."
Dataset Splits | No | The paper mentions 'various budget constraints {1%, 5%, 10%, 20%}', but these refer to the seed set size, not to train/validation/test splits. It also states 'Additional details on dataset statistics, experimental setups, and model hyperparameters are in the appendix,' implying that specific dataset split information is not provided in the main text.
Hardware Specification | No | The paper does not provide specific hardware details, such as GPU models, CPU types, or memory capacity, used to run the experiments.
Software Dependencies | No | The paper does not provide specific software dependency details, such as library names with version numbers.
Experiment Setup | No | The paper states 'Additional details on dataset statistics, experimental setups, and model hyperparameters are in the appendix,' indicating that specific hyperparameter values and training configurations are not included in the main text.
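For reference, the budget constraints quoted in the Dataset Splits row define the seed set size as a fraction of the graph's node count. The following is a minimal sketch of that arithmetic only; the node count used here is a hypothetical placeholder and is not taken from the paper.

```python
# Sketch: mapping the quoted budget constraints {1%, 5%, 10%, 20%}
# to concrete seed set sizes for a graph of a given size.
budgets = [0.01, 0.05, 0.10, 0.20]  # seed set budgets as fractions of nodes (from the paper's quote)
num_nodes = 1000                    # hypothetical node count, placeholder only

for b in budgets:
    seed_set_size = max(1, int(b * num_nodes))  # at least one seed node
    print(f"budget {b:.0%}: seed set size = {seed_set_size}")
```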