ABNet: Mitigating Sample Imbalance in Anomaly Detection Within Dynamic Graphs

Authors: Yifan Hong, Muhammad Asif Ali, Huan Wang, Junyang Chen, Di Wang

IJCAI 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on three real-world datasets demonstrate that ABNet consistently outperforms existing methods and effectively addresses the sample imbalance problem.
Researcher Affiliation | Academia | (1) Key Laboratory of Smart Farming for Agricultural Animals, Engineering Research Center of Intelligent Technology for Agriculture, Ministry of Education, and College of Informatics, Huazhong Agricultural University; (2) Division of CEMSE, King Abdullah University of Science and Technology; (3) College of Computer Science and Software Engineering, Shenzhen University
Pseudocode | Yes | Algorithm 1: ABNet training process-flow.
Open Source Code | Yes | The code is available at an anonymous repository: https://anonymous.4open.science/r/ABNet sample-F626
Open Datasets | Yes | We evaluate our approach on three real-world dynamic graph datasets [Kumar et al., 2019], summarized in Table 2. Wikipedia: captures user edit events, with labels indicating if a user was blocked. Reddit: captures user activity in subreddits, labeled by whether a user was banned. Mooc: captures student interactions on online learning platforms, labeled by course dropout status.
Dataset Splits | Yes | We split each dataset into five temporal segments; the last segment is used for testing (75% for test, remainder for training/validation).
Hardware Specification | Yes | Experiments run on an Intel Xeon Gold 6132 CPU, 64GB RAM, and an NVIDIA A100 GPU, with results averaged over 20 runs.
Software Dependencies | No | The paper specifies no versions for its software dependencies, whether the programming language (e.g., Python), libraries (e.g., PyTorch, TensorFlow), or other tools; it only mentions the 'Adam optimizer' and cites a paper for it.
Experiment Setup | Yes | Node features have dimension k = 128, with a batch size of 100. The anomaly augmenter uses a 5-layer autoencoder, and training employs an improved Adam optimizer [Bock and Weiß, 2019].
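The dataset-split row above describes a chronological split into five segments with the last segment held out. The report's wording is ambiguous about the exact test fraction (it also mentions 75%), so the following is only a minimal sketch of the five-segment reading, not the authors' code; `temporal_split` and its arguments are hypothetical names.

```python
# Hypothetical sketch of the temporal split described in the report
# (not the authors' implementation). Events are assumed to be sorted
# by timestamp; the first four segments go to training/validation and
# the final segment is held out for testing.

def temporal_split(events, n_segments=5):
    """Split a time-ordered event list into (train_val, test).

    The last of `n_segments` equal-sized chronological segments is
    reserved for testing.
    """
    seg_len = len(events) // n_segments
    cut = seg_len * (n_segments - 1)
    return events[:cut], events[cut:]

# 100 time-ordered dummy events: the first 80 become train/val,
# the last 20 become the test segment.
train_val, test = temporal_split(list(range(100)))
```

Under this reading the test segment is the most recent 20% of events; the 75% figure in the report would instead correspond to a different cut point, which the excerpt does not make clear.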
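To make the reported configuration concrete, here is a hedged NumPy sketch of the stated setup: 128-dimensional node features, a batch size of 100, and a 5-layer autoencoder as the anomaly augmenter. Only the depth and input dimension come from the report; the hidden-layer widths, activations, and random weights are illustrative assumptions, and this is not the ABNet implementation.

```python
import numpy as np

# Illustrative sketch of the reported setup (not the authors' code):
# k = 128 node-feature dimensions, batch size 100, and a 5-layer
# autoencoder. Hidden widths below are assumed; the report states
# only the depth and the input dimension.

rng = np.random.default_rng(0)

# Six widths define five weight layers: encoder -> bottleneck -> decoder.
dims = [128, 64, 32, 32, 64, 128]

# Random weights stand in for trained parameters.
weights = [rng.standard_normal((a, b)) * 0.1
           for a, b in zip(dims[:-1], dims[1:])]

def autoencode(x):
    """Forward pass: tanh on hidden layers, linear output layer."""
    h = x
    for w in weights[:-1]:
        h = np.tanh(h @ w)
    return h @ weights[-1]

batch = rng.standard_normal((100, 128))           # batch size 100, k = 128
recon = autoencode(batch)                         # reconstructed features
recon_error = np.mean((batch - recon) ** 2, axis=1)  # per-sample score
```

In autoencoder-based anomaly detection, a high reconstruction error typically flags a sample as anomalous; how ABNet's augmenter actually uses the autoencoder's output is described in the paper itself.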