TMetaNet: Topological Meta-Learning Framework for Dynamic Link Prediction

Authors: Hao Li, Hao Wan, Yuzhou Chen, Dongsheng Ye, Yulia Gel, Hao Jiang

ICML 2025

Reproducibility

Variable | Result | LLM Response
Research Type | Experimental | "Experiments on real-world datasets demonstrate TMetaNet's state-of-the-art performance and resilience to graph noise, illustrating its high potential for meta-learning and dynamic graph analysis." (Section 6: Experiments)
Researcher Affiliation | Academia | (1) Wuhan University, Wuhan, China; (2) University of California, Riverside, USA; (3) Hubei University of Automotive Technology, Shiyan, China; (4) Virginia Tech, Blacksburg, USA. Correspondence to: Hao Jiang <EMAIL>.
Pseudocode | Yes | Algorithm 1 (Construct ε-net for Snapshots) and Algorithm 2 (Construct ε-nets for Discrete-Time Dynamic Graphs) in Appendix C.1.
Open Source Code | Yes | "Our code is available at https://github.com/Lihaogx/TMetaNet."
Open Datasets | Yes | "We conduct experiments on six public datasets, which are widely used benchmarks for evaluating dynamic link prediction: (1) Bitcoin-OTC (OTC) and Bitcoin-Alpha (Alpha) are trust networks from transactions on different Bitcoin platforms (Kumar et al., 2018b; 2016). (2) The Reddit-Body (Body) and Reddit-Title (Title) datasets come from the REDDIT platform, representing hyperlink networks in post bodies and titles, respectively (Kumar et al., 2018a). (3) UCI-Message (UCI) consists of private messages between users (Panzarasa et al., 2009). (4) ETH-Yocoin is derived from the Yocoin transaction network on Ethereum blocks (Li et al., 2020). Detailed dataset information can be found in Appendix D.1."
Dataset Splits | Yes | "ROLAND's live-update training strategy divides each snapshot into training, validation, and test sets, and evaluates the model on each time slice. ... WinGNN, on the other hand, arranges snapshots in chronological order, with the first 70% as the training set and the last 30% as the test set. Appendix C.3 provides an illustration of the different task-splitting strategies."
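The WinGNN-style split described above can be sketched in a few lines. This is a minimal illustration, assuming snapshots are stored as a chronologically ordered list; the function name and the 70/30 default are taken from the description, not from the authors' code.

```python
def chronological_split(snapshots, train_frac=0.7):
    """Split an ordered list of graph snapshots by time:
    the first `train_frac` fraction becomes the training set,
    the remainder the test set (hypothetical helper)."""
    cut = int(len(snapshots) * train_frac)
    return snapshots[:cut], snapshots[cut:]

# With 10 dummy snapshot IDs, 7 go to training and 3 to testing.
train, test = chronological_split(list(range(10)))
```

Because the split is purely positional, no test-period information leaks into training, which is the point of chronological ordering for dynamic link prediction.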
Hardware Specification | Yes | "All experiments in this paper were conducted on a Linux server with 4 NVIDIA A6000 GPUs."
Software Dependencies | No | The paper mentions common machine-learning frameworks and techniques (e.g., GNNs, meta-learning) but does not provide version numbers for any software components or libraries.
Experiment Setup | Yes | "For TMetaNet, the core parameters include the following: parameters for computing the Dowker persistence diagram, where we choose ε = 1 and δ = 1, with the window size set to full; parameters for computing the zigzag persistence image, where we set the image size to 50; and parameters for the TMetaNet meta-learning model, mainly the parameter-update model's learning rate (meta_lr) and dropout rate. We use grid search to select the optimal parameter combination for each dataset."
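The grid search over meta_lr and dropout can be sketched as below. The candidate values and the scoring function are assumptions for illustration; the paper states that grid search was used but does not list the grid.

```python
from itertools import product

# Hypothetical search grid; the paper does not publish candidate values.
grid = {"meta_lr": [1e-3, 1e-2, 1e-1], "dropout": [0.0, 0.3, 0.5]}

def grid_search(evaluate, grid):
    """Exhaustively evaluate every parameter combination in `grid`
    and return the one with the highest score."""
    best_params, best_score = None, float("-inf")
    for values in product(*grid.values()):
        params = dict(zip(grid.keys(), values))
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy scoring function standing in for validation performance
# (e.g., link-prediction AUC) on one dataset:
best, score = grid_search(
    lambda p: -abs(p["meta_lr"] - 1e-2) - p["dropout"], grid
)
```

Since the grid here has only 3 × 3 = 9 combinations, exhaustive search is cheap; per-dataset tuning, as the paper describes, simply repeats this loop with each dataset's validation metric.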