Learning Latent Graph Structures and their Uncertainty

Authors: Alessandro Manenti, Daniele Zambon, Cesare Alippi

ICML 2025

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "Empirical results validate our theoretical claims and demonstrate the effectiveness of the proposed approach. ... 6. Experiments" |
| Researcher Affiliation | Academia | "1Università della Svizzera italiana, IDSIA, Lugano, Switzerland 2Politecnico di Milano, Milan, Italy." |
| Pseudocode | No | The paper describes methods and algorithms but does not provide any explicitly labeled pseudocode or algorithm blocks with structured formatting. |
| Open Source Code | Yes | "Code available at https://github.com/allemanenti/Learning-Calibrated-Structures" |
| Open Datasets | Yes | "To demonstrate that our method learns meaningful graph distributions in real-world settings, we train a neural network on air quality data in Beijing (Zheng et al., 2013)." |
| Dataset Splits | Yes | "We result in a dataset of 35k input-output pairs (x, y), 80% of which are used as training set, 10% as validation set, and the remaining 10% as test set." |
| Hardware Specification | Yes | "The paper's experiments were run on a workstation with AMD EPYC 7513 processors and NVIDIA RTX A5000 GPUs; on average, a single model training terminates in a few minutes with a memory usage of about 1GB." |
| Software Dependencies | No | "The developed code relies on PyTorch (Paszke et al., 2019) and the following additional open-source libraries: PyTorch Geometric (Fey & Lenssen, 2019), NumPy (Harris et al., 2020) and Matplotlib (Hunter, 2007)." While these libraries are mentioned, specific version numbers (e.g., PyTorch 1.9) are not provided. |
| Experiment Setup | Yes | "The model is trained using Adam optimizer (Kingma & Ba, 2014) with parameters β1 = 0.9, β2 = 0.99. Where not specified, the learning rate is set to 0.05 and decreased to 0.01 after 5 epochs. We grouped data points into batches of size 128. Initial values of θ are independently sampled from the U(0.0, 0.1) uniform distribution." |
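The reported setup (an 80/10/10 split of 35k pairs, learning rate 0.05 dropped to 0.01 after 5 epochs, batch size 128, θ initialized from U(0.0, 0.1)) can be sketched in framework-agnostic Python. This is a minimal illustration of the quoted hyperparameters, not the authors' code; all function names here are hypothetical.

```python
import random

def split_indices(n, train_frac=0.8, val_frac=0.1, seed=0):
    """Shuffle indices and split 80% / 10% / 10%, as described in the paper."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    n_train = int(train_frac * n)
    n_val = int(val_frac * n)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

def learning_rate(epoch, lr_init=0.05, lr_late=0.01, switch_epoch=5):
    """Learning rate 0.05, decreased to 0.01 after 5 epochs."""
    return lr_init if epoch < switch_epoch else lr_late

def init_theta(dim, low=0.0, high=0.1, seed=0):
    """Sample initial values of theta independently from U(0.0, 0.1)."""
    rng = random.Random(seed)
    return [rng.uniform(low, high) for _ in range(dim)]

train_idx, val_idx, test_idx = split_indices(35_000)
print(len(train_idx), len(val_idx), len(test_idx))  # 28000 3500 3500
print(learning_rate(2), learning_rate(7))           # 0.05 0.01
```

In an actual PyTorch run, `learning_rate` would correspond to a scheduler on `torch.optim.Adam(..., betas=(0.9, 0.99))` and the batches of size 128 to a `DataLoader`; those pieces are omitted here to keep the sketch dependency-free.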