Disentangled Contrastive Bundle Recommendation with Conditional Diffusion

Authors: Jiuqiang Li

AAAI 2025

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | Our experimental evaluations on three benchmark datasets reveal that DCBR significantly outperforms state-of-the-art methods. |
| Researcher Affiliation | Academia | (1) School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu, China; (2) Engineering Research Center of Sustainable Urban Intelligent Transportation, Ministry of Education, China |
| Pseudocode | No | The paper describes the methodology using textual descriptions and mathematical formulations, but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code: https://github.com/recomall/DCBR |
| Open Datasets | Yes | The datasets used in the evaluation are MealRec+H and MealRec+L (Li et al. 2024) for the meal recommendation scenario, and iFashion (Chen et al. 2019b) for the fashion outfit recommendation scenario. |
| Dataset Splits | Yes | The data partitioning follows previous work (Ma et al. 2022; Li et al. 2024). |
| Hardware Specification | Yes | Evaluated on an NVIDIA RTX 3090 GPU with 24 GB of memory. |
| Software Dependencies | No | Implemented using PyTorch (Paszke et al. 2019); the paper mentions PyTorch but does not specify a version number. |
| Experiment Setup | Yes | Optimized with the Adam optimizer (Kingma and Ba 2015) at a learning rate of 1e-3 and evaluated on an NVIDIA RTX 3090 GPU with 24 GB of memory. All models use Xavier initialization (Glorot and Bengio 2010) for their embeddings, with the embedding size fixed at 64 and the mini-batch size set to 2048. The number of negative samples and the test interval are fixed at 1 and 5, respectively. For DCBR, the number of graph propagation layers L is fixed at 2, λ2 is selected from {1e-5, 1e-6, 1e-7}, and the hyperparameters υ_x, ξ_x^(l), ω, τ, and γ_i ∈ [0, 1] are optimized via grid search. λ0 and λ1 are tuned over the ranges {1e0, 1e1, 1e2, 1e3, 1e4} and {0.01, 0.02, 0.03, 0.04, 0.05, 0.2, 0.3, 0.4}, respectively. |
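The reported experiment setup implies a concrete hyperparameter grid for λ0, λ1, and λ2. As a minimal sketch of what that grid search space looks like (the dictionary keys `lambda0`/`lambda1`/`lambda2` are illustrative names, not identifiers from the authors' code), the candidate configurations can be enumerated as:

```python
from itertools import product

# Search ranges as reported in the paper's experiment setup.
# Key names are hypothetical; only the value sets come from the paper.
grid = {
    "lambda0": [1e0, 1e1, 1e2, 1e3, 1e4],
    "lambda1": [0.01, 0.02, 0.03, 0.04, 0.05, 0.2, 0.3, 0.4],
    "lambda2": [1e-5, 1e-6, 1e-7],
}

# Cartesian product over all ranges -> one dict per candidate configuration.
configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(len(configs))  # 5 * 8 * 3 = 120 combinations
```

This enumerates 120 candidate settings; in practice each would be trained and scored on the validation split, with υ_x, ξ_x^(l), ω, τ, and γ_i searched analogously over [0, 1].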