AdaO2B: Adaptive Online to Batch Conversion for Out-of-Distribution Generalization
Authors: Xiao Zhang, Sunhao Dai, Jun Xu, Yong Liu, Zhenhua Dong
AAAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results have demonstrated that AdaO2B significantly outperforms state-of-the-art baselines on both synthetic and real-world recommendation datasets. |
| Researcher Affiliation | Collaboration | Xiao Zhang¹, Sunhao Dai¹, Jun Xu¹*, Yong Liu¹, Zhenhua Dong² — ¹Gaoling School of Artificial Intelligence, Renmin University of China, Beijing, China; ²Huawei Noah's Ark Lab, Shenzhen, China |
| Pseudocode | Yes | Algorithm 1: AdaO2B |
| Open Source Code | No | The paper does not provide an explicit statement or link for open-source code for the methodology described. It mentions a dataset website. |
| Open Datasets | Yes | We used the KuaiRec dataset (https://kuairec.com), which provides a fully observed user-item interaction matrix from the popular video-sharing app Kuaishou. |
| Dataset Splits | No | The paper states only that "for both synthetic and real-world datasets, we split them into two subsets for the online learning phase (as well as the O2B conversion phase) and the batch testing phase, respectively, denoted by OL-Data and BT-Data," without specifying split sizes or ratios. |
| Hardware Specification | No | The paper does not explicitly describe the hardware used for its experiments, such as specific GPU/CPU models or memory details. |
| Software Dependencies | No | The paper mentions 'Adam (Kingma and Ba 2014) is used to conduct the optimization' but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | We trained AdaO2B based on the last 10 (i.e., K = 10) data buffers and history policies. We tuned the hyper-parameters as follows: the learning rate was tuned within the range of {1e-2, 1e-3, 1e-4, 1e-5}, the weight decay was tuned among {1e-3, 1e-4, 1e-5, 1e-6}, and the batch size was tuned in {256, 512, 1024, 2048}. |
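The Experiment Setup row fully specifies the tuning grid, which makes the search space easy to enumerate. A minimal sketch, assuming a simple exhaustive grid (the paper does not state the search strategy; the `make_grid` helper name is illustrative, not from the paper):

```python
# Hypothetical enumeration of the hyper-parameter grid reported for AdaO2B.
# Values are taken verbatim from the Experiment Setup row; the function
# name and dict layout are assumptions for illustration only.
from itertools import product

LEARNING_RATES = [1e-2, 1e-3, 1e-4, 1e-5]
WEIGHT_DECAYS = [1e-3, 1e-4, 1e-5, 1e-6]
BATCH_SIZES = [256, 512, 1024, 2048]
K = 10  # number of recent data buffers / history policies used by AdaO2B

def make_grid():
    """Enumerate every (lr, weight_decay, batch_size) combination in the grid."""
    return [
        {"lr": lr, "weight_decay": wd, "batch_size": bs}
        for lr, wd, bs in product(LEARNING_RATES, WEIGHT_DECAYS, BATCH_SIZES)
    ]

grid = make_grid()
print(len(grid))  # 4 * 4 * 4 = 64 configurations
```

Exhaustive search over this grid costs 64 training runs per dataset, which is small enough that no smarter search strategy would be needed.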