Pre-training Auto-regressive Robotic Models with 4D Representations
Authors: Dantong Niu, Yuvan Sharma, Haoru Xue, Giscard Biamby, Junyi Zhang, Ziteng Ji, Trevor Darrell, Roei Herzig
ICML 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments show that ARM4R can transfer efficiently from human video data to robotics and consistently improves performance on tasks across various robot environments and configurations. |
| Researcher Affiliation | Academia | 1Berkeley AI Research, UC Berkeley. Correspondence to: Dantong Niu <bias EMAIL>. |
| Pseudocode | No | The paper describes the methodology in prose and through diagrams (Figure 2), but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper provides a project website (https://arm4r.github.io/) but does not include an unambiguous statement that the source code for the methodology described in this paper is publicly available, nor does it provide a direct link to a code repository. |
| Open Datasets | Yes | Specifically, we train our model on 76K videos from the Epic-Kitchens100 dataset (Damen et al., 2018)... We evaluate ARM4R on 12 RLBench tasks... Large-scale datasets such as Open X (Collaboration et al., 2023)... |
| Dataset Splits | Yes | We train ARM4R for each task using 190 successful demos for every variation of the task (for more details, see Appendix D), and evaluate using 25 episodes per task in the validation set. |
| Hardware Specification | Yes | Finally, we use 4 NVIDIA A6000 GPUs for training and a single NVIDIA A6000 GPU for evaluation. |
| Software Dependencies | No | ARM4R is implemented using PyTorch (Paszke et al., 2019). |
| Experiment Setup | Yes | We used the following hyperparameters for the three stages of training (Table 7): Learning Rate 5×10⁻⁴ (Stages 1 and 2), 5×10⁻³ (Stage 3); Weight Decay 1×10⁻² (all stages); Batch Size 256 (all stages); Number of Epochs 5 (Stage 1), 20 (Stage 2), 10–50 (Stage 3). |
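For reference, the Table 7 hyperparameters quoted above can be captured as a small config dictionary. This is only an illustrative sketch: the stage names and key names are assumptions, since the paper does not release code.

```python
# Sketch of the three-stage training hyperparameters from Table 7.
# Stage/key names are hypothetical; values are taken from the paper.
STAGES = {
    "stage1": {"lr": 5e-4, "weight_decay": 1e-2, "batch_size": 256, "epochs": 5},
    "stage2": {"lr": 5e-4, "weight_decay": 1e-2, "batch_size": 256, "epochs": 20},
    # Stage 3 epochs vary by task, reported as a 10-50 range.
    "stage3": {"lr": 5e-3, "weight_decay": 1e-2, "batch_size": 256, "epochs": (10, 50)},
}

if __name__ == "__main__":
    for name, cfg in STAGES.items():
        print(name, cfg)
```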