DiffRetouch: Using Diffusion to Retouch on the Shoulder of Experts
Authors: Zheng-Peng Duan, Jiawei Zhang, Zheng Lin, Xin Jin, XunDong Wang, Dongqing Zou, Chun-Le Guo, Chongyi Li
AAAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments demonstrate the performance of the method in terms of visual appeal and sample diversity. Quantitative comparisons are conducted on the MIT-Adobe FiveK dataset with subsets retouched by five experts (A/B/C/D/E). |
| Researcher Affiliation | Collaboration | (1) VCIP, CS, Nankai University; (2) SenseTime Research; (3) BNRist, Department of Computer Science and Technology, Tsinghua University; (4) PBVR; (5) Wuhan University of Technology; (6) NKIARI, Shenzhen Futian |
| Pseudocode | No | The paper describes the methodology in narrative text and with diagrams, but does not include any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Project https://adam-duan.github.io/projects/retouch/ |
| Open Datasets | Yes | Our experiments are conducted on the MIT-Adobe FiveK dataset (Bychkovsky et al. 2011) and the PPR10K dataset (Liang et al. 2021). |
| Dataset Splits | Yes | We follow the pre-processing pipeline in (Song, Qian, and Du 2021; Wang et al. 2019), and split the dataset into 4,500 pairs for training and 500 pairs for validation, which is also known as MIT-Adobe-5K-UPE. For both datasets, we construct the image-condition pairs for each image following the practice introduced above. We divide the PPR10K dataset into a training set with 1,356 groups and 8,875 photos, and a testing set with 325 groups and 2,286 photos. |
| Hardware Specification | No | The computational resources of this work are supported by the Supercomputing Center of Nankai University (NKSC). No specific hardware details such as GPU models, CPU models, or memory specifications are provided. |
| Software Dependencies | No | The paper does not provide specific version numbers for any software components or libraries used in the experiments. |
| Experiment Setup | No | The paper does not provide specific experimental setup details such as hyperparameter values (e.g., learning rate, batch size, number of epochs), optimizer settings, or model initialization details. |
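The split sizes quoted in the Dataset Splits row (4,500 training / 500 validation pairs for MIT-Adobe-5K-UPE; 8,875 training / 2,286 testing photos for PPR10K) can be sanity-checked with a minimal sketch. The index ranges and variable names below are hypothetical illustrations; only the counts come from the paper.

```python
# Hypothetical sketch of the reported split sizes.
# MIT-Adobe FiveK contains 5,000 photos; the UPE protocol uses
# the first 4,500 pairs for training and the remaining 500 for validation.
fivek_ids = list(range(5000))
train_ids = fivek_ids[:4500]
val_ids = fivek_ids[4500:]

# PPR10K split as quoted: 1,356 training groups (8,875 photos)
# and 325 testing groups (2,286 photos).
ppr10k = {"train_photos": 8875, "test_photos": 2286}

assert len(train_ids) == 4500
assert len(val_ids) == 500
assert ppr10k["train_photos"] + ppr10k["test_photos"] == 11161
```

The final assertion simply totals the two PPR10K subsets; the paper itself reports the split only in groups and photos, not a combined count.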