Bridge Diffusion Model: Bridge Chinese Text-to-Image Diffusion Model with English Communities
Authors: Shanyuan Liu, Bo Cheng, Yuhang Ma, Liebucha Wu, Ao Ma, Xiaoyu Wu, Dawei Leng, Yuhui Yin
AAAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | This section offers an overview of the experimental setup and showcases the effectiveness of BDM through both qualitative and quantitative demonstrations. Subsections: Quantitative Evaluation; Human Evaluation on BDM-870; Evaluation on COCO; Chinese cultural inclination; Training data scale; Qualitative Results; Ablation Study. |
| Researcher Affiliation | Industry | Shanyuan Liu, Bo Cheng, Yuhang Ma, Liebucha Wu, Ao Ma, Xiaoyu Wu, Dawei Leng*, Yuhui Yin (360 AI Research) |
| Pseudocode | No | The paper describes the methodology using textual explanations and mathematical equations, but it does not include any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code: https://github.com/360CVGroup/Bridge_Diffusion_Model |
| Open Datasets | Yes | commonly used LAION dataset (Schuhmann et al. 2022). Evaluation on COCO. We collected 870 diverse Chinese prompts from real users as a benchmark for human evaluation of the BDM model, and named it BDM-870. ... more detailed data content is available at https://github.com/360CVGroup/Bridge_Diffusion_Model |
| Dataset Splits | Yes | We randomly selected 30,000 images from the validation set for assessment and translated the English captions into Chinese automatically. (A sampling sketch follows the table.) |
| Hardware Specification | Yes | The training process spans two months on 80 NVIDIA A800 GPUs. |
| Software Dependencies | No | The entire model is built using PyTorch and we use the AdamW (Loshchilov and Hutter 2019) optimizer for training. Specific version numbers for PyTorch or AdamW are not provided. |
| Experiment Setup | Yes | we use the AdamW (Loshchilov and Hutter 2019) optimizer for training, setting a learning rate of 1e-5 and a batch size of 3200. (See the optimizer sketch below.) |
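
The Dataset Splits row describes sampling 30,000 COCO validation images and machine-translating their English captions. A minimal sketch of that selection step, assuming a COCO-style captions JSON; `translate_to_chinese` is a hypothetical stand-in for whatever automatic translation service the authors used:

```python
import json
import random

def translate_to_chinese(caption: str) -> str:
    """Hypothetical placeholder: plug in an MT service here."""
    raise NotImplementedError

def sample_coco_subset(captions_path: str, n: int = 30_000, seed: int = 0) -> dict:
    """Randomly pick n validation images and translate one caption each."""
    with open(captions_path) as f:
        coco = json.load(f)

    rng = random.Random(seed)
    chosen = set(rng.sample([img["id"] for img in coco["images"]], n))

    # Keep the first caption encountered for each selected image.
    subset = {}
    for ann in coco["annotations"]:
        if ann["image_id"] in chosen and ann["image_id"] not in subset:
            subset[ann["image_id"]] = translate_to_chinese(ann["caption"])
    return subset
```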
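
The Experiment Setup row fixes the optimizer configuration reported in the paper. A minimal PyTorch sketch under those settings, with a stand-in `torch.nn.Linear` module (the BDM architecture itself is not reproduced here); the reported batch size of 3200 is the effective global size, which across the 80 A800 GPUs mentioned above works out to 40 samples per device:

```python
import torch
from torch.optim import AdamW

# Stand-in module; the actual BDM backbone is not reproduced here.
model = torch.nn.Linear(1024, 1024)

# Reported configuration: AdamW with learning rate 1e-5.
optimizer = AdamW(model.parameters(), lr=1e-5)

# One per-device micro-batch (3200 global / 80 GPUs = 40), for illustration.
inputs = torch.randn(40, 1024)
targets = torch.randn(40, 1024)

# Single illustrative optimization step with a placeholder loss.
optimizer.zero_grad()
loss = torch.nn.functional.mse_loss(model(inputs), targets)
loss.backward()
optimizer.step()
```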