Continuous Semi-Implicit Models

Authors: Longlin Yu, Jiajun Zha, Tong Yang, Tianyu Xie, Xiangyu Zhang, S.-H. Chan, Cheng Zhang

ICML 2025

Reproducibility Assessment (Variable / Result / LLM Response)
Research Type: Experimental
    LLM Response: "Extensive experiments on image generation demonstrate that CoSIM performs on par with or better than existing diffusion model acceleration methods, achieving superior performance on FD-DINOv2." ... 4. Experiments ... Table 1. Unconditional generation quality on CIFAR-10 (32×32). ... Table 3. Conditional generation quality on ImageNet (64×64).
Researcher Affiliation: Collaboration
    LLM Response: "1 School of Mathematical Sciences, Peking University, Beijing, China; 2 Department of Computer Science and Engineering, Hong Kong University of Science and Technology; 3 School of Computer Science, Fudan University, Shanghai, China; 4 Megvii Technology Inc., Beijing, China; 5 Center for Statistical Science, Peking University, Beijing, China. Correspondence to: Cheng Zhang <EMAIL>."
Pseudocode: Yes
    LLM Response: "Algorithm 1 Inference of Continuous Semi-Implicit Models ... Algorithm 2 Training procedure of Continuous Semi-Implicit Models (CoSIM)"
Open Source Code: Yes
    LLM Response: "The implementation of CoSIM is available at https://github.com/longinYu/CoSIM."
Open Datasets: Yes
    LLM Response: "Both FID and FD-DINOv2 are evaluated on 50K generated images and the whole training set, which means 50K images from the CIFAR-10 (Krizhevsky & Hinton, 2009) training split and 1,281,167 images from ImageNet (Deng et al., 2009)."
Dataset Splits: Yes
    LLM Response: "Both FID and FD-DINOv2 are evaluated on 50K generated images and the whole training set, which means 50K images from the CIFAR-10 (Krizhevsky & Hinton, 2009) training split and 1,281,167 images from ImageNet (Deng et al., 2009)." ... Table 1. Unconditional generation quality on CIFAR-10 (32×32). Results with asterisks (*) are tested by ourselves with the official codes and checkpoints. ... CIFAR Test Split 3.15 31.07
Hardware Specification: Yes
    LLM Response: "We conducted all of our experiments using 8 NVIDIA L40S GPUs with 48GB of video memory."
Software Dependencies: No
    LLM Response: "The paper does not provide specific software names with version numbers for dependencies."
Experiment Setup: Yes
    LLM Response: "The hyperparameters for all of our experiments are presented in Table 5." ... Table 5. Hyperparameters for the five experimental setups (one value per setup):
        Batch size: 2048 / 2048 / 2048 / 2048 / 2048
        Batch per GPU: 64 / 16 / 32 / 32 / 32
        Gradient accumulation rounds: 4 / 16 / 8 / 8 / 8
        # of GPUs (L40S 48G): 8 / 8 / 8 / 8 / 8
        Learning rate of Gϕ and fψ: 1e-5 / 4e-6 / 2e-5 / 2e-5 / 1e-4
        # of EMA half-life images: 0.5M / 2M / 2M / 2M / 2M
        Optimizer Adam eps: 1e-8 / 1e-12 / 1e-12 / 1e-12 / 1e-12
        Optimizer Adam β1: 0.0 / 0.0 / 0.0 / 0.0 / 0.0
        Optimizer Adam β2: 0.999 / 0.99 / 0.99 / 0.99 / 0.99
        R: 4 / 8 / 8 / 4 / 4
        # of total training images: 200M / 200M / 200M / 200M / 20M
        # of parameters: 56M / 296M / 280M / 498M / 778M
        Dropout: 0 / 0.1 / 0 / 0.1 / 0.1
        Augment: 0 / 0 / 0 / 0 / 0
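The evaluation protocol quoted under "Open Datasets" compares 50K generated images against the full training set via a Fréchet distance between Gaussian fits in a feature space (Inception features for FID, DINOv2 features for FD-DINOv2). As a minimal sketch of the metric itself, assuming precomputed feature vectors and, for simplicity, diagonal covariances (the real metrics fit full covariance matrices and need a matrix square root); the names `frechet_diag`, `real`, and `gen` are illustrative, not from the paper:

```python
import random
import statistics

def frechet_diag(feats_a, feats_b):
    """Frechet distance between two Gaussian fits with diagonal covariance.

    With diagonal covariances the general formula
        d^2 = ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1 S2)^(1/2))
    collapses to a per-dimension closed form:
        d^2 = sum_i (mu1_i - mu2_i)^2 + (sd1_i - sd2_i)^2
    """
    dims = len(feats_a[0])
    d2 = 0.0
    for i in range(dims):
        col_a = [f[i] for f in feats_a]
        col_b = [f[i] for f in feats_b]
        mu_a, mu_b = statistics.fmean(col_a), statistics.fmean(col_b)
        sd_a, sd_b = statistics.pstdev(col_a), statistics.pstdev(col_b)
        d2 += (mu_a - mu_b) ** 2 + (sd_a - sd_b) ** 2
    return d2

# Toy stand-ins for feature vectors: "generated" features shifted by 0.5.
rng = random.Random(0)
real = [[rng.gauss(0.0, 1.0) for _ in range(4)] for _ in range(2000)]
gen = [[rng.gauss(0.5, 1.0) for _ in range(4)] for _ in range(2000)]

print(frechet_diag(real, real))  # identical feature sets give exactly 0.0
print(frechet_diag(real, gen))   # a 0.5 mean shift in 4 dims gives roughly 1.0
```

The diagonal simplification keeps the sketch dependency-free; production implementations estimate the full covariance over the 50K-image feature matrix and compute the matrix square root of the covariance product.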