
Accelerating diffusion models with distillation for fast high-quality image generation

Published on:

8 May 2024

Primary Category:

Computer Vision and Pattern Recognition

Paper Authors:

Jonas Kohler,

Albert Pumarola,

Edgar Schönfeld,

Artsiom Sanakoyeu,

Roshan Sumbaly,

Peter Vajda,

Ali Thabet


Key Details

Proposes a distillation framework tailored to extremely low-step diffusion model inference

Introduces Backward Distillation, which calibrates the student on its own backward sampling trajectory

Presents a Shifted Reconstruction Loss to transfer both global structure and fine detail from the teacher

Applies Noise Correction to address singularities at the start of sampling

Achieves teacher-level performance in as few as 3 steps

AI generated summary


This paper proposes a distillation framework to accelerate diffusion models, enabling high-quality and diverse image generation using only 1-3 sampling steps. Key innovations include Backward Distillation to reduce train-test discrepancy, Shifted Reconstruction Loss to transfer both structure and detail knowledge, and Noise Correction to enhance initial sample quality.
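The Backward Distillation idea can be sketched with a toy example: rather than calibrating the student on forward-noised real images, the student is supervised on states it reaches along its own backward (denoising) trajectory, with the teacher evaluated at those same states. This is only a minimal illustration of the concept; `teacher_denoise` and `student_denoise` are hypothetical stand-ins, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

def teacher_denoise(x_t, t):
    # Hypothetical teacher step: shrinks the sample as t decreases,
    # standing in for a pretrained diffusion model's denoiser.
    return x_t * (1.0 - 0.5 * t)

def student_denoise(x_t, t, w):
    # Hypothetical one-parameter student model.
    return x_t * w[0] * (1.0 - 0.5 * t)

def backward_distillation_loss(w, n_steps=3):
    """Calibrate the student on its OWN backward path: start from pure
    noise, roll the student along its sampling trajectory, and compare
    each student state against the teacher's prediction from that same
    state (no forward-noised real images involved)."""
    x = rng.standard_normal(4)          # start of sampling: pure noise
    loss = 0.0
    for i in range(n_steps):
        t = 1.0 - i / n_steps           # time runs from 1 (noise) toward 0
        x_student = student_denoise(x, t, w)
        x_teacher = teacher_denoise(x, t)   # teacher sees the student's state
        loss += np.mean((x_student - x_teacher) ** 2)
        x = x_student                   # continue along the student's own path
    return loss / n_steps
```

With `w = [1.0]` the student matches the teacher at every state of its own trajectory and the loss is zero; any mismatch is penalized exactly where the student will actually operate at inference time, which is the train-test gap the paper targets.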
