Fast High-Resolution Image Synthesis with Latent Adversarial Diffusion Distillation
Paper: arXiv 2403.12015
Techniques to speed up diffusion models
Note LADD is the distillation technique behind SD3-Turbo and, more recently, Flux-schnell.
Note The idea is that the flow from noise to the target distribution can be split into N windows, and within each window the flow can be linearised, so a single denoising step suffices per window; the paper uses N=4. Compared with LCM-LoRA, details are better preserved, inference with more than N steps remains possible, and the student can be trained on a large real-world dataset instead of synthetic data generated by the teacher model.
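The windowed sampling idea above can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: `toy_velocity` is a hypothetical stand-in for a learned flow-matching velocity field, chosen so the straight-line (linear) assumption holds exactly.

```python
import numpy as np

def toy_velocity(x, t):
    # Hypothetical stand-in for a learned velocity field v(x, t).
    # Here the "target distribution" is a point mass at 1.0, so the
    # straight-line velocity toward it from x at time t is (1 - x) / (1 - t).
    return (1.0 - x) / (1.0 - t)

def sample_windowed(v, x0, n_windows=4):
    """Split [0, 1] into n_windows and take one Euler step per window,
    i.e. treat the flow as linear inside each window."""
    x = x0
    ts = np.linspace(0.0, 1.0, n_windows + 1)
    for t, t_next in zip(ts[:-1], ts[1:]):
        x = x + (t_next - t) * v(x, t)  # linearised step across the window
    return x

x = sample_windowed(toy_velocity, np.array(0.0), n_windows=4)
# For this exactly-linear toy field, 4 steps land on the target: x == 1.0
```

With a real (curved) learned flow the per-window linearisation is only approximate, which is why the distilled student is trained to compensate; taking more than N steps at inference simply uses finer windows.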