Poster
Towards a mathematical theory for consistency training in diffusion models
Danqi Liao · Manit Paul
Abstract:
Consistency models, which were proposed to mitigate the high computational overhead during the sampling phase of diffusion models, facilitate single-step sampling while attaining state-of-the-art empirical performance. When integrated into the training phase, consistency models attempt to train a sequence of consistency functions capable of mapping any point at any time step of the diffusion process to its starting point. Despite the empirical success, a comprehensive theoretical understanding of consistency training remains elusive. This paper takes a first step towards establishing theoretical underpinnings for consistency models. We demonstrate that, in order to generate samples within ε proximity to the target in distribution (measured by some Wasserstein metric), it suffices for the number of steps in consistency learning to exceed the order of d^{5/2}/ε, with d the data dimension. Our theory offers rigorous insights into the validity and efficacy of consistency models, illuminating their utility in downstream inference tasks.
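For context, a minimal sketch of the objects the abstract refers to, written in the standard consistency-model notation of Song et al. (2023); the symbols f_θ, t_n, N, and λ below are common conventions, not taken from this poster:

\[
% A consistency function maps any point on a diffusion trajectory back to its start:
f(x_t, t) = x_0 \quad \text{for all } t \in [\delta, T],
\qquad \text{equivalently} \qquad
f(x_t, t) = f(x_s, s) \ \ \forall\, s, t.
\]
\[
% Consistency training enforces this across N discretization steps t_1 < \dots < t_N
% by matching the network at adjacent steps (\theta^- denotes a stop-gradient/EMA copy):
\mathcal{L}_{\mathrm{CT}}(\theta)
= \mathbb{E}\Big[\, \lambda(t_n)\,
\big\| f_\theta\big(x_{t_{n+1}}, t_{n+1}\big) - f_{\theta^-}\big(\hat{x}_{t_n}, t_n\big) \big\|^2 \,\Big].
\]

In this language, the abstract's guarantee states that taking the number of steps N to exceed the order of d^{5/2}/ε suffices for the one-step samples f_θ(x_T, T) to fall within ε of the target distribution in a Wasserstein metric.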