Result: An analysis of the noise schedule for score-based generative models

Title:
An analysis of the noise schedule for score-based generative models
Contributors:
Sorbonne Université (SU), Laboratoire de Probabilités, Statistique et Modélisation (LPSM (UMR_8001)), Sorbonne Université (SU)-Centre National de la Recherche Scientifique (CNRS)-Université Paris Cité (UPCité), Centre de Mathématiques Appliquées de l'Ecole polytechnique (CMAP), Institut National de Recherche en Informatique et en Automatique (Inria)-École polytechnique (X), Institut Polytechnique de Paris (IP Paris)-Institut Polytechnique de Paris (IP Paris)-Centre National de la Recherche Scientifique (CNRS), Institut universitaire de France (IUF), Ministère de l'Education nationale, de l’Enseignement supérieur et de la Recherche (M.E.N.E.S.R.)
Publisher Information:
CCSD, 2025.
Publication Year:
2025
Collection:
collection:X
collection:CNRS
collection:INSMI
collection:X-CMAP
collection:X-DEP-MATHA
collection:CMAP
collection:LPSM
collection:SORBONNE-UNIVERSITE
collection:SORBONNE-UNIV
collection:SU-SCIENCES
collection:IP_PARIS
collection:UNIV-PARIS
collection:UNIVERSITE-PARIS
collection:UP-SCIENCES
collection:SU-TI
collection:ALLIANCE-SU
collection:SUPRA_MATHS_INFO
collection:DEPARTEMENT-DE-MATHEMATIQUES
Original Identifier:
ARXIV: 2402.04650
HAL: hal-04441680
Document Type:
E-Resource preprint
Preprints
Working Papers
Language:
English
Relation:
info:eu-repo/semantics/altIdentifier/arxiv/2402.04650
Rights:
info:eu-repo/semantics/OpenAccess
Accession Number:
edshal.hal.04441680v4
Database:
HAL

Further Information

Score-based generative models (SGMs) aim to estimate a target data distribution by learning score functions using only noise-perturbed samples from the target. Recent literature has focused extensively on assessing the error between the target and estimated distributions, gauging the generative quality through the Kullback-Leibler (KL) divergence and Wasserstein distances. Under mild assumptions on the data distribution, we establish an upper bound on the KL divergence between the target and the estimated distributions that explicitly depends on any time-dependent noise schedule. Under additional regularity assumptions, and taking advantage of favorable underlying contraction mechanisms, we provide a tighter error bound in Wasserstein distance than state-of-the-art results. In addition to being tractable, this upper bound jointly incorporates properties of the target distribution and SGM hyperparameters that need to be tuned during training. Finally, we illustrate these bounds through numerical experiments using simulated and CIFAR-10 datasets, identifying an optimal range of noise schedules within a parametric family.
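To make the role of a time-dependent noise schedule concrete, the following is a minimal illustrative sketch in Python, assuming a variance-preserving forward process and a simple power-law schedule family. The schedule parametrization (beta_min, beta_max, exponent p) and function names are illustrative assumptions, not the schedule family or method studied in the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's exact setup): a variance-preserving
# forward process x_t = sqrt(alpha_bar(t)) * x_0 + sqrt(1 - alpha_bar(t)) * eps,
# where the time-dependent noise schedule beta(t) is drawn from a simple
# parametric family. Parameter names and defaults are assumptions for
# illustration only.

def int_beta(t, beta_min=0.1, beta_max=20.0, p=1.0):
    """Integral of beta(s) = beta_min + (beta_max - beta_min) * s**p over [0, t]."""
    return beta_min * t + (beta_max - beta_min) * t ** (p + 1) / (p + 1)

def alpha_bar(t, **kw):
    """Signal-retention factor alpha_bar(t) = exp(-int_0^t beta(s) ds)."""
    return np.exp(-int_beta(t, **kw))

def perturb(x0, t, rng=None, **kw):
    """Return a noise-perturbed sample x_t and the injected noise eps."""
    rng = np.random.default_rng() if rng is None else rng
    ab = alpha_bar(t, **kw)
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(ab) * x0 + np.sqrt(1.0 - ab) * eps, eps

# Example: perturb a 2-D sample at t = 0.5 under two members of the schedule family.
x0 = np.array([1.0, -0.5])
for p in (0.5, 2.0):
    xt, _ = perturb(x0, t=0.5, p=p)
    print(f"p={p}: x_t = {xt}")
```

Varying the exponent p changes how quickly signal is destroyed along the forward process, which is the kind of schedule choice whose effect on the KL and Wasserstein error bounds the paper quantifies.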