Generalization Error Bound for Denoising Score Matching Under Relaxed Manifold Assumption
Abstract
We examine theoretical properties of the denoising score matching estimate. We model the density of observations with a nonparametric Gaussian mixture. We significantly relax the standard manifold assumption, allowing the samples to deviate from the manifold, while still being able to exploit the structure of the distribution. We derive non-asymptotic bounds on the approximation and generalization errors of the denoising score matching estimate. The rates of convergence are determined by the intrinsic dimension. Furthermore, our bounds remain valid even if the ambient dimension is allowed to grow polynomially with the sample size.
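For context, a minimal sketch of the standard denoising score matching objective (Vincent, 2011) at a single noise level \(\sigma\) is given below; the particular weighting, noise schedule, and mixture-based estimator analyzed in the paper may differ, so this is illustrative only:
\[
  \mathcal{L}_{\mathrm{DSM}}(s) \;=\; \mathbb{E}_{x \sim p}\, \mathbb{E}_{\varepsilon \sim \mathcal{N}(0, I_D)} \Bigl\| s(x + \sigma\varepsilon) + \tfrac{\varepsilon}{\sigma} \Bigr\|^2 .
\]
Minimizing this objective over a function class recovers, up to an additive constant, the score \(\nabla \log p_\sigma\) of the Gaussian-smoothed density \(p_\sigma = p * \mathcal{N}(0, \sigma^2 I_D)\), which is why Gaussian smoothing of the data distribution arises naturally in this setting.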
Cite
Text
Yakovlev and Puchkin. "Generalization Error Bound for Denoising Score Matching Under Relaxed Manifold Assumption." Proceedings of Thirty Eighth Conference on Learning Theory, 2025.
Markdown
[Yakovlev and Puchkin. "Generalization Error Bound for Denoising Score Matching Under Relaxed Manifold Assumption." Proceedings of Thirty Eighth Conference on Learning Theory, 2025.](https://mlanthology.org/colt/2025/yakovlev2025colt-generalization/)
BibTeX
@inproceedings{yakovlev2025colt-generalization,
title = {{Generalization Error Bound for Denoising Score Matching Under Relaxed Manifold Assumption}},
author = {Yakovlev, Konstantin and Puchkin, Nikita},
booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
year = {2025},
pages = {5824--5891},
volume = {291},
url = {https://mlanthology.org/colt/2025/yakovlev2025colt-generalization/}
}