Nearest Neighbour Score Estimators for Diffusion Generative Models

Abstract

Score function estimation is the cornerstone of both training and sampling from diffusion generative models. Despite this fact, the most commonly used estimators are either biased neural network approximations or high variance Monte Carlo estimators based on the conditional score. We introduce a novel nearest neighbour score function estimator which utilizes multiple samples from the training set to dramatically decrease estimator variance. We leverage our low variance estimator in two compelling applications. Training consistency models with our estimator, we report a significant increase in both convergence speed and sample quality. In diffusion models, we show that our estimator can replace a learned network for probability-flow ODE integration, opening promising new avenues of future research. Code will be released upon paper acceptance.
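The abstract describes replacing a single-sample conditional-score Monte Carlo estimate with an average over multiple nearby training points. The details of the paper's estimator are not given here, so the following is only a minimal illustrative sketch of that general idea: under an assumed variance-exploding forward process `x_t = x_0 + sigma * eps`, the score is a posterior-weighted average of conditional scores `(x_0 - x_t) / sigma**2`, and restricting the weighted sum to the `k` nearest training points gives a low-variance approximation. The function name and all parameters are hypothetical, not from the paper.

```python
import numpy as np

def nn_score_estimate(x, data, sigma, k=16):
    """Illustrative sketch (not the paper's exact estimator):
    approximate the diffusion score grad_x log p_sigma(x) by a
    self-normalized average of conditional scores over the k
    nearest training points.

    Assumes a variance-exploding forward process
    x_t = x_0 + sigma * eps, for which
    grad_x log p(x_t | x_0) = (x_0 - x_t) / sigma**2.
    """
    # Squared distances from x to every training point.
    d2 = np.sum((data - x) ** 2, axis=1)
    # Keep only the k nearest neighbours; distant points carry
    # negligible posterior weight, so truncation loses little.
    idx = np.argsort(d2)[:k]
    # Softmax posterior weights p(x_0 | x_t) over the neighbours,
    # stabilized by subtracting the maximum logit.
    logits = -d2[idx] / (2 * sigma**2)
    w = np.exp(logits - logits.max())
    w /= w.sum()
    # Posterior-weighted average of conditional scores.
    return (w[:, None] * (data[idx] - x)).sum(axis=0) / sigma**2

# Usage with a tiny synthetic "training set".
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 2))
score = nn_score_estimate(np.zeros(2), data, sigma=1.0, k=16)
```

With a single training point the estimator reduces exactly to the conditional score, which is a quick sanity check on the weighting.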

Cite

Text

Niedoba et al. "Nearest Neighbour Score Estimators for Diffusion Generative Models." International Conference on Machine Learning, 2024.

Markdown

[Niedoba et al. "Nearest Neighbour Score Estimators for Diffusion Generative Models." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/niedoba2024icml-nearest/)

BibTeX

@inproceedings{niedoba2024icml-nearest,
  title     = {{Nearest Neighbour Score Estimators for Diffusion Generative Models}},
  author    = {Niedoba, Matthew and Green, Dylan and Naderiparizi, Saeid and Lioutas, Vasileios and Lavington, Jonathan Wilder and Liang, Xiaoxuan and Liu, Yunpeng and Zhang, Ke and Dabiri, Setareh and Scibior, Adam and Zwartsenberg, Berend and Wood, Frank},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {38117--38144},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/niedoba2024icml-nearest/}
}