Representation Learning in Continuous-Time Score-Based Generative Models
Abstract
Score-based methods, represented as stochastic differential equations on a continuous time domain, have recently proven successful as non-adversarial generative models. Training such models relies on denoising score matching, which can be seen as a multi-scale generalization of denoising autoencoders. Here, we augment the denoising score-matching framework to enable representation learning without any supervised signal. GANs and VAEs learn representations by directly transforming latent codes into data samples. In contrast, score-based representation learning relies on a new formulation of the denoising score-matching objective, so the learned representation encodes the information needed for denoising. We show how this difference allows for manual control of the level of detail encoded in the representation.
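To make the idea concrete, below is a minimal PyTorch sketch of a denoising score-matching loss augmented with an encoder of the clean sample, so that the score network is conditioned on a learned code. The VE-style noise schedule `sigma`, the MLP architectures, and the `[t_min, t_max]` restriction used to control the level of detail are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a clean sample x0 to a low-dimensional code z (hypothetical architecture)."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, 128), nn.SiLU(), nn.Linear(128, z_dim))

    def forward(self, x0):
        return self.net(x0)

class ConditionalScoreNet(nn.Module):
    """Predicts the score of the perturbed sample x_t, conditioned on time t and code z."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + z_dim + 1, 256), nn.SiLU(),
            nn.Linear(256, 256), nn.SiLU(),
            nn.Linear(256, x_dim),
        )

    def forward(self, x_t, t, z):
        return self.net(torch.cat([x_t, z, t], dim=-1))

def sigma(t, sigma_min=0.01, sigma_max=50.0):
    # Noise scale of a VE-type SDE; this particular schedule is an assumption.
    return sigma_min * (sigma_max / sigma_min) ** t

def dsm_loss(score_net, encoder, x0, t_min=0.0, t_max=1.0):
    """Denoising score-matching loss where the score net also sees a code of the clean x0.

    Restricting [t_min, t_max] to a sub-interval of diffusion time is one plausible way
    to realize the abstract's "manual control of the level of detail" in the code z.
    """
    b = x0.shape[0]
    t = t_min + (t_max - t_min) * torch.rand(b, 1)  # diffusion time per sample
    std = sigma(t)
    eps = torch.randn_like(x0)
    x_t = x0 + std * eps                # perturbed sample from the Gaussian kernel
    z = encoder(x0)                     # code computed from the *clean* sample
    target = -eps / std                 # score of the perturbation kernel p(x_t | x0)
    pred = score_net(x_t, t, z)
    # lambda(t) = sigma(t)^2 weighting turns this into a reweighted denoising loss.
    return ((std ** 2) * (pred - target) ** 2).sum(dim=-1).mean()
```

Under this sketch, training takes gradient steps on the encoder and score network jointly through the single loss; since only information useful for denoising reduces the objective, the encoder is pushed to capture exactly that, and at test time it serves as the representation extractor on its own.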
Cite
Text
Abstreiter et al. "Representation Learning in Continuous-Time Score-Based Generative Models." ICML 2021 Workshops: INNF, 2021.

Markdown
[Abstreiter et al. "Representation Learning in Continuous-Time Score-Based Generative Models." ICML 2021 Workshops: INNF, 2021.](https://mlanthology.org/icmlw/2021/abstreiter2021icmlw-representation/)

BibTeX
@inproceedings{abstreiter2021icmlw-representation,
title = {{Representation Learning in Continuous-Time Score-Based Generative Models}},
author = {Abstreiter, Korbinian and Bauer, Stefan and Mehrjou, Arash},
booktitle = {ICML 2021 Workshops: INNF},
year = {2021},
url = {https://mlanthology.org/icmlw/2021/abstreiter2021icmlw-representation/}
}