Variational (Gradient) Estimate of the Score Function in Energy-Based Latent Variable Models

Abstract

Learning and evaluating energy-based latent variable models (EBLVMs) without structural assumptions is highly challenging, because the true posterior and the partition function of such models are generally intractable. This paper presents variational estimates of the score function, and of its gradient with respect to the model parameters, in a general EBLVM, referred to as VaES and VaGES respectively. The variational posterior is trained to minimize a certain divergence to the true model posterior, and the bias of both estimates can be theoretically bounded by this divergence. Under a minimal model assumption, VaES and VaGES can be applied to kernelized Stein discrepancy (KSD)- and score matching (SM)-based methods for learning EBLVMs. Moreover, VaES can be used to estimate the exact Fisher divergence between the data distribution and general EBLVMs.
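
For orientation, the identity that a VaES-style estimator exploits can be sketched as follows. The notation below (energy E_theta, variational posterior q_phi, K posterior samples) is our own generic EBLVM setup, assumed to be compatible with the paper's rather than quoted from it. For p_theta(x, z) = exp(-E_theta(x, z)) / Z(theta), the partition function does not depend on x, so the marginal score reduces to a posterior expectation of the energy gradient:

\nabla_x \log p_\theta(x) = \mathbb{E}_{p_\theta(z \mid x)}\left[ -\nabla_x E_\theta(x, z) \right] \approx \frac{1}{K} \sum_{k=1}^{K} -\nabla_x E_\theta\left(x, z^{(k)}\right), \qquad z^{(k)} \sim q_\phi(z \mid x),

where the variational posterior q_phi(z | x) stands in for the intractable p_theta(z | x); the bias bound mentioned in the abstract concerns exactly this substitution. A minimal PyTorch sketch of the resulting Monte Carlo estimate, with the callables energy and q_sample being hypothetical placeholders:

import torch

def vaes_score(energy, q_sample, x, K=10):
    """Hedged sketch of a VaES-style marginal-score estimate.

    energy:   callable (x, z) -> per-example energy E_theta(x, z)   [assumed]
    q_sample: callable x -> z drawn from q_phi(z | x)                [assumed]
    Returns an estimate of grad_x log p_theta(x).
    """
    x = x.detach().requires_grad_(True)
    grads = []
    for _ in range(K):
        z = q_sample(x).detach()      # block gradients through the sampler
        e = energy(x, z).sum()        # scalar so one backward pass suffices
        grads.append(torch.autograd.grad(e, x)[0])
    # Averaging -grad_x E over posterior samples estimates the marginal score.
    return -torch.stack(grads).mean(dim=0)

VaGES plays the analogous role for the gradient of the score with respect to the model parameters; both quantities plug directly into KSD- and SM-based training objectives, which need the model score but never the partition function.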

Cite

Text

Bao et al. "Variational (Gradient) Estimate of the Score Function in Energy-Based Latent Variable Models." NeurIPS 2020 Workshops: ICBINB, 2020.

Markdown

[Bao et al. "Variational (Gradient) Estimate of the Score Function in Energy-Based Latent Variable Models." NeurIPS 2020 Workshops: ICBINB, 2020.](https://mlanthology.org/neuripsw/2020/bao2020neuripsw-variational/)

BibTeX

@inproceedings{bao2020neuripsw-variational,
  title     = {{Variational (Gradient) Estimate of the Score Function in Energy-Based Latent Variable Models}},
  author    = {Bao, Fan and Xu, Kun and Li, Chongxuan and Hong, Lanqing and Zhu, Jun and Zhang, Bo},
  booktitle = {NeurIPS 2020 Workshops: ICBINB},
  year      = {2020},
  url       = {https://mlanthology.org/neuripsw/2020/bao2020neuripsw-variational/}
}