Training Discrete EBMs with Energy Discrepancy

Abstract

Training energy-based models (EBMs) on discrete spaces is challenging because sampling over such spaces can be difficult. We propose to train discrete EBMs with energy discrepancy (ED), a novel type of contrastive loss functional which only requires the evaluation of the energy function at data points and their perturbed counterparts, thus not relying on sampling strategies like Markov chain Monte Carlo (MCMC). Energy discrepancy offers theoretical guarantees for a broad class of perturbation processes, of which we investigate three types: perturbations based on Bernoulli noise, based on deterministic transforms, and based on neighbourhood structures. We demonstrate their relative performance on lattice Ising models, binary synthetic data, and discrete image data sets.
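To illustrate the idea, the following is a minimal sketch of a Monte Carlo estimate of the energy discrepancy loss with the Bernoulli (bit-flip) perturbation on binary data. The toy energy function, the number of contrastive samples `m`, and the stabilisation constant `w` are illustrative assumptions, not the authors' exact implementation; the key point is that the loss only evaluates the energy at data points and perturbed counterparts, with no MCMC sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_energy(x):
    # Hypothetical stand-in energy U(x) on {0,1}^d, used only for illustration.
    return 0.5 * np.sum(x, axis=-1)

def energy_discrepancy_bernoulli(energy, x, eps=0.1, m=8, w=1.0, rng=rng):
    """Sketch of an ED loss estimate with a symmetric Bernoulli perturbation.

    x:   (n, d) array of binary data points
    eps: independent bit-flip probability of the perturbation kernel
    m:   number of contrastive samples per data point
    w:   stabilisation constant (an assumed common stabilisation trick)
    """
    n, d = x.shape
    # Forward perturbation: flip each bit of x independently with prob eps.
    y = np.abs(x - rng.binomial(1, eps, size=x.shape))
    # Contrastive samples: since the bit-flip kernel is symmetric, points
    # that could have produced y are drawn by flipping bits of y again.
    flips = rng.binomial(1, eps, size=(m, n, d))
    x_neg = np.abs(y[None, :, :] - flips)                  # shape (m, n, d)
    # Per-point loss: log( w/m + (1/m) * sum_j exp(U(x_i) - U(x'_{ij})) ),
    # which only needs energy evaluations, no sampling from the model.
    diff = energy(x)[None, :] - energy(x_neg)              # shape (m, n)
    per_point = np.log(w / m + np.mean(np.exp(diff), axis=0))
    return float(np.mean(per_point))

x = rng.binomial(1, 0.5, size=(4, 16)).astype(float)
loss = energy_discrepancy_bernoulli(toy_energy, x)
```

In practice the energy would be a parametrised network and the loss minimised by gradient descent; the sketch only shows the structure of the contrastive estimate.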

Cite

Text

Schröder et al. "Training Discrete EBMs with Energy Discrepancy." ICML 2023 Workshops: SODS, 2023.

Markdown

[Schröder et al. "Training Discrete EBMs with Energy Discrepancy." ICML 2023 Workshops: SODS, 2023.](https://mlanthology.org/icmlw/2023/schroder2023icmlw-training/)

BibTeX

@inproceedings{schroder2023icmlw-training,
  title     = {{Training Discrete EBMs with Energy Discrepancy}},
  author    = {Schröder, Tobias and Ou, Zijing and Li, Yingzhen and Duncan, Andrew B.},
  booktitle = {ICML 2023 Workshops: SODS},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/schroder2023icmlw-training/}
}