Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning

Abstract

Optimization problems with an auxiliary latent variable structure in addition to the main model parameters occur frequently in computer vision and machine learning. The additional latent variables make the underlying optimization task expensive, either in terms of memory (by maintaining the latent variables) or in terms of runtime (by repeated exact inference of the latent variables). We aim to remove the need to maintain the latent variables and propose two formally justified methods that dynamically adapt the required accuracy of latent variable inference. These methods have applications in large-scale robust estimation and in learning the parameters of an energy-based model from labeled data.
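To make the setting concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithms) of what truncated latent variable inference can look like in robust estimation. It uses a lifted (half-quadratic) objective where each residual has a latent confidence weight with a closed-form exact minimizer; instead of maintaining or exactly re-inferring these weights, the update is truncated, moving them only part of the way toward the exact values (the `step` parameter and the Geman-McClure-style kernel are illustrative choices, not taken from the paper).

```python
import numpy as np

# Illustrative lifted robust objective:
#   min_{theta, w}  sum_i  w_i * r_i(theta)^2 + psi(w_i)
# where the latent weights w_i admit a closed-form minimizer given theta.

def exact_weights(residuals, tau=1.0):
    """Closed-form latent weights for a Geman-McClure-style robust kernel (assumed example)."""
    return (tau**2 / (tau**2 + residuals**2)) ** 2

def truncated_irls(X, y, iters=50, step=0.5, tau=1.0):
    """IRLS-style solver with a truncated (partial) latent weight update."""
    theta = np.zeros(X.shape[1])
    w = np.ones(len(y))
    for _ in range(iters):
        r = X @ theta - y
        w_star = exact_weights(r, tau)        # exact latent inference (cheap in this toy case)
        w = (1.0 - step) * w + step * w_star  # truncated / inexact latent update
        # weighted least-squares step for the model parameters
        sw = np.sqrt(w)
        theta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return theta

# Usage: robust line fitting with gross outliers.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 2.0 * x + 0.5 + 0.05 * rng.normal(size=200)
y[:40] += rng.uniform(2, 4, 40)               # contaminate 20% of the points
X = np.stack([x, np.ones_like(x)], axis=1)
print(truncated_irls(X, y))                    # roughly [2.0, 0.5]
```

In this toy example the exact weight update is inexpensive, so truncation buys little; the abstract's point is that in large-scale problems one would rather not store or exactly re-infer the latent variables at every step, and the accuracy of their (truncated) inference can be adapted dynamically.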

Cite

Text

Zach and Le. "Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58574-7_28

Markdown

[Zach and Le. "Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/zach2020eccv-truncated/) doi:10.1007/978-3-030-58574-7_28

BibTeX

@inproceedings{zach2020eccv-truncated,
  title     = {{Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning}},
  author    = {Zach, Christopher and Le, Huu},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2020},
  doi       = {10.1007/978-3-030-58574-7_28},
  url       = {https://mlanthology.org/eccv/2020/zach2020eccv-truncated/}
}