Just-in-Time Learning for Fast and Flexible Inference

Abstract

Much research in machine learning has centered on the search for inference algorithms that are both general-purpose and efficient. The problem is extremely challenging, and general inference remains computationally expensive. We seek to address this problem by observing that in most specific applications of a model, we typically need to perform only a small subset of all possible inference computations. Motivated by this, we introduce just-in-time learning, a framework for fast and flexible inference that learns to speed up inference at run-time. Through a series of experiments, we show how this framework can allow us to combine the flexibility of sampling with the efficiency of deterministic message-passing.

Cite

Text

Eslami et al. "Just-in-Time Learning for Fast and Flexible Inference." Neural Information Processing Systems, 2014.

Markdown

[Eslami et al. "Just-in-Time Learning for Fast and Flexible Inference." Neural Information Processing Systems, 2014.](https://mlanthology.org/neurips/2014/eslami2014neurips-justintime/)

BibTeX

@inproceedings{eslami2014neurips-justintime,
  title     = {{Just-in-Time Learning for Fast and Flexible Inference}},
  author    = {Eslami, S. M. Ali and Tarlow, Daniel and Kohli, Pushmeet and Winn, John},
  booktitle = {Neural Information Processing Systems},
  year      = {2014},
  pages     = {154--162},
  url       = {https://mlanthology.org/neurips/2014/eslami2014neurips-justintime/}
}