Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions

Abstract

We consider on-line density estimation with a parameterized density from the exponential family. The on-line algorithm receives one example at a time and maintains a parameter that is essentially an average of the past examples. After receiving an example, the algorithm incurs a loss, which is the negative log-likelihood of the example w.r.t. the past parameter of the algorithm. An off-line algorithm can choose the best parameter based on all the examples. We prove bounds on the additional total loss of the on-line algorithm over the total loss of the off-line algorithm. These relative loss bounds hold for an arbitrary sequence of examples. The goal is to design algorithms with the best possible relative loss bounds. We use a certain divergence, the relative entropy between two exponential distributions, to derive and analyze the algorithms.
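The protocol described in the abstract can be sketched for one concrete member of the exponential family. The snippet below is an illustration of the setting only, not the paper's general exponential-family analysis: it uses a fixed-variance Gaussian, for which the maximum-likelihood parameter is the sample mean, so the on-line learner predicts each example with a running average of the past examples (seeded with an assumed prior mean) while the off-line comparator uses the mean of the whole sequence. The function name and the prior-seeding choice are hypothetical.

```python
import math

def online_vs_offline_log_loss(xs, sigma2=1.0, prior_mean=0.0):
    """On-line density estimation with a fixed-variance Gaussian (a sketch).

    The on-line learner is charged the negative log-likelihood of each
    example under the parameter built from the *past* examples; the
    off-line comparator uses the best single parameter in hindsight.
    Returns (online_total_loss, offline_total_loss).
    """
    def nll(x, mu):
        # negative log-likelihood of x under N(mu, sigma2)
        return 0.5 * math.log(2 * math.pi * sigma2) + (x - mu) ** 2 / (2 * sigma2)

    online_loss = 0.0
    mean, count = prior_mean, 1  # the prior mean counts as one pseudo-example
    for x in xs:
        online_loss += nll(x, mean)   # loss w.r.t. the past parameter
        count += 1
        mean += (x - mean) / count    # incremental average update
    offline_mean = sum(xs) / len(xs)  # best fixed parameter in hindsight
    offline_loss = sum(nll(x, offline_mean) for x in xs)
    return online_loss, offline_loss
```

The gap `online_loss - offline_loss` is the relative (regret) quantity that the paper bounds for arbitrary sequences.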

Cite

Text

Azoury and Warmuth. "Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions." Conference on Uncertainty in Artificial Intelligence, 1999.

Markdown

[Azoury and Warmuth. "Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions." Conference on Uncertainty in Artificial Intelligence, 1999.](https://mlanthology.org/uai/1999/azoury1999uai-relative/)

BibTeX

@inproceedings{azoury1999uai-relative,
  title     = {{Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions}},
  author    = {Azoury, Katy S. and Warmuth, Manfred K.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {1999},
  pages     = {31-40},
  url       = {https://mlanthology.org/uai/1999/azoury1999uai-relative/}
}