Markov Beta Processes for Time Evolving Dictionary Learning

Abstract

We develop Markov beta processes (MBP) as a model suitable for data which can be represented by a sparse set of latent features which evolve over time. Most time evolving nonparametric latent feature models in the literature vary feature usage, but maintain a constant set of features over time. We show that being able to model features which themselves evolve over time results in the MBP outperforming other beta process based models. Our construction utilizes Poisson process operations, which leave each transformed beta process marginally beta process distributed. This allows one to analytically marginalize out latent beta processes, exploiting conjugacy when we couple them with Bernoulli processes, leading to a surprisingly elegant Gibbs MCMC scheme considering the expressiveness of the prior. We apply the model to the task of denoising and interpolating noisy image sequences and to predicting time evolving gene expression data, demonstrating superior performance to other beta process based methods.
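The beta-Bernoulli conjugacy the abstract appeals to can be illustrated with a standard finite approximation of the beta process (not the paper's MBP construction itself): with feature probabilities drawn from a beta prior and binary feature usage drawn from Bernoullis, each probability's posterior is again beta, so it can be resampled in closed form inside a Gibbs sweep. A minimal sketch, with hypothetical parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-K approximation of a beta process prior, as in standard
# beta-Bernoulli latent feature models; a, b, K, N are illustrative.
K, N = 10, 50
a, b = 1.0, 1.0

# Feature inclusion probabilities and a binary feature-usage matrix.
pi = rng.beta(a / K, b * (K - 1) / K, size=K)
Z = (rng.random((N, K)) < pi).astype(int)

# Conjugacy: given Z, the posterior over each pi_k is again beta,
# so pi can be resampled (or marginalized analytically) in a Gibbs step.
m = Z.sum(axis=0)                  # how many data points use feature k
post_a = a / K + m                 # posterior shape parameters
post_b = b * (K - 1) / K + N - m
pi_new = rng.beta(post_a, post_b)  # one conjugate Gibbs update for pi
```

The same closed-form update is what makes marginalizing out the latent beta processes tractable in the paper's MCMC scheme.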

Cite

Text

Shah and Ghahramani. "Markov Beta Processes for Time Evolving Dictionary Learning." Conference on Uncertainty in Artificial Intelligence, 2016.

Markdown

[Shah and Ghahramani. "Markov Beta Processes for Time Evolving Dictionary Learning." Conference on Uncertainty in Artificial Intelligence, 2016.](https://mlanthology.org/uai/2016/shah2016uai-markov/)

BibTeX

@inproceedings{shah2016uai-markov,
  title     = {{Markov Beta Processes for Time Evolving Dictionary Learning}},
  author    = {Shah, Amar and Ghahramani, Zoubin},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {2016},
  url       = {https://mlanthology.org/uai/2016/shah2016uai-markov/}
}