Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models

Abstract

We develop a sequential low-complexity inference procedure for Dirichlet process mixtures of Gaussians for online clustering and parameter estimation when the number of clusters is unknown a priori. We present an easily computable, closed-form parametric expression for the conditional likelihood, in which hyperparameters are recursively updated as a function of the streaming data assuming conjugate priors. Motivated by large-sample asymptotics, we propose a novel adaptive low-complexity design for the Dirichlet process concentration parameter and show that the number of classes grows at most at a logarithmic rate. We further prove that in the large-sample limit, the conditional likelihood and data predictive distribution become asymptotically Gaussian. We demonstrate through experiments on synthetic and real data sets that our approach is superior to other online state-of-the-art methods.
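
The recursive hyperparameter updates described in the abstract follow standard conjugate-prior recursions. The sketch below is not the authors' implementation: it shows sequential MAP assignment in a Dirichlet process mixture of 1-D Gaussians under a Normal-Inverse-Gamma prior, where each arriving point is scored against existing clusters and a fresh cluster via the Student-t posterior predictive. The class name SeqDPGMM1D, the hyperparameter names (mu, kappa, a, b), and the fixed concentration conc are illustrative assumptions; in particular, the paper's adaptive concentration design is not reproduced here.

import numpy as np
from scipy.stats import t as student_t

class SeqDPGMM1D:
    """Sequential MAP inference for a DP mixture of 1-D Gaussians (sketch)."""

    def __init__(self, mu0=0.0, kappa0=1.0, a0=1.0, b0=1.0, conc=1.0):
        self.prior = (mu0, kappa0, a0, b0, 0)  # NIG hyperparameters + count
        self.conc = conc      # fixed stand-in for the paper's adaptive design
        self.clusters = []    # per-cluster (mu, kappa, a, b, n)

    def _predictive(self, x, params):
        # Posterior predictive of a Normal-Inverse-Gamma model: Student-t.
        mu, kappa, a, b, _ = params
        scale = np.sqrt(b * (kappa + 1.0) / (a * kappa))
        return student_t.pdf(x, df=2.0 * a, loc=mu, scale=scale)

    def observe(self, x):
        # CRP-weighted scores: existing clusters vs. opening a new one.
        scores = [p[4] * self._predictive(x, p) for p in self.clusters]
        scores.append(self.conc * self._predictive(x, self.prior))
        k = int(np.argmax(scores))
        if k == len(self.clusters):
            self.clusters.append(self.prior)
        mu, kappa, a, b, n = self.clusters[k]
        # Recursive conjugate hyperparameter update for a single datum.
        self.clusters[k] = ((kappa * mu + x) / (kappa + 1.0),
                            kappa + 1.0,
                            a + 0.5,
                            b + kappa * (x - mu) ** 2 / (2.0 * (kappa + 1.0)),
                            n + 1)
        return k

# Example: a random stream drawn from two well-separated Gaussians.
rng = np.random.default_rng(0)
data = rng.permutation(np.concatenate([rng.normal(-4, 1, 200),
                                       rng.normal(4, 1, 200)]))
model = SeqDPGMM1D(conc=0.5)
labels = [model.observe(x) for x in data]
print("clusters found:", len(model.clusters))

Because each point triggers only one predictive evaluation per existing cluster and one constant-size hyperparameter update, the per-sample cost stays linear in the current number of clusters, which is the sense in which such sequential schemes are low-complexity.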

Cite

Text

Tsiligkaridis and Forsythe. "Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models." Neural Information Processing Systems, 2015.

Markdown

[Tsiligkaridis and Forsythe. "Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/tsiligkaridis2015neurips-adaptive/)

BibTeX

@inproceedings{tsiligkaridis2015neurips-adaptive,
  title     = {{Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models}},
  author    = {Tsiligkaridis, Theodoros and Forsythe, Keith},
  booktitle = {Neural Information Processing Systems},
  year      = {2015},
  pages     = {28-36},
  url       = {https://mlanthology.org/neurips/2015/tsiligkaridis2015neurips-adaptive/}
}