Dynamic Clustering via Asymptotics of the Dependent Dirichlet Process Mixture

Abstract

This paper presents a novel algorithm, based upon the dependent Dirichlet process mixture model (DDPMM), for clustering batch-sequential data containing an unknown number of evolving clusters. The algorithm is derived via a low-variance asymptotic analysis of the Gibbs sampling algorithm for the DDPMM, and provides a hard clustering with convergence guarantees similar to those of the k-means algorithm. Empirical results from a synthetic test with moving Gaussian clusters and a test with real ADS-B aircraft trajectory data demonstrate that the algorithm requires orders of magnitude less computational time than contemporary probabilistic and hard clustering algorithms, while providing higher accuracy on the examined datasets.
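To illustrate the kind of hard-clustering update that low-variance asymptotics of a Dirichlet process mixture yields, below is a minimal sketch of the static DP-means-style algorithm (Kulis & Jordan's small-variance limit), assuming spherical Gaussian clusters and a hypothetical new-cluster penalty `lam`. The paper's Dynamic Means algorithm extends this idea with cluster birth, motion, and death terms for batch-sequential data; this sketch covers only the static core.

```python
import numpy as np

def dp_means(X, lam, max_iter=100):
    # Hard clustering from the low-variance limit of a DP mixture:
    # a point spawns a new cluster when its squared distance to every
    # existing center exceeds the penalty lam.
    centers = [X[0].copy()]
    for _ in range(max_iter):
        # Assignment step: nearest center, or a new one if all are too far.
        labels = []
        for x in X:
            d2 = [np.sum((x - c) ** 2) for c in centers]
            j = int(np.argmin(d2))
            if d2[j] > lam:
                centers.append(x.copy())
                j = len(centers) - 1
            labels.append(j)
        labels = np.array(labels)
        # Update step: recompute each center as its cluster's mean.
        new_centers = [X[labels == k].mean(axis=0) for k in range(len(centers))]
        if all(np.allclose(a, b) for a, b in zip(centers, new_centers)):
            break
        centers = new_centers
    return labels, np.array(centers)
```

Like k-means, each step monotonically decreases a penalized cost, which is the source of the convergence guarantees mentioned in the abstract.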

Cite

Text

Campbell et al. "Dynamic Clustering via Asymptotics of the Dependent Dirichlet Process Mixture." Neural Information Processing Systems, 2013.

Markdown

[Campbell et al. "Dynamic Clustering via Asymptotics of the Dependent Dirichlet Process Mixture." Neural Information Processing Systems, 2013.](https://mlanthology.org/neurips/2013/campbell2013neurips-dynamic/)

BibTeX

@inproceedings{campbell2013neurips-dynamic,
  title     = {{Dynamic Clustering via Asymptotics of the Dependent Dirichlet Process Mixture}},
  author    = {Campbell, Trevor and Liu, Miao and Kulis, Brian and How, Jonathan P. and Carin, Lawrence},
  booktitle = {Neural Information Processing Systems},
  year      = {2013},
  pages     = {449--457},
  url       = {https://mlanthology.org/neurips/2013/campbell2013neurips-dynamic/}
}