Small-Variance Asymptotics for Dirichlet Process Mixtures of SVMs
Abstract
Infinite SVM (iSVM) is a Dirichlet process (DP) mixture of large-margin classifiers. Though flexible in learning nonlinear classifiers and discovering latent clustering structures, iSVM poses a difficult inference task, and the cost of existing inference methods can hinder its applicability to large-scale problems. This paper presents a small-variance asymptotic analysis to derive a simple and efficient algorithm, which monotonically optimizes a max-margin DP-means (M2DPM) problem, an extension of DP-means for both predictive learning and descriptive clustering. Our analysis is built on Gibbs infinite SVMs, an alternative DP mixture of large-margin machines, which admits a partially collapsed Gibbs sampler without truncation by exploiting data augmentation techniques. Experimental results show that M2DPM runs much faster than similar algorithms without sacrificing prediction accuracy.
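For context on the objective the abstract describes, below is a minimal sketch of the DP-means hard-clustering loop (Kulis and Jordan, 2012) that M2DPM extends; the paper's M2DPM additionally adds per-cluster max-margin (hinge-loss) terms, which are omitted here. The function name dp_means, the penalty lam, and the synthetic data are illustrative assumptions, not the authors' implementation.

import numpy as np

def dp_means(X, lam, n_iters=50):
    # Sketch of DP-means: monotonically decreases the objective
    # sum of squared distances to assigned centers + lam * (#clusters).
    # `lam` is a hypothetical name for the cluster-creation penalty.
    centers = [X.mean(axis=0)]            # start with one global cluster
    assign = np.zeros(len(X), dtype=int)
    for _ in range(n_iters):
        changed = False
        # Assignment step: nearest center, or open a new cluster
        # (paying lam) if every center is farther than lam.
        for i, x in enumerate(X):
            d2 = np.array([np.sum((x - c) ** 2) for c in centers])
            k = int(d2.argmin())
            if d2[k] > lam:
                centers.append(x.copy())
                k = len(centers) - 1
            if assign[i] != k:
                assign[i] = k
                changed = True
        # Update step: recompute centers; drop clusters that emptied out.
        live = [k for k in range(len(centers)) if np.any(assign == k)]
        remap = {k: j for j, k in enumerate(live)}
        centers = [X[assign == k].mean(axis=0) for k in live]
        assign = np.array([remap[k] for k in assign])
        if not changed:
            break
    return np.array(centers), assign

# Illustrative usage on two synthetic Gaussian blobs:
# X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
# centers, labels = dp_means(X, lam=4.0)

In the small-variance limit, lam is the penalty induced by the DP concentration parameter; this is the sense in which the abstract's asymptotic analysis turns a DP mixture into a hard, k-means-style objective.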
Cite
Text
Wang and Zhu. "Small-Variance Asymptotics for Dirichlet Process Mixtures of SVMs." AAAI Conference on Artificial Intelligence, 2014. doi:10.1609/AAAI.V28I1.8959

Markdown
[Wang and Zhu. "Small-Variance Asymptotics for Dirichlet Process Mixtures of SVMs." AAAI Conference on Artificial Intelligence, 2014.](https://mlanthology.org/aaai/2014/wang2014aaai-small/) doi:10.1609/AAAI.V28I1.8959

BibTeX
@inproceedings{wang2014aaai-small,
title = {{Small-Variance Asymptotics for Dirichlet Process Mixtures of SVMs}},
author = {Wang, Yining and Zhu, Jun},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2014},
pages = {2135-2141},
doi = {10.1609/AAAI.V28I1.8959},
url = {https://mlanthology.org/aaai/2014/wang2014aaai-small/}
}