Tighter and Convex Maximum Margin Clustering

Abstract

The maximum margin principle has been successfully applied to many supervised and semi-supervised problems in machine learning. Recently, this principle was extended to clustering, referred to as Maximum Margin Clustering (MMC), and has achieved promising performance. To avoid the problem of local minima, MMC can be solved globally via a convex semi-definite programming (SDP) relaxation. Although many efficient approaches have been proposed to alleviate the computational burden of SDP, convex MMCs are still not scalable even for medium-sized data sets. In this paper, we propose a novel convex optimization method, LG-MMC, which maximizes the margin between opposite clusters via label generation. We show that LG-MMC is much more scalable than existing convex approaches. Moreover, our convex relaxation is tighter than state-of-the-art convex MMCs. Experiments on eighteen UCI datasets and the MNIST dataset show significant improvements over existing MMC algorithms.
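For background, here is a minimal sketch of the MMC formulation that such convex relaxations start from (the notation here is assumed for illustration, not necessarily the paper's own): MMC jointly searches for a binary label vector and a large-margin hyperplane,

\[
\min_{\mathbf{y} \in \{\pm 1\}^n} \; \min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \; \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^n \xi_i
\quad \text{s.t.} \quad y_i\bigl(\mathbf{w}^\top \phi(\mathbf{x}_i) + b\bigr) \ge 1 - \xi_i, \;\; \xi_i \ge 0, \;\; -\ell \le \sum_{i=1}^n y_i \le \ell,
\]

where the class-balance constraint with parameter \(\ell\) rules out the trivial solution that assigns all points to a single cluster. SDP relaxations of MMC convexify the combinatorial search over \(\mathbf{y}\) by optimizing over \(\mathbf{M} = \mathbf{y}\mathbf{y}^\top\), relaxed to \(\mathbf{M} \succeq \mathbf{0}\) with \(\operatorname{diag}(\mathbf{M}) = \mathbf{1}\); the "tighter" relaxation of the abstract instead works over the convex hull of the rank-one matrices \(\{\mathbf{y}\mathbf{y}^\top\}\) for feasible label vectors, a subset of that SDP feasible set, with label vectors generated iteratively.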

Cite

Text

Li et al. "Tighter and Convex Maximum Margin Clustering." Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, 2009.

Markdown

[Li et al. "Tighter and Convex Maximum Margin Clustering." Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, 2009.](https://mlanthology.org/aistats/2009/li2009aistats-tighter/)

BibTeX

@inproceedings{li2009aistats-tighter,
  title     = {{Tighter and Convex Maximum Margin Clustering}},
  author    = {Li, Yu-Feng and Tsang, Ivor W. and Kwok, James and Zhou, Zhi-Hua},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  year      = {2009},
  pages     = {344--351},
  volume    = {5},
  url       = {https://mlanthology.org/aistats/2009/li2009aistats-tighter/}
}