Multi-View Self-Paced Learning for Clustering
Abstract
Exploiting the information from multiple views can improve clustering accuracy. However, most existing multi-view clustering algorithms are non-convex and are thus prone to getting stuck in bad local minima, especially in the presence of outliers and missing data. To overcome this problem, we present a new multi-view self-paced learning (MSPL) algorithm for clustering that learns the multi-view model by progressing not only from 'easy' to 'complex' examples, but also from 'easy' to 'complex' views. Instead of binarily separating the examples or views into 'easy' and 'complex', we design a novel probabilistic smoothed weighting scheme. We show theoretically that employing multiple views for clustering and defining complexity across both examples and views are beneficial to optimal clustering. Experimental results on toy and real-world data demonstrate the efficacy of the proposed algorithm.
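To make the contrast in the abstract concrete, below is a minimal sketch of self-paced example weighting. The hard scheme (binary easy/complex split, as in standard self-paced learning) is shown next to a smoothed, probabilistic alternative in the spirit of the weighting the paper proposes; the specific logistic form and the `sharpness` parameter here are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def hard_weights(losses, lam):
    """Standard SPL: binary weights. An example counts as 'easy' if its
    loss falls below the pace parameter lam, which grows across iterations
    so that harder examples are gradually admitted."""
    return (losses < lam).astype(float)

def soft_weights(losses, lam, sharpness=5.0):
    """Smoothed alternative (illustrative): replace the hard threshold with
    a logistic function so every example gets a weight in (0, 1) that
    decays smoothly as its loss exceeds lam."""
    return 1.0 / (1.0 + np.exp(sharpness * (losses - lam)))

losses = np.array([0.1, 0.5, 1.2, 3.0])
print(hard_weights(losses, lam=1.0))  # -> [1. 1. 0. 0.]
print(soft_weights(losses, lam=1.0))  # smooth weights, decreasing with loss
```

The same weighting idea extends to views in MSPL: each view receives a weight reflecting how reliably it can currently be clustered, so the model leans on 'easy' views first.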
Cite
Text
Xu et al. "Multi-View Self-Paced Learning for Clustering." International Joint Conference on Artificial Intelligence, 2015.
Markdown
[Xu et al. "Multi-View Self-Paced Learning for Clustering." International Joint Conference on Artificial Intelligence, 2015.](https://mlanthology.org/ijcai/2015/xu2015ijcai-multi/)
BibTeX
@inproceedings{xu2015ijcai-multi,
title = {{Multi-View Self-Paced Learning for Clustering}},
author = {Xu, Chang and Tao, Dacheng and Xu, Chao},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2015},
pages = {3974--3980},
url = {https://mlanthology.org/ijcai/2015/xu2015ijcai-multi/}
}