On Multi-View Active Learning and the Combination with Semi-Supervised Learning
Abstract
Multi-view learning has become a hot topic during the past few years. In this paper, we first characterize the sample complexity of multi-view active learning. Under the α-expansion assumption, we obtain an exponential improvement in the sample complexity, from the usual Õ(1/ε) to Õ(log 1/ε), requiring neither a strong assumption on the data distribution (such as the data being distributed uniformly over the unit sphere in ℜ^d) nor a strong assumption on the hypothesis class (such as restricting to linear separators through the origin). We also give an upper bound on the error rate when the α-expansion assumption does not hold. Then, we analyze the combination of multi-view active learning and semi-supervised learning and obtain a further improvement in the sample complexity. Finally, we study the empirical behavior of the two paradigms, which confirms that combining multi-view active learning with semi-supervised learning is effective.
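A common way to realize multi-view active learning, and the intuition behind the sample-complexity gains described above, is to query labels only for *contention points*: unlabeled examples on which the hypotheses learned from the two views disagree. The following is a minimal, self-contained sketch of that querying strategy on synthetic data; the two-view data generator, the threshold hypothesis class, and all names are illustrative assumptions, not the paper's actual construction.

```python
import random

random.seed(0)

# Hypothetical two-view data: view 2 is a slightly noisy copy of view 1,
# and the true concept is a threshold at 0.5 in view 1.
def make_example():
    x1 = random.random()
    x2 = min(max(x1 + random.uniform(-0.05, 0.05), 0.0), 1.0)
    y = 1 if x1 >= 0.5 else 0
    return x1, x2, y

pool = [make_example() for _ in range(2000)]

def fit_threshold(samples):
    """Pick the grid threshold minimizing training error on (x, y) pairs."""
    best_t, best_err = 0.5, float("inf")
    for t in [i / 100 for i in range(101)]:
        err = sum((x >= t) != bool(y) for x, y in samples)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Seed with a few labeled examples, then pay for a label only on
# contention points, where the two per-view hypotheses disagree.
labeled = pool[:10]
t1 = fit_threshold([(a, c) for a, _, c in labeled])
t2 = fit_threshold([(b, c) for _, b, c in labeled])
queries = 0
for x1, x2, y in pool[10:]:
    if (x1 >= t1) != (x2 >= t2):      # views disagree: query the oracle
        labeled.append((x1, x2, y))
        queries += 1
        t1 = fit_threshold([(a, c) for a, _, c in labeled])
        t2 = fit_threshold([(b, c) for _, b, c in labeled])

test = [make_example() for _ in range(500)]
acc = sum((x1 >= t1) == bool(y) for x1, _, y in test) / len(test)
print(f"queries: {queries}, test accuracy: {acc:.2f}")
```

Because the two views mostly agree away from the decision boundary, label requests concentrate near the boundary, which is the mechanism the α-expansion analysis exploits to shrink the number of queries needed.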
Cite
Text
Wang and Zhou. "On Multi-View Active Learning and the Combination with Semi-Supervised Learning." International Conference on Machine Learning, 2008. doi:10.1145/1390156.1390301
Markdown
[Wang and Zhou. "On Multi-View Active Learning and the Combination with Semi-Supervised Learning." International Conference on Machine Learning, 2008.](https://mlanthology.org/icml/2008/wang2008icml-multi/) doi:10.1145/1390156.1390301
BibTeX
@inproceedings{wang2008icml-multi,
title = {{On Multi-View Active Learning and the Combination with Semi-Supervised Learning}},
author = {Wang, Wei and Zhou, Zhi-Hua},
booktitle = {International Conference on Machine Learning},
year = {2008},
pages = {1152--1159},
doi = {10.1145/1390156.1390301},
url = {https://mlanthology.org/icml/2008/wang2008icml-multi/}
}