Learning Mixtures of Plackett-Luce Models
Abstract
In this paper we address the identifiability and efficient learning problems of finite mixtures of Plackett-Luce models for rank data. We prove that for any k ≥ 2, the mixture of k Plackett-Luce models for no more than 2k−1 alternatives is non-identifiable, and this bound is tight for k = 2. For generic identifiability, we prove that the mixture of k Plackett-Luce models over m alternatives is *generically identifiable* if k ≤ ⌊(m−2)/2⌋!. We also propose an efficient generalized method of moments (GMM) algorithm to learn the mixture of two Plackett-Luce models and show that the algorithm is consistent. Our experiments show that our GMM algorithm is significantly faster than the EMM algorithm by Gormley & Murphy (2008), while achieving competitive statistical efficiency.
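To make the setup concrete, below is a minimal sketch of the generative process the paper studies: each ranking over m alternatives is drawn from one of k Plackett-Luce components, where a component is selected according to its mixing weight and then fills ranking positions one at a time, choosing each remaining alternative with probability proportional to its weight. The function names and parameter values are illustrative assumptions, not code from the paper.

```python
import numpy as np

def sample_pl_ranking(gamma, rng):
    """Sample one full ranking from a Plackett-Luce model with positive weights gamma.

    Positions are filled sequentially; each remaining alternative is chosen
    with probability proportional to its weight among those still unranked.
    """
    remaining = list(range(len(gamma)))
    ranking = []
    while remaining:
        w = np.array([gamma[i] for i in remaining], dtype=float)
        idx = rng.choice(len(remaining), p=w / w.sum())
        ranking.append(remaining.pop(idx))
    return ranking

def sample_pl_mixture(alphas, gammas, n, seed=0):
    """Sample n rankings from a finite mixture of Plackett-Luce models.

    alphas: mixing probabilities, one per component (sum to 1).
    gammas: list of weight vectors, one per component.
    """
    rng = np.random.default_rng(seed)
    data = []
    for _ in range(n):
        z = rng.choice(len(alphas), p=alphas)  # latent component label
        data.append(sample_pl_ranking(gammas[z], rng))
    return data

# Example: a 2-component mixture over m = 4 alternatives
# (parameters are made up for illustration).
rankings = sample_pl_mixture(
    alphas=[0.5, 0.5],
    gammas=[[0.4, 0.3, 0.2, 0.1], [0.1, 0.2, 0.3, 0.4]],
    n=5,
)
print(rankings)
```

Note that with k = 2 components, the paper's non-identifiability bound of 2k−1 = 3 alternatives means m = 4, as in this sketch, is the smallest number of alternatives for which the mixture can be identified.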
Cite
Text
Zhao et al. "Learning Mixtures of Plackett-Luce Models." International Conference on Machine Learning, 2016.
Markdown
[Zhao et al. "Learning Mixtures of Plackett-Luce Models." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/zhao2016icml-learning/)
BibTeX
@inproceedings{zhao2016icml-learning,
title = {{Learning Mixtures of Plackett-Luce Models}},
author = {Zhao, Zhibing and Piech, Peter and Xia, Lirong},
booktitle = {International Conference on Machine Learning},
year = {2016},
pages = {2906--2914},
volume = {48},
url = {https://mlanthology.org/icml/2016/zhao2016icml-learning/}
}