A Unified View of Matrix Factorization Models
Abstract
We present a unified view of matrix factorization that frames the differences among popular methods, such as NMF, Weighted SVD, E-PCA, MMMF, pLSI, pLSI-pHITS, Bregman co-clustering, and many others, in terms of a small number of modeling choices. Many of these approaches can be viewed as minimizing a generalized Bregman divergence, and we show that (i) a straightforward alternating projection algorithm can be applied to almost any model in our unified view; (ii) the Hessian for each projection has special structure that makes a Newton projection feasible, even when there are equality constraints on the factors, which allows for matrix co-clustering; and (iii) alternating projections can be generalized to simultaneously factor a set of matrices that share dimensions. These observations immediately yield new optimization algorithms for the above factorization methods, and suggest novel generalizations of these methods such as incorporating row and column biases, and adding or relaxing clustering constraints.
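The paper's alternating projection idea — fix one factor, solve a convex projection for the other, and alternate — can be illustrated for the simplest Bregman divergence, squared error. This is a minimal sketch, not the authors' algorithm: the function name `als_factorize`, the ridge term `reg`, and the closed-form least-squares updates are illustrative assumptions for the squared-error special case.

```python
import numpy as np

def als_factorize(X, k, iters=50, reg=1e-3, seed=0):
    """Alternately project onto each factor of X ~ U @ V.T, minimizing the
    squared-error Bregman divergence ||X - U V^T||_F^2 (illustrative sketch;
    a small ridge term keeps the per-step least-squares systems well posed)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.standard_normal((m, k))
    V = rng.standard_normal((n, k))
    ridge = reg * np.eye(k)
    for _ in range(iters):
        # Projection step for U with V fixed: each row of U solves a
        # k-dimensional least-squares problem (V^T V + reg I) u = V^T x.
        U = np.linalg.solve(V.T @ V + ridge, V.T @ X.T).T
        # Symmetric projection step for V with U fixed.
        V = np.linalg.solve(U.T @ U + ridge, U.T @ X).T
    return U, V
```

For other divergences in the paper's unified view, the per-step projection is no longer closed form, which is where the structured-Hessian Newton projection the abstract mentions comes in.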
Cite
Text
Singh and Gordon. "A Unified View of Matrix Factorization Models." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2008. doi:10.1007/978-3-540-87481-2_24

Markdown
[Singh and Gordon. "A Unified View of Matrix Factorization Models." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2008.](https://mlanthology.org/ecmlpkdd/2008/singh2008ecmlpkdd-unified/) doi:10.1007/978-3-540-87481-2_24

BibTeX
@inproceedings{singh2008ecmlpkdd-unified,
title = {{A Unified View of Matrix Factorization Models}},
author = {Singh, Ajit Paul and Gordon, Geoffrey J.},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2008},
pages = {358-373},
doi = {10.1007/978-3-540-87481-2_24},
url = {https://mlanthology.org/ecmlpkdd/2008/singh2008ecmlpkdd-unified/}
}