Fast Maximum Margin Matrix Factorization for Collaborative Prediction

Abstract

Maximum Margin Matrix Factorization (MMMF) was recently suggested (Srebro et al., 2005) as a convex, infinite-dimensional alternative to low-rank approximations and standard factor models. MMMF can be formulated as a semi-definite program (SDP) and learned using standard SDP solvers. However, current SDP solvers can handle MMMF problems only on matrices of dimensionality up to a few hundred. Here, we investigate a direct gradient-based optimization method for MMMF and demonstrate it on large collaborative prediction problems. We compare against results obtained by Marlin (2004) and find that MMMF substantially outperforms all nine methods he tested.
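The gradient-based approach referenced in the abstract replaces the SDP with direct optimization over a factorization X = UVᵀ, penalizing the Frobenius norms of the factors (a surrogate for the trace norm) and using a smoothed hinge loss so the objective is differentiable. The sketch below is an illustrative, simplified version for the binary-label case only; the function names, step sizes, and the restriction to binary labels (the paper handles ordinal ratings with per-user thresholds) are assumptions for this example, not the paper's exact method.

```python
import numpy as np

def smooth_hinge_grad(z):
    # Derivative of a smoothed hinge loss:
    # h(z) = 0.5 - z for z <= 0; 0.5*(1 - z)^2 for 0 < z < 1; 0 for z >= 1.
    return np.where(z <= 0, -1.0, np.where(z < 1, z - 1.0, 0.0))

def mmmf_gradient_step(U, V, Y, mask, lam, lr):
    """One gradient step on a simplified MMMF-style objective:
    lam/2 * (||U||_F^2 + ||V||_F^2) + sum over observed entries of
    smooth_hinge(Y_ij * (U V^T)_ij), with Y in {-1, +1} and mask marking
    observed entries. (Hypothetical helper for illustration.)"""
    Z = Y * (U @ V.T)                      # margins on all entries
    G = mask * Y * smooth_hinge_grad(Z)    # dLoss / d(U V^T), zero where unobserved
    gU = lam * U + G @ V                   # gradient w.r.t. U
    gV = lam * V + G.T @ U                 # gradient w.r.t. V
    return U - lr * gU, V - lr * gV
```

Because the objective is non-convex in (U, V) even though the underlying trace-norm problem is convex, in practice one would run many such steps (or a quasi-Newton method) from a small random initialization, as the paper's scale results suggest.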

Cite

Text

Rennie and Srebro. "Fast Maximum Margin Matrix Factorization for Collaborative Prediction." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102441

Markdown

[Rennie and Srebro. "Fast Maximum Margin Matrix Factorization for Collaborative Prediction." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/rennie2005icml-fast/) doi:10.1145/1102351.1102441

BibTeX

@inproceedings{rennie2005icml-fast,
  title     = {{Fast Maximum Margin Matrix Factorization for Collaborative Prediction}},
  author    = {Rennie, Jason D. M. and Srebro, Nathan},
  booktitle = {International Conference on Machine Learning},
  year      = {2005},
  pages     = {713--719},
  doi       = {10.1145/1102351.1102441},
  url       = {https://mlanthology.org/icml/2005/rennie2005icml-fast/}
}