Learning-Efficient yet Generalizable Collaborative Filtering for Item Recommendation

Abstract

The weighted squared loss is a common component in several Collaborative Filtering (CF) algorithms for item recommendation, including the representative implicit Alternating Least Squares (iALS). Despite its widespread use, this loss function lacks a clear connection to ranking objectives such as Discounted Cumulative Gain (DCG), making it difficult to explain the strong ranking performance observed in these algorithms. In this work, we establish a connection between the squared loss and ranking metrics through a Taylor expansion of the softmax loss, a DCG-consistent surrogate. We further derive a new surrogate squared loss, the Ranking-Generalizable Squared (RG$^2$) loss, and provide a thorough theoretical analysis of its DCG-consistency. We then instantiate the RG$^2$ loss with Matrix Factorization (MF), together with a generalization upper bound and an ALS optimization algorithm that exploits closed-form solutions over all items. Experiments on three public datasets demonstrate the effectiveness of the RG$^2$ loss, which matches or surpasses the ranking performance of the softmax loss while converging faster.
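As a rough illustration of the kind of expansion involved (not the paper's exact derivation, whose expansion point and weighting differ), a second-order Taylor expansion of the log-sum-exp term in the softmax loss around zero scores already produces squared terms, hinting at how a squared surrogate can approximate a DCG-consistent loss:

```latex
% Softmax (cross-entropy) loss for a positive item y with scores s_1, ..., s_n:
%   \mathcal{L}(\mathbf{s}) = -s_y + \log \sum_{j=1}^{n} \exp(s_j)
% Second-order Taylor expansion of the log-sum-exp around \mathbf{s} = \mathbf{0}
% (gradient: uniform softmax 1/n; Hessian: \mathrm{diag}(p) - p p^\top with p_j = 1/n):
\log \sum_{j=1}^{n} \exp(s_j)
  \;\approx\; \log n
  \;+\; \frac{1}{n} \sum_{j=1}^{n} s_j
  \;+\; \frac{1}{2} \left( \frac{1}{n} \sum_{j=1}^{n} s_j^2
        \;-\; \Big( \frac{1}{n} \sum_{j=1}^{n} s_j \Big)^{2} \right)
```

The quadratic remainder is a (weighted) squared function of the scores, which is the structural link that makes an ALS-style closed-form update over all items possible.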

Cite

Text

Pu et al. "Learning-Efficient yet Generalizable Collaborative Filtering for Item Recommendation." International Conference on Machine Learning, 2024.

Markdown

[Pu et al. "Learning-Efficient yet Generalizable Collaborative Filtering for Item Recommendation." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/pu2024icml-learningefficient/)

BibTeX

@inproceedings{pu2024icml-learningefficient,
  title     = {{Learning-Efficient yet Generalizable Collaborative Filtering for Item Recommendation}},
  author    = {Pu, Yuanhao and Chen, Xiaolong and Huang, Xu and Chen, Jin and Lian, Defu and Chen, Enhong},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {41183--41203},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/pu2024icml-learningefficient/}
}