Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks

Abstract

Factorization Machines (FMs) are a supervised learning approach that enhances the linear regression model by incorporating second-order feature interactions. Despite its effectiveness, FM can be hindered by its modelling of all feature interactions with the same weight, as not all feature interactions are equally useful and predictive. For example, interactions with useless features may even introduce noise and degrade performance. In this work, we improve FM by discriminating the importance of different feature interactions. We propose a novel model named Attentional Factorization Machine (AFM), which learns the importance of each feature interaction from data via a neural attention network. Extensive experiments on two real-world datasets demonstrate the effectiveness of AFM. Empirically, AFM betters FM on the regression task with an $8.6\%$ relative improvement, and consistently outperforms the state-of-the-art deep learning methods Wide&Deep and DeepCross with a much simpler structure and fewer model parameters. Our implementation of AFM is publicly available at: this https URL
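The core idea described in the abstract can be sketched as follows: form the element-wise products of all pairs of feature embeddings (as in FM), score each pair with a small attention network, normalize the scores with a softmax, and sum the interactions weighted by their attention. This is a minimal NumPy sketch under assumed shapes and parameter names (`W_att`, `b_att`, `h_att`, `p` are hypothetical, not the paper's notation); it omits the linear and bias terms of the full model.

```python
import numpy as np

def afm_interaction_score(embeddings, W_att, b_att, h_att, p):
    """Attention-weighted pairwise-interaction score (sketch).

    embeddings: (m, k) embedding vectors of the m active features.
    W_att: (t, k), b_att: (t,), h_att: (t,) -- attention-network parameters.
    p: (k,) -- projects the pooled interaction vector to a scalar.
    """
    m, _ = embeddings.shape
    # Element-wise products v_i * v_j for all pairs i < j, shape (P, k)
    pairs = np.array([embeddings[i] * embeddings[j]
                      for i in range(m) for j in range(i + 1, m)])
    # Attention score per pair: h^T ReLU(W (v_i * v_j) + b)
    scores = np.maximum(pairs @ W_att.T + b_att, 0.0) @ h_att  # (P,)
    # Softmax normalization over all pairs (stabilized)
    att = np.exp(scores - scores.max())
    att = att / att.sum()
    # Attention-weighted sum of interactions, projected to a scalar
    return float((att[:, None] * pairs).sum(axis=0) @ p)
```

Setting all attention weights equal would recover plain FM's uniform treatment of interactions; the attention network is what lets the model down-weight uninformative pairs.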

Cite

Text

Xiao et al. "Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/435

Markdown

[Xiao et al. "Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/xiao2017ijcai-attentional/) doi:10.24963/IJCAI.2017/435

BibTeX

@inproceedings{xiao2017ijcai-attentional,
  title     = {{Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks}},
  author    = {Xiao, Jun and Ye, Hao and He, Xiangnan and Zhang, Hanwang and Wu, Fei and Chua, Tat-Seng},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {3119--3125},
  doi       = {10.24963/IJCAI.2017/435},
  url       = {https://mlanthology.org/ijcai/2017/xiao2017ijcai-attentional/}
}