Infinite-Dimensional Feature Interaction

Abstract

Past neural network design has largely focused on the dimension of the feature *representation space* and its capacity scaling (e.g., width, depth), while overlooking the scaling of the feature *interaction space*. Recent advances have shifted focus toward element-wise multiplication, which facilitates a higher-dimensional feature interaction space for better information transformation. Despite this progress, multiplication predominantly captures low-order interactions and thus remains confined to a finite-dimensional interaction space. To transcend this limitation, classic kernel methods emerge as a promising solution for engaging features in an infinite-dimensional space. We introduce InfiNet, a model architecture that enables feature interaction within the infinite-dimensional space created by the RBF kernel. Our experiments show that InfiNet achieves a new state of the art, owing to its ability to leverage infinite-dimensional interactions, significantly enhancing model performance.
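The contrast the abstract draws can be sketched numerically: element-wise multiplication of two feature vectors produces only second-order terms, whereas the RBF kernel, whose Taylor expansion contains interaction terms of every order, corresponds to an inner product in an infinite-dimensional feature space. The snippet below is an illustrative sketch of that distinction only, not the InfiNet layer itself; the function names and the bandwidth `gamma` are assumptions for the example.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2).
    # Expanding the exponential yields interaction terms of every
    # order, i.e., an implicit infinite-dimensional feature map.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def elementwise_interaction(x, y):
    # Multiplicative (gated) designs produce only the fixed,
    # second-order terms x_i * y_i: a finite interaction space.
    return x * y

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
print(elementwise_interaction(x, y))  # [0. 0.]
print(rbf_kernel(x, y))               # exp(-2) ~ 0.1353
```

Here the element-wise product of the two orthogonal features vanishes entirely, while the kernel still registers a graded similarity, since it compares the features through all interaction orders at once.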

Cite

Text

Xu et al. "Infinite-Dimensional Feature Interaction." Neural Information Processing Systems, 2024. doi:10.52202/079017-2933

Markdown

[Xu et al. "Infinite-Dimensional Feature Interaction." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/xu2024neurips-infinitedimensional/) doi:10.52202/079017-2933

BibTeX

@inproceedings{xu2024neurips-infinitedimensional,
  title     = {{Infinite-Dimensional Feature Interaction}},
  author    = {Xu, Chenhui and Yu, Fuxun and Li, Maoliang and Zheng, Zihao and Xu, Zirui and Xiong, Jinjun and Chen, Xiang},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2933},
  url       = {https://mlanthology.org/neurips/2024/xu2024neurips-infinitedimensional/}
}