Cuckoo Feature Hashing: Dynamic Weight Sharing for Sparse Analytics

Abstract

Feature hashing is widely used to process large-scale sparse features for learning predictive models. Collisions inherently occur in the hashing process and hurt model performance. In this paper, we develop a feature hashing scheme called Cuckoo Feature Hashing (CCFH), based on the principle behind Cuckoo hashing, a hashing scheme designed to resolve collisions. By providing multiple possible hash locations for each feature, CCFH prevents collisions between predictive features by dynamically hashing them into alternative locations during model training. Experimental results on prediction tasks with hundreds of millions of features demonstrate that CCFH achieves the same level of performance while using only 15%-25% of the parameters required by conventional feature hashing.
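The paper's training procedure is not reproduced here, but the cuckoo-hashing idea behind it can be sketched in a few lines of Python. In the hypothetical sketch below, each feature gets two candidate buckets derived from two hash seeds; a feature whose first bucket is occupied may displace the occupant, which then moves to its own alternative bucket, so distinct features land in distinct parameter slots whenever the table has room. Names such as `bucket`, `assign_cuckoo`, `NUM_BUCKETS`, and `max_kicks` are illustrative choices, not identifiers from the paper.

```python
import hashlib

NUM_BUCKETS = 8  # deliberately tiny so collisions are easy to see


def bucket(feature: str, seed: int) -> int:
    """Hash a feature string into one of NUM_BUCKETS buckets under a given seed."""
    digest = hashlib.md5(f"{seed}:{feature}".encode()).hexdigest()
    return int(digest, 16) % NUM_BUCKETS


def assign_cuckoo(features, max_kicks=50):
    """Place each feature in one of its two candidate buckets (seeds 0 and 1),
    cuckoo-style: if both candidates are taken, evict an occupant and move it
    to its alternative bucket, repeating up to max_kicks times."""
    table = {}      # bucket index -> feature currently stored there
    leftovers = []  # features whose collisions could not be resolved
    for feat in features:
        current, seed = feat, 0
        for _ in range(max_kicks):
            b = bucket(current, seed)
            if b not in table:
                table[b] = current
                current = None
                break
            # Bucket occupied: store `current` here and displace the occupant.
            table[b], current = current, table[b]
            # The displaced feature must try its *other* candidate bucket.
            seed = 1 if bucket(current, 0) == b else 0
        if current is not None:
            # Ran out of kicks: this feature would have to share a bucket,
            # i.e. a residual collision remains.
            leftovers.append(current)
    return table, leftovers


if __name__ == "__main__":
    feats = ["user=42", "item=7", "country=SG", "device=ios", "hour=23"]
    print("single-hash buckets:", {f: bucket(f, 0) for f in feats})  # may collide
    table, leftovers = assign_cuckoo(feats)
    print("cuckoo assignment:  ", table)       # collision-free if the table has room
    print("unresolved features:", leftovers)
```

The sketch only shows the static hashing mechanics; per the abstract, CCFH performs such relocations dynamically during model training, so that predictive features in particular end up free of collisions.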

Cite

Text

Gao et al. "Cuckoo Feature Hashing: Dynamic Weight Sharing for Sparse Analytics." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/295

Markdown

[Gao et al. "Cuckoo Feature Hashing: Dynamic Weight Sharing for Sparse Analytics." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/gao2018ijcai-cuckoo/) doi:10.24963/IJCAI.2018/295

BibTeX

@inproceedings{gao2018ijcai-cuckoo,
  title     = {{Cuckoo Feature Hashing: Dynamic Weight Sharing for Sparse Analytics}},
  author    = {Gao, Jinyang and Ooi, Beng Chin and Shen, Yanyan and Lee, Wang-Chien},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {2135--2141},
  doi       = {10.24963/IJCAI.2018/295},
  url       = {https://mlanthology.org/ijcai/2018/gao2018ijcai-cuckoo/}
}