End-to-End Differentiable Clustering with Associative Memories
Abstract
Clustering is a widely used unsupervised learning technique involving an intensive discrete optimization problem. Associative Memory models (AMs) are differentiable neural networks defining a recursive dynamical system, and they have been integrated with various deep learning architectures. We uncover a novel connection between the AM dynamics and the inherent discrete assignment necessary in clustering, and use it to propose an unconstrained continuous relaxation of the discrete clustering problem, enabling end-to-end differentiable clustering with AMs, dubbed ClAM. Leveraging the pattern completion ability of AMs, we further develop a novel self-supervised clustering loss. Our evaluations on varied datasets demonstrate that ClAM benefits from the self-supervision, and significantly improves upon both the traditional Lloyd's k-means algorithm and more recent continuous clustering relaxations (by up to 60% in terms of the Silhouette Coefficient).
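The Silhouette Coefficient used for evaluation above is a standard clustering quality metric, not specific to ClAM. As a minimal illustration (a plain-Python sketch, not the paper's code), for each point i it compares the mean distance a(i) to the other points in its own cluster against the smallest mean distance b(i) to any other cluster, via s(i) = (b(i) - a(i)) / max(a(i), b(i)), and averages s(i) over all points:

```python
import math
from collections import defaultdict

def silhouette_score(points, labels):
    """Mean Silhouette Coefficient over all points.

    a(i): mean distance from point i to the other points in its cluster.
    b(i): smallest mean distance from point i to the points of any other cluster.
    s(i) = (b(i) - a(i)) / max(a(i), b(i)); points in singleton clusters get 0.
    """
    clusters = defaultdict(list)          # cluster label -> list of point indices
    for i, label in enumerate(labels):
        clusters[label].append(i)

    scores = []
    for i, label in enumerate(labels):
        own = clusters[label]
        if len(own) == 1:                 # singleton cluster: s(i) defined as 0
            scores.append(0.0)
            continue
        a = sum(math.dist(points[i], points[j]) for j in own if j != i) / (len(own) - 1)
        b = min(sum(math.dist(points[i], points[j]) for j in idx) / len(idx)
                for other, idx in clusters.items() if other != label)
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Two well-separated clusters score close to the maximum of 1.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
lbl = [0, 0, 0, 1, 1, 1]
score = silhouette_score(pts, lbl)
```

Scores range from -1 (points assigned to the wrong cluster) through 0 (overlapping clusters) to 1 (compact, well-separated clusters), which is what makes a 60% relative improvement on this metric meaningful.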
Cite
Text
Saha et al. "End-to-End Differentiable Clustering with Associative Memories." International Conference on Machine Learning, 2023.

Markdown

[Saha et al. "End-to-End Differentiable Clustering with Associative Memories." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/saha2023icml-endtoend/)

BibTeX
@inproceedings{saha2023icml-endtoend,
  title     = {{End-to-End Differentiable Clustering with Associative Memories}},
  author    = {Saha, Bishwajit and Krotov, Dmitry and Zaki, Mohammed J and Ram, Parikshit},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {29649--29670},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/saha2023icml-endtoend/}
}