End-to-End Differentiable Clustering with Associative Memories
Abstract
Clustering is a widely used unsupervised learning technique that involves an intensive discrete optimization problem. Associative Memory models (AMs) are differentiable neural networks that define a recursive dynamical system and have been integrated with various deep learning architectures. We uncover a novel connection between the AM dynamics and the discrete assignment inherent in clustering, and use it to propose a novel unconstrained continuous relaxation of the discrete clustering problem, enabling end-to-end differentiable clustering with AMs, dubbed ClAM. Leveraging the pattern completion ability of AMs, we further develop a novel self-supervised clustering loss. Our evaluations on varied datasets demonstrate that ClAM benefits from the self-supervision, and significantly improves upon both the traditional Lloyd's k-means algorithm and more recent continuous clustering relaxations (by up to 60% in terms of the Silhouette Coefficient).
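To make the core idea concrete, below is a minimal, hypothetical sketch of a continuous relaxation of discrete cluster assignment in the spirit the abstract describes: the hard argmin assignment is replaced by a softmax over negative squared distances, and cluster prototypes are updated by gradient descent on the resulting differentiable objective. This is an illustrative soft-k-means-style relaxation, not the paper's actual ClAM algorithm or its AM attractor dynamics; all names and hyperparameters here are assumptions.

```python
import numpy as np

def soft_clustering_relaxation(X, k, beta=5.0, lr=0.1, steps=200):
    """Illustrative continuous relaxation of discrete clustering.

    Hard assignments argmin_k ||x - mu_k||^2 are relaxed to a softmax
    over negative squared distances (a generic differentiable stand-in
    for the discrete assignment step; NOT the paper's AM dynamics).
    """
    # Farthest-point initialization so prototypes start well separated.
    idx = [0]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - X[idx][None]) ** 2).sum(-1), axis=1)
        idx.append(int(d.argmax()))
    protos = X[idx].copy()

    for _ in range(steps):
        # Squared distances, shape (n_points, k).
        d2 = ((X[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
        # Numerically stable softmax: continuous relaxation of argmin.
        w = np.exp(-beta * (d2 - d2.min(1, keepdims=True)))
        w /= w.sum(1, keepdims=True)
        # Gradient step on the soft quadratic objective, holding the
        # soft assignments fixed within the step (EM-style approximation).
        grad = 2.0 * (w[:, :, None] * (protos[None, :, :] - X[:, None, :])).sum(0) / len(X)
        protos -= lr * grad

    # Hard labels recovered from the final prototypes, for evaluation.
    d2 = ((X[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return protos, d2.argmin(1)
```

On two well-separated Gaussian blobs, the relaxed objective recovers the same partition a discrete assignment would, while remaining differentiable end to end in the prototypes.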
Cite
Text
Saha et al. "End-to-End Differentiable Clustering with Associative Memories." ICML 2023 Workshops: Differentiable_Almost_Everything, 2023.
Markdown
[Saha et al. "End-to-End Differentiable Clustering with Associative Memories." ICML 2023 Workshops: Differentiable_Almost_Everything, 2023.](https://mlanthology.org/icmlw/2023/saha2023icmlw-endtoend/)
BibTeX
@inproceedings{saha2023icmlw-endtoend,
title = {{End-to-End Differentiable Clustering with Associative Memories}},
author = {Saha, Bishwajit and Krotov, Dmitry and Zaki, Mohammed J and Ram, Parikshit},
booktitle = {ICML 2023 Workshops: Differentiable_Almost_Everything},
year = {2023},
url = {https://mlanthology.org/icmlw/2023/saha2023icmlw-endtoend/}
}