Deep Clustering with Associative Memories

Abstract

Deep clustering -- joint representation learning and latent space clustering -- is a well-studied problem, especially in computer vision and text processing under the deep learning framework. While representation learning is generally differentiable, clustering is an inherently discrete optimization, requiring various approximations and regularizations to fit into a standard differentiable pipeline. This leads to somewhat disjointed representation learning and clustering. Recently, Associative Memories were utilized in the end-to-end differentiable $\texttt{ClAM}$ clustering scheme (Saha et al. 2023). In this work, we show how Associative Memories enable a novel take on deep clustering, $\texttt{DClAM}$, simplifying the whole pipeline and tying the representation learning and clustering together more tightly. Our experiments showcase the advantage of $\texttt{DClAM}$, producing improved clustering quality regardless of the architecture choice (convolutional, residual, or fully-connected) or data modality (images or text).
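
To make the abstract's core idea concrete, here is a minimal, hypothetical PyTorch sketch of associative-memory-style differentiable clustering: learnable cluster prototypes act as "memories", and a softmax-based recall step pulls encoded points toward them, so the encoder, decoder, and prototypes train under one reconstruction loss. The names (`ToyDClAM`, `recall`, `beta`) and all architectural details are illustrative assumptions, not the authors' implementation of $\texttt{DClAM}$.

```python
# Hypothetical sketch, NOT the authors' code: cluster prototypes as "memories",
# with a differentiable soft-recall step shared by representation learning
# and clustering.
import torch
import torch.nn as nn

torch.manual_seed(0)

class ToyDClAM(nn.Module):
    """Toy encoder/decoder with k learnable prototype memories (assumed design)."""
    def __init__(self, in_dim=20, latent_dim=2, k=3, beta=5.0):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU(),
                                     nn.Linear(32, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(),
                                     nn.Linear(32, in_dim))
        self.memories = nn.Parameter(torch.randn(k, latent_dim))  # cluster prototypes
        self.beta = beta  # inverse temperature of the soft recall

    def recall(self, z):
        # Differentiable "associative recall": soft assignment over prototypes,
        # then a convex combination that moves z toward its nearest memory.
        dists = torch.cdist(z, self.memories)                 # (B, k)
        weights = torch.softmax(-self.beta * dists, dim=1)    # soft cluster assignment
        return weights @ self.memories, weights

    def forward(self, x):
        z = self.encoder(x)
        z_star, weights = self.recall(z)
        x_hat = self.decoder(z_star)  # decode the recalled (clustered) latent
        return x_hat, weights

model = ToyDClAM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(64, 20)                         # placeholder data
x_hat, w = model(x)
loss = nn.functional.mse_loss(x_hat, x)         # reconstruction through the recalled latent
loss.backward()
opt.step()
print(loss.item(), w.argmax(dim=1)[:8])         # hard labels from the soft assignment
```

In this sketch, hard cluster labels are read off the soft assignment (`argmax`) only at inference time, while training remains fully differentiable end to end.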

Cite

Text

Saha et al. "Deep Clustering with Associative Memories." NeurIPS 2024 Workshops: Compression, 2024.

Markdown

[Saha et al. "Deep Clustering with Associative Memories." NeurIPS 2024 Workshops: Compression, 2024.](https://mlanthology.org/neuripsw/2024/saha2024neuripsw-deep/)

BibTeX

@inproceedings{saha2024neuripsw-deep,
  title     = {{Deep Clustering with Associative Memories}},
  author    = {Saha, Bishwajit and Krotov, Dmitry and Zaki, Mohammed J and Ram, Parikshit},
  booktitle = {NeurIPS 2024 Workshops: Compression},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/saha2024neuripsw-deep/}
}