Consistent Amortized Clustering via Generative Flow Networks
Abstract
Neural models for amortized probabilistic clustering yield samples of cluster labels given a set-structured input, while avoiding lengthy Markov chain runs and the need for explicit data likelihoods. Existing methods that label each data point sequentially, like the Neural Clustering Process, often lead to cluster assignments that are highly dependent on the data order. Alternatively, methods that sequentially create full clusters do not provide assignment probabilities. In this paper, we introduce GFNCP, a novel framework for amortized clustering. GFNCP is formulated as a Generative Flow Network with a shared energy-based parametrization of policy and reward. We show that the flow matching conditions are equivalent to consistency of the clustering posterior under marginalization, which in turn implies order invariance. GFNCP also outperforms existing methods in clustering performance on both synthetic and real-world data.
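As background for the abstract's central claim: the flow matching conditions it refers to are the standard Generative Flow Network balance constraints. A minimal sketch in generic GFlowNet notation (the symbols below are not taken from this paper): for an edge flow $F(s \to s')$ over a directed acyclic graph of states, every interior state must balance inflow against outflow, and each terminal state must absorb its reward.

```latex
% Flow matching at an interior (non-initial, non-terminal) state s':
\sum_{s \,:\, s \to s'} F(s \to s') \;=\; \sum_{s'' \,:\, s' \to s''} F(s' \to s''),
% and at a terminal state x, the inflow equals the reward:
\sum_{s \,:\, s \to x} F(s \to x) \;=\; R(x).
```

A flow satisfying these constraints induces the forward sampling policy $P(s' \mid s) = F(s \to s') \big/ \sum_{s''} F(s \to s'')$, which is the sense in which policy and reward can share one parametrization.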
Cite
Text
Chelly et al. "Consistent Amortized Clustering via Generative Flow Networks." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.
Markdown
[Chelly et al. "Consistent Amortized Clustering via Generative Flow Networks." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.](https://mlanthology.org/aistats/2025/chelly2025aistats-consistent/)
BibTeX
@inproceedings{chelly2025aistats-consistent,
title = {{Consistent Amortized Clustering via Generative Flow Networks}},
author = {Chelly, Irit and Uziel, Roy and Freifeld, Oren and Pakman, Ari},
booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
year = {2025},
pages = {1729--1737},
volume = {258},
url = {https://mlanthology.org/aistats/2025/chelly2025aistats-consistent/}
}