Exploring a Principled Framework for Deep Subspace Clustering
Abstract
Subspace clustering is a classical unsupervised learning task, built on the basic assumption that high-dimensional data can be approximated by a union of subspaces (UoS). Nevertheless, real-world data often deviate from the UoS assumption. To address this challenge, state-of-the-art deep subspace clustering algorithms attempt to jointly learn UoS representations and self-expressive coefficients. However, the general framework of existing algorithms suffers from feature collapse and lacks a theoretical guarantee for learning the desired UoS representation. In this paper, we present a Principled fRamewOrk for Deep Subspace Clustering (PRO-DSC), which is designed to learn structured representations and self-expressive coefficients in a unified manner. Specifically, in PRO-DSC, we incorporate an effective regularization on the learned representations into the self-expressive model, prove that the regularized self-expressive model is able to prevent feature space collapse, and demonstrate that, under certain conditions, the learned optimal representations lie on a union of orthogonal subspaces. Moreover, we provide a scalable and efficient approach to implement PRO-DSC and conduct extensive experiments to verify our theoretical findings and demonstrate the superior performance of the proposed deep subspace clustering approach.
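To make the abstract concrete, the sketch below illustrates what a regularized self-expressive objective of this kind could look like. It is a hypothetical example, not the paper's exact formulation: the self-expressive term is a standard least-squares one, and the anti-collapse regularizer is assumed here to be a log-det (coding-rate style) expansion term on the representations; the function name `pro_dsc_style_loss` and all hyperparameters are illustrative.

```python
import torch


def pro_dsc_style_loss(z, c, gamma=1.0, beta=1.0, eps=0.5):
    """Illustrative loss: least-squares self-expression plus a log-det
    expansion term on the representations to discourage feature collapse.

    z: (n, d) representations produced by the backbone network
    c: (n, n) self-expressive coefficient matrix (diagonal is zeroed)
    """
    n, d = z.shape
    c = c - torch.diag(torch.diagonal(c))            # remove self-connections
    self_expr = torch.linalg.norm(z - c @ z) ** 2    # ||Z - C Z||_F^2
    frob = torch.linalg.norm(c) ** 2                 # ||C||_F^2 regularizer

    # Coding-rate style expansion term: maximizing log det(I + a * Z^T Z)
    # penalizes degenerate (collapsed) feature configurations.
    cov = z.T @ z / n
    expansion = 0.5 * torch.logdet(torch.eye(d, device=z.device) + (d / eps) * cov)

    return frob + gamma * self_expr - beta * expansion
```

In such a setup, the backbone and the coefficient matrix would be optimized jointly, and spectral clustering on the affinity |C| + |C|^T would produce the final partition, following the usual self-expressive clustering pipeline.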
Cite

Text:
Meng et al. "Exploring a Principled Framework for Deep Subspace Clustering." International Conference on Learning Representations, 2025.

Markdown:
[Meng et al. "Exploring a Principled Framework for Deep Subspace Clustering." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/meng2025iclr-exploring/)

BibTeX:
@inproceedings{meng2025iclr-exploring,
title = {{Exploring a Principled Framework for Deep Subspace Clustering}},
author = {Meng, Xianghan and Huang, Zhiyuan and He, Wei and Qi, Xianbiao and Xiao, Rong and Li, Chun-Guang},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/meng2025iclr-exploring/}
}