A Global Geometric Analysis of Maximal Coding Rate Reduction
Abstract
The maximal coding rate reduction (MCR$^2$) objective for learning structured and compact deep representations has been drawing increasing attention, especially after its recent use in deriving fully explainable and highly effective deep network architectures. However, it lacks a complete theoretical justification: only the properties of its global optima are known, and its global landscape has not been studied. In this work, we give a complete characterization of the properties of all its local and global optima, as well as other types of critical points. Specifically, we show that each (local or global) maximizer of the MCR$^2$ problem corresponds to a low-dimensional, discriminative, and diverse representation, and furthermore, each critical point of the objective is either a local maximizer or a strict saddle point. Such a favorable landscape makes MCR$^2$ a natural choice of objective for learning diverse and discriminative representations via first-order optimization. To further verify our theoretical findings, we illustrate these properties with extensive experiments on both synthetic and real data sets.
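For context, the quantity maximized in the MCR$^2$ problem is the coding rate reduction $\Delta R(Z) = R(Z) - R_c(Z \mid \Pi)$: the rate needed to code all features minus the sum of rates needed to code each class separately. The sketch below is an illustrative NumPy implementation of that standard formulation from prior MCR$^2$ work, not code released with this paper; the precision eps, the feature dimension, and the random labels are assumptions chosen purely for the example.

import numpy as np

def coding_rate(Z, eps=0.5):
    # R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z @ Z.T), where Z has shape (d, n)
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + (d / (n * eps ** 2)) * Z @ Z.T)[1]

def coding_rate_reduction(Z, labels, eps=0.5):
    # Delta R(Z) = R(Z) - sum_j (n_j / (2n)) * logdet(I + d/(n_j*eps^2) * Z_j @ Z_j.T)
    d, n = Z.shape
    compressed = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]                     # columns (features) of class c
        nc = Zc.shape[1]
        compressed += (nc / (2.0 * n)) * np.linalg.slogdet(
            np.eye(d) + (d / (nc * eps ** 2)) * Zc @ Zc.T)[1]
    return coding_rate(Z, eps) - compressed

# Toy usage with unit-norm features (the constraint set of the MCR^2 problem);
# dimensions, class count, and eps are illustrative assumptions.
rng = np.random.default_rng(0)
Z = rng.standard_normal((16, 200))
Z /= np.linalg.norm(Z, axis=0, keepdims=True)      # project columns onto the unit sphere
labels = rng.integers(0, 4, size=200)
print(f"Delta R = {coding_rate_reduction(Z, labels):.4f}")

Maximizing this objective over sphere-constrained features encourages the representation to be expansive overall (large $R$) while each class is compressed into a low-dimensional subspace (small $R_c$), which is exactly the structure the paper's landscape analysis characterizes.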
Cite
Text
Wang et al. "A Global Geometric Analysis of Maximal Coding Rate Reduction." International Conference on Machine Learning, 2024.
Markdown
[Wang et al. "A Global Geometric Analysis of Maximal Coding Rate Reduction." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/wang2024icml-global/)
BibTeX
@inproceedings{wang2024icml-global,
  title     = {{A Global Geometric Analysis of Maximal Coding Rate Reduction}},
  author    = {Wang, Peng and Liu, Huikang and Pai, Druv and Yu, Yaodong and Zhu, Zhihui and Qu, Qing and Ma, Yi},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {51012--51040},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/wang2024icml-global/}
}