Exponentially Convergent Algorithms for Supervised Matrix Factorization
Abstract
Supervised matrix factorization (SMF) is a classical machine learning method that simultaneously performs feature extraction and classification, two tasks whose objectives are not necessarily aligned a priori. Our goal is to use SMF to learn low-rank latent factors that offer interpretable, data-reconstructive, and class-discriminative features, addressing challenges posed by high-dimensional data. Training an SMF model involves solving a nonconvex and possibly constrained optimization problem with at least three blocks of parameters. Known algorithms are either heuristic or provide weak convergence guarantees for special cases. In this paper, we provide a novel framework that 'lifts' SMF as a low-rank matrix estimation problem in a combined factor space and propose an efficient algorithm that provably converges exponentially fast to a global minimizer of the objective with arbitrary initialization under mild assumptions. Our framework applies to a wide range of SMF-type problems for multi-class classification with auxiliary features. To showcase an application, we demonstrate that our algorithm successfully identified well-known cancer-associated gene groups for various cancers.
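To make the three-block setup concrete, the sketch below shows one common way SMF-type objectives are written and optimized: a Frobenius reconstruction term coupled with a multinomial logistic loss on the latent codes, updated by plain joint gradient steps over the dictionary `W`, the codes `H`, and the classifier `beta`. All function names and the `xi` weighting are illustrative assumptions for this sketch; it is not the paper's lifted algorithm, which instead reformulates the problem in a combined factor space to obtain the exponential convergence guarantee.

```python
import numpy as np

def smf_objective(X, y_onehot, W, H, beta, xi=1.0):
    """Generic SMF-type objective (illustrative, not the paper's lifted formulation):
    reconstruction loss ||X - W H||_F^2 plus a multinomial logistic classification
    loss on the latent codes H, weighted by xi.
    Shapes: X (d, n), W (d, r), H (r, n), beta (r, c), y_onehot (c, n)."""
    recon = np.linalg.norm(X - W @ H, "fro") ** 2
    logits = beta.T @ H                                  # (c, n)
    logits -= logits.max(axis=0, keepdims=True)          # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)
    clf = -np.mean(np.sum(y_onehot * np.log(probs + 1e-12), axis=0))
    return recon + xi * clf

def smf_gradient_step(X, y_onehot, W, H, beta, lr=1e-3, xi=1.0):
    """One joint gradient step over the three parameter blocks (W, H, beta).
    A plain first-order scheme of the kind that only has weak guarantees for
    this nonconvex problem; shown purely to illustrate the objective."""
    n = X.shape[1]
    R = W @ H - X                                        # reconstruction residual
    logits = beta.T @ H
    logits -= logits.max(axis=0, keepdims=True)
    probs = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)
    E = probs - y_onehot                                 # softmax error, (c, n)
    grad_W = 2 * R @ H.T
    grad_H = 2 * W.T @ R + xi * (beta @ E) / n
    grad_beta = xi * (H @ E.T) / n
    return W - lr * grad_W, H - lr * grad_H, beta - lr * grad_beta
```

The three-block structure (W, H, beta) is exactly what makes the training problem nonconvex and what limits block-wise first-order schemes like the one above to heuristic or case-specific guarantees, which motivates the lifted low-rank estimation viewpoint taken in the paper.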
Cite
Text
Lee et al. "Exponentially Convergent Algorithms for Supervised Matrix Factorization." Neural Information Processing Systems, 2023.
Markdown
[Lee et al. "Exponentially Convergent Algorithms for Supervised Matrix Factorization." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/lee2023neurips-exponentially/)
BibTeX
@inproceedings{lee2023neurips-exponentially,
  title     = {{Exponentially Convergent Algorithms for Supervised Matrix Factorization}},
  author    = {Lee, Joowon and Lyu, Hanbaek and Yao, Weixin},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/lee2023neurips-exponentially/}
}