DNNLasso: Scalable Graph Learning for Matrix-Variate Data
Abstract
We consider the problem of jointly learning the row-wise and column-wise dependencies of matrix-variate observations, which are modelled separately by two precision matrices. Due to the complicated structure of Kronecker-product precision matrices in the commonly used matrix-variate Gaussian graphical models, a sparser Kronecker-sum structure, based on the Cartesian product of graphs, was proposed recently. However, existing methods for estimating Kronecker-sum structured precision matrices do not scale well to large-scale datasets. In this paper, we introduce DNNLasso, a diagonally non-negative graphical lasso model for estimating the Kronecker-sum structured precision matrix, which outperforms state-of-the-art methods by a large margin in both accuracy and computational time.
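For context, the contrast between the two structures can be sketched with a standard identity. The symbols below (Theta for a p-by-p row-wise precision matrix, Psi for a q-by-q column-wise precision matrix) are illustrative notation and may differ from the paper's; which factor attaches to rows versus columns depends on the vectorization convention.

\[
\Theta \oplus \Psi \;=\; \Theta \otimes I_q \;+\; I_p \otimes \Psi .
\]

The Kronecker sum has at most p^2 q + p q^2 nonzero entries and its support corresponds to the Cartesian product of the row graph and the column graph, whereas the Kronecker product Theta ⊗ Psi can have up to p^2 q^2 nonzeros, which is the structural reason the Kronecker-sum model is the sparser of the two.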
Cite
Text
Lin and Zhang. "DNNLasso: Scalable Graph Learning for Matrix-Variate Data." Artificial Intelligence and Statistics, 2024.
Markdown
[Lin and Zhang. "DNNLasso: Scalable Graph Learning for Matrix-Variate Data." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/lin2024aistats-dnnlasso/)
BibTeX
@inproceedings{lin2024aistats-dnnlasso,
title = {{DNNLasso: Scalable Graph Learning for Matrix-Variate Data}},
author = {Lin, Meixia and Zhang, Yangjing},
booktitle = {Artificial Intelligence and Statistics},
year = {2024},
pages = {316--324},
volume = {238},
url = {https://mlanthology.org/aistats/2024/lin2024aistats-dnnlasso/}
}