Structured Graph Learning via Laplacian Spectral Constraints

Abstract

Learning a graph with a specific structure is essential for interpretability and for identifying the relationships among data. However, structured graph learning from observed samples is an NP-hard combinatorial problem. In this paper, we first show that, for a set of important graph families, it is possible to convert the combinatorial structure constraints into eigenvalue constraints on the graph Laplacian matrix. We then introduce a unified graph learning framework that integrates the spectral properties of the Laplacian matrix with Gaussian graphical modeling and is capable of learning the structures of a large class of graph families. The proposed algorithms are provably convergent and practically amenable to big-data tasks. Extensive numerical experiments with both synthetic and real datasets demonstrate the effectiveness of the proposed methods. An R package containing code for all the experimental results is submitted as a supplementary file.
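For intuition on how combinatorial structure can be traded for spectral constraints (a sketch based on standard spectral graph theory; the notation $\mathcal{S}_{\mathcal{L}}$, $c_1$, $c_2$ and the exact formulation below are illustrative rather than taken verbatim from the paper): an undirected graph has exactly $k$ connected components if and only if the zero eigenvalue of its Laplacian has multiplicity $k$. Enforcing a $k$-component structure within Laplacian-constrained Gaussian graphical modeling therefore amounts to an eigenvalue-constrained maximum-likelihood problem of the form

$$
\begin{aligned}
\underset{\Theta}{\text{maximize}} \quad & \log \operatorname{gdet}(\Theta) - \operatorname{tr}(S\Theta) \\
\text{subject to} \quad & \Theta \in \mathcal{S}_{\mathcal{L}} \ \text{(the set of valid graph Laplacian matrices)}, \\
& \lambda_1(\Theta) = \cdots = \lambda_k(\Theta) = 0, \quad c_1 \le \lambda_{k+1}(\Theta) \le \cdots \le \lambda_p(\Theta) \le c_2,
\end{aligned}
$$

where $S$ is the sample covariance matrix, $\operatorname{gdet}$ denotes the generalized determinant (the product of the nonzero eigenvalues), and $c_1, c_2$ are illustrative bounds on the nonzero spectrum. Other graph families (e.g., bipartite or multi-component structures) correspond to different spectral constraint sets in the same template.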

Cite

Text

Kumar et al. "Structured Graph Learning via Laplacian Spectral Constraints." Neural Information Processing Systems, 2019.

Markdown

[Kumar et al. "Structured Graph Learning via Laplacian Spectral Constraints." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/kumar2019neurips-structured/)

BibTeX

@inproceedings{kumar2019neurips-structured,
  title     = {{Structured Graph Learning via Laplacian Spectral Constraints}},
  author    = {Kumar, Sandeep and Ying, Jiaxi and de Miranda Cardoso, Jose Vinicius and Palomar, Daniel},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {11651--11663},
  url       = {https://mlanthology.org/neurips/2019/kumar2019neurips-structured/}
}