Global Optimality in Bivariate Gradient-Based DAG Learning

Abstract

Recently, a new class of non-convex optimization problems motivated by the statistical problem of learning an acyclic directed graphical model from data has attracted significant interest. While existing work uses standard first-order optimization schemes to solve this problem, establishing the global optimality of such approaches has proven elusive. The difficulty lies in the fact that, unlike other non-convex problems in the literature, this problem is not "benign": it possesses multiple spurious solutions in which standard approaches can easily get trapped. In this paper, we prove that a simple path-following optimization scheme globally converges to the global minimum of the population loss in the bivariate setting.
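To make the abstract's final claim concrete, below is a minimal sketch of a path-following (homotopy) scheme in the bivariate setting: gradient descent on a weighted combination of a score and an acyclicity penalty, with the weight on the score decreased along a schedule and each solve warm-started at the previous solution. The least-squares score, the penalty h(a, b) = (a*b)^2, the schedule, and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Bivariate setup: variables x1, x2 with candidate edge weights
# a (x1 -> x2) and b (x2 -> x1). ASSUMPTIONS: the score, penalty,
# and schedule below are illustrative stand-ins for the paper's.

def score(theta, cov):
    """Population least-squares score: residual variance of x2 given x1
    (weight a) plus residual variance of x1 given x2 (weight b)."""
    a, b = theta
    s11, s12, s22 = cov  # entries of the population covariance E[x x^T]
    r2 = s22 - 2 * a * s12 + a**2 * s11
    r1 = s11 - 2 * b * s12 + b**2 * s22
    return 0.5 * (r1 + r2)

def score_grad(theta, cov):
    a, b = theta
    s11, s12, s22 = cov
    return np.array([a * s11 - s12, b * s22 - s12])

def h_grad(theta):
    """Gradient of h(a, b) = (a * b)**2, which is zero exactly when the
    two candidate edges do not form a 2-cycle (the graph is acyclic)."""
    a, b = theta
    return np.array([2 * a * b**2, 2 * a**2 * b])

def path_following(cov, mus, theta0, lr=1e-2, inner=5000):
    """Approximately minimize mu * score + h for a decreasing sequence
    of mu values, warm-starting each gradient-descent solve at the
    previous solution so the iterates track one solution path as the
    acyclicity penalty comes to dominate."""
    theta = np.array(theta0, dtype=float)
    for mu in mus:
        for _ in range(inner):
            theta -= lr * (mu * score_grad(theta, cov) + h_grad(theta))
    return theta

# Toy population generated by x2 = 0.8 * x1 + noise (unit variances),
# so s11 = 1, s12 = 0.8, s22 = 0.8**2 + 1.
cov = (1.0, 0.8, 0.8**2 + 1.0)
mus = np.geomspace(1.0, 1e-4, num=20)  # geometric homotopy schedule
theta_hat = path_following(cov, mus, theta0=(0.5, 0.5))
print(theta_hat, score(theta_hat, cov))
# Expected to approach roughly (0.8, 0): the acyclic model with only
# the edge x1 -> x2, the global minimizer of the population loss.
```

The warm start is the key design choice in this sketch: running each inner solve from the previous solution keeps the iterates on a single branch of stationary points as mu shrinks, rather than letting plain gradient descent fall into one of the spurious solutions the abstract mentions.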

Cite

Text

Deng et al. "Global Optimality in Bivariate Gradient-Based DAG Learning." ICML 2023 Workshops: SODS, 2023.

Markdown

[Deng et al. "Global Optimality in Bivariate Gradient-Based DAG Learning." ICML 2023 Workshops: SODS, 2023.](https://mlanthology.org/icmlw/2023/deng2023icmlw-global/)

BibTeX

@inproceedings{deng2023icmlw-global,
  title     = {{Global Optimality in Bivariate Gradient-Based DAG Learning}},
  author    = {Deng, Chang and Bello, Kevin and Ravikumar, Pradeep Kumar and Aragam, Bryon},
  booktitle = {ICML 2023 Workshops: SODS},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/deng2023icmlw-global/}
}