High-Dimensional Graphical Model Selection Using $\ell_1$-Regularized Logistic Regression
Abstract
We focus on the problem of estimating the graph structure associated with a discrete Markov random field. We describe a method based on $\ell_1$-regularized logistic regression, in which the neighborhood of any given node is estimated by performing logistic regression subject to an $\ell_1$-constraint. Our framework applies to the high-dimensional setting, in which both the number of nodes $p$ and maximum neighborhood sizes $d$ are allowed to grow as a function of the number of observations $n$. Our main result is to establish sufficient conditions on the triple $(n, p, d)$ for the method to succeed in consistently estimating the neighborhood of every node in the graph simultaneously. Under certain mutual incoherence conditions analogous to those imposed in previous work on linear regression, we prove that consistent neighborhood selection can be obtained as long as the number of observations $n$ grows more quickly than $6 d^6 \log d + 2 d^5 \log p$, thereby establishing that logarithmic growth in the number of samples $n$ relative to graph size $p$ is sufficient to achieve neighborhood consistency.
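The per-node estimation step described above is easy to prototype. The following is a minimal sketch (not the authors' implementation), assuming scikit-learn's $\ell_1$-penalized `LogisticRegression`; the function name `estimate_neighborhoods`, the inverse-regularization parameter `C`, and the threshold `tol` are illustrative assumptions, whereas the paper scales its regularization weight with $n$ and $p$.

```python
# Sketch of neighborhood selection for a binary Markov random field:
# regress each node on all other nodes with an l1 penalty and read the
# estimated neighborhood off the support of the coefficient vector.
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_neighborhoods(X, C=1.0, tol=1e-6):
    """X: (n, p) array of binary node observations (e.g. values in {-1, +1}).
    Returns a dict mapping each node index s to its estimated neighbor set."""
    n, p = X.shape
    neighbors = {}
    for s in range(p):
        y = X[:, s]                      # response: node s
        Z = np.delete(X, s, axis=1)      # predictors: all remaining nodes
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
        clf.fit(Z, y)
        coef = clf.coef_.ravel()
        others = [t for t in range(p) if t != s]
        neighbors[s] = {others[j] for j in np.flatnonzero(np.abs(coef) > tol)}
    return neighbors

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder data for illustration only; not a sample from a real MRF.
    X = rng.choice([-1, 1], size=(200, 10))
    print(estimate_neighborhoods(X))
```

A graph estimate can then be assembled from the per-node neighborhoods, for instance by an AND rule (keep edge $(s, t)$ only if each node appears in the other's estimated neighborhood) or an OR rule (keep it if either does).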
Cite
Text
Wainwright et al. "High-Dimensional Graphical Model Selection Using $\ell_1$-Regularized Logistic Regression." Neural Information Processing Systems, 2006.
Markdown
[Wainwright et al. "High-Dimensional Graphical Model Selection Using $\ell_1$-Regularized Logistic Regression." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/wainwright2006neurips-highdimensional/)
BibTeX
@inproceedings{wainwright2006neurips-highdimensional,
title = {{High-Dimensional Graphical Model Selection Using $\ell_1$-Regularized Logistic Regression}},
author = {Wainwright, Martin J. and Lafferty, John D. and Ravikumar, Pradeep K.},
booktitle = {Neural Information Processing Systems},
year = {2006},
pages = {1465-1472},
url = {https://mlanthology.org/neurips/2006/wainwright2006neurips-highdimensional/}
}