Convex Structure Learning in Log-Linear Models: Beyond Pairwise Potentials
Abstract
Previous work has examined structure learning in log-linear models with $\ell_1$-regularization, largely focusing on the case of pairwise potentials. In this work we consider the case of models with potentials of arbitrary order, but that satisfy a hierarchical constraint. We enforce the hierarchical constraint using group $\ell_1$-regularization with overlapping groups, and an active set method that enforces hierarchical inclusion allows us to tractably consider the exponential number of higher-order potentials. We use a spectral projected gradient method as a sub-routine for solving the overlapping group $\ell_1$-regularization problem, and make use of a sparse version of Dykstra’s algorithm to compute the projection. Our experiments indicate that this model gives equal or better test set likelihood compared to previous models.
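The abstract mentions using Dykstra's algorithm as a sub-routine for computing the projection needed by the spectral projected gradient method. Below is a minimal, generic sketch of Dykstra's alternating projection onto an intersection of convex sets, illustrated with two ℓ2 balls rather than the paper's sparse variant or its specific overlapping-group constraint sets; the function names (`dykstra`, `project_l2_ball`) and the example sets are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def project_l2_ball(x, center, radius):
    """Euclidean projection of x onto the ball ||x - center|| <= radius."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def dykstra(v, projections, max_iter=1000, tol=1e-8):
    """Dykstra's alternating projection: projects v onto the intersection of
    convex sets, each specified by its Euclidean projection operator.
    (Generic illustration; the paper uses a sparse variant for its
    overlapping group-l1 constraint sets.)"""
    x = v.copy()
    # One correction (increment) vector per set.
    increments = [np.zeros_like(v) for _ in projections]
    for _ in range(max_iter):
        x_prev = x.copy()
        for i, proj in enumerate(projections):
            y = proj(x + increments[i])
            increments[i] = x + increments[i] - y
            x = y
        if np.linalg.norm(x - x_prev) < tol:
            break
    return x

# Example: project a point onto the intersection of two l2 balls.
v = np.array([3.0, 2.0])
sets = [
    lambda z: project_l2_ball(z, np.array([0.0, 0.0]), 1.5),
    lambda z: project_l2_ball(z, np.array([1.0, 0.0]), 1.0),
]
print(dykstra(v, sets))
```

Unlike plain alternating projections, the per-set correction vectors ensure convergence to the true Euclidean projection onto the intersection, which is what the spectral projected gradient step requires.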
Cite
Text
Schmidt and Murphy. "Convex Structure Learning in Log-Linear Models: Beyond Pairwise Potentials." Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.

Markdown

[Schmidt and Murphy. "Convex Structure Learning in Log-Linear Models: Beyond Pairwise Potentials." Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.](https://mlanthology.org/aistats/2010/schmidt2010aistats-convex/)

BibTeX
@inproceedings{schmidt2010aistats-convex,
title = {{Convex Structure Learning in Log-Linear Models: Beyond Pairwise Potentials}},
author = {Schmidt, Mark and Murphy, Kevin},
booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
year = {2010},
pages = {709--716},
volume = {9},
url = {https://mlanthology.org/aistats/2010/schmidt2010aistats-convex/}
}