Exploiting Feature Covariance in High-Dimensional Online Learning
Abstract
Some online algorithms for linear classification model the uncertainty in their weights over the course of learning. Modeling the full covariance structure of the weights can provide a significant advantage for classification. However, for high-dimensional, large-scale data, even though there may be many second-order feature interactions, it is computationally infeasible to maintain this covariance structure. To extend second-order methods to high-dimensional data, we develop low-rank approximations of the covariance structure. We evaluate our approach on both synthetic and real-world data sets using the confidence-weighted online learning framework. We show improvements over diagonal covariance matrices for both low and high-dimensional data.
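To make the idea concrete, the sketch below implements an AROW-style confidence-weighted update with a full covariance matrix over the weights. This is a related member of the confidence-weighted family, not the paper's low-rank algorithm: the paper's contribution is precisely to replace the dense `Sigma` maintained here with a low-rank approximation so the update remains tractable in high dimensions.

```python
import numpy as np

def cw_full_covariance_train(X, Y, r=1.0):
    """One pass of an AROW-style confidence-weighted update.

    Maintains a Gaussian over weights: mean mu and full covariance Sigma.
    Storing and updating Sigma costs O(d^2) per example, which is what
    motivates the low-rank approximations studied in the paper.
    """
    d = X.shape[1]
    mu = np.zeros(d)       # mean weight vector
    Sigma = np.eye(d)      # full covariance over the weights
    for x, y in zip(X, Y):
        m = mu @ x                 # signed margin under the mean weights
        v = x @ Sigma @ x          # confidence: variance of the margin
        if y * m < 1.0:            # update only on a hinge-loss violation
            beta = 1.0 / (v + r)
            alpha = max(0.0, 1.0 - y * m) * beta
            Sigma_x = Sigma @ x
            mu += alpha * y * Sigma_x                    # move the mean
            Sigma -= beta * np.outer(Sigma_x, Sigma_x)   # shrink variance
    return mu, Sigma

# Toy usage on a linearly separable problem (illustrative data):
X = np.array([[2.0, 0.0], [-2.0, 0.0], [1.5, 0.5], [-1.5, -0.5]])
Y = np.array([1, -1, 1, -1])
mu, Sigma = cw_full_covariance_train(X, Y)
```

After one pass, `sign(mu @ x)` separates the toy examples, and the diagonal of `Sigma` has shrunk most along directions the data actually exercised, reflecting increased confidence there.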
Cite
Text
Ma et al. "Exploiting Feature Covariance in High-Dimensional Online Learning." Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.
Markdown
[Ma et al. "Exploiting Feature Covariance in High-Dimensional Online Learning." Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.](https://mlanthology.org/aistats/2010/ma2010aistats-exploiting/)
BibTeX
@inproceedings{ma2010aistats-exploiting,
title = {{Exploiting Feature Covariance in High-Dimensional Online Learning}},
author = {Ma, Justin and Kulesza, Alex and Dredze, Mark and Crammer, Koby and Saul, Lawrence and Pereira, Fernando},
booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
year = {2010},
pages = {493--500},
volume = {9},
url = {https://mlanthology.org/aistats/2010/ma2010aistats-exploiting/}
}