Hedging Structured Concepts
Abstract
We develop an online algorithm called Component Hedge for learning structured concept classes when the loss of a structured concept sums over its components. Example classes include paths through a graph (composed of edges) and partial permutations (composed of assignments). The algorithm maintains a parameter vector with one non-negative weight per component, which always lies in the convex hull of the structured concept class. The algorithm predicts by decomposing the current parameter vector into a convex combination of concepts and choosing one of those concepts at random. The parameters are updated by first performing a multiplicative update and then projecting back into the convex hull. We show that Component Hedge has optimal regret bounds for a large variety of structured concept classes.
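The update described in the abstract can be sketched concretely for one simple structured class: k-sets (choose k of n components), whose convex hull is {v : sum(v) = k, 0 <= v_i <= 1}. The sketch below is illustrative, not the paper's code: the fixed-point loop for the relative-entropy projection and the "systematic sampling" decomposition of a hull point into k-sets are standard techniques assumed here, and all function names are made up.

```python
import numpy as np

def project_ksets(w, k, iters=100):
    """Relative-entropy projection of w > 0 onto the k-set polytope
    {v : sum(v) = k, 0 <= v_i <= 1}. The projection scales uncapped
    coordinates by a common factor and caps the rest at 1; a simple
    fixed-point iteration discovers the capped set."""
    v = np.asarray(w, dtype=float).copy()
    capped = np.zeros(len(v), dtype=bool)
    for _ in range(iters):
        free = ~capped
        # rescale the free coordinates so total mass equals k
        v[free] *= (k - capped.sum()) / v[free].sum()
        newly = free & (v > 1.0)
        if not newly.any():
            break
        v[newly] = 1.0
        capped |= newly
    return v

def decompose(v, k):
    """Write a point v of the k-set polytope as a convex combination of
    at most n indicator sets: lay the v_i end to end on [0, k); for an
    offset u in [0, 1), the points u, u+1, ..., u+k-1 land in k distinct
    intervals (each has length <= 1) and so pick out a k-set. The
    breakpoints in u give the explicit mixture."""
    cum = np.cumsum(v)
    bps = np.unique(np.concatenate(([0.0, 1.0], np.mod(cum, 1.0))))
    mixture = []
    for a, b in zip(bps[:-1], bps[1:]):
        pts = (a + b) / 2.0 + np.arange(k)
        subset = np.searchsorted(cum, pts, side='right')
        subset = np.minimum(subset, len(v) - 1)  # guard float round-off
        mixture.append((b - a, subset))
    return mixture

# One round of Component Hedge on k-sets (eta and the losses are arbitrary).
rng = np.random.default_rng(0)
n, k, eta = 6, 3, 0.5
w = np.full(n, k / n)                # start at the uniform usage vector
loss = rng.uniform(size=n)           # adversary reveals per-component losses
w = w * np.exp(-eta * loss)          # multiplicative (Hedge-style) update
w = project_ksets(w, k)              # project back into the convex hull
mixture = decompose(w, k)            # convex combination of k-sets
coeffs = np.array([c for c, _ in mixture])
# predict: draw one k-set with probability equal to its coefficient
chosen = mixture[rng.choice(len(mixture), p=coeffs / coeffs.sum())][1]
```

The mixture returned by `decompose` averages back to the projected weight vector exactly, which is what makes the per-component expected loss of the random prediction equal the loss of the parameter vector.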
Cite
Koolen et al. "Hedging Structured Concepts." Annual Conference on Computational Learning Theory, 2010.
@inproceedings{koolen2010colt-hedging,
title = {{Hedging Structured Concepts}},
author = {Koolen, Wouter M. and Warmuth, Manfred K. and Kivinen, Jyrki},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2010},
pages = {93--105},
url = {https://mlanthology.org/colt/2010/koolen2010colt-hedging/}
}