Path Kernels and Multiplicative Updates
Abstract
We consider a natural convolution kernel defined by a directed graph: each edge contributes an input, the inputs along a path form a product, and the products over all paths are summed. We also place probabilities on the edges so that the outflow from each node is one. We then discuss multiplicative updates on these graphs, where the prediction is essentially a kernel computation and the update contributes a factor to each edge. After the update, the total outflow out of each node is no longer one; however, certain algorithms re-normalize the weights on the paths so that the total outflow out of each node is one again. Finally, we discuss the use of regular expressions for speeding up the kernel and re-normalization computations. In particular, we rewrite the multiplicative algorithms that predict as well as the best pruning of a series-parallel graph in terms of efficient kernel computations.
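The sum-over-paths kernel described above can be computed without enumerating paths: on a DAG, a single dynamic-programming pass accumulates, for each node, the sum over all paths from the source of the product of edge weights. The sketch below is an illustration of this structure under assumed conventions (an edge list in topological order, named `path_kernel`), not the paper's algorithm verbatim.

```python
from collections import defaultdict

def path_kernel(edges, source, sink):
    """Sum over all source->sink paths of the product of edge weights.

    edges: list of (u, v, w) triples, assumed sorted so that each edge
    appears after every edge entering u (i.e., in topological order).
    Hypothetical helper illustrating the sum-of-products kernel.
    """
    # F[v] = sum over all paths source->v of the product of edge weights
    F = defaultdict(float)
    F[source] = 1.0
    for u, v, w in edges:
        F[v] += F[u] * w
    return F[sink]

# Two parallel paths, s->a->t and s->b->t:
edges = [("s", "a", 0.5), ("s", "b", 0.5), ("a", "t", 2.0), ("b", "t", 4.0)]
print(path_kernel(edges, "s", "t"))  # 0.5*2.0 + 0.5*4.0 = 3.0
```

The same forward pass underlies the re-normalization step: once each node's total outgoing path weight is known, edge weights can be rescaled so the outflow at every node sums to one again.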
Cite
Text
Takimoto and Warmuth. "Path Kernels and Multiplicative Updates." Annual Conference on Computational Learning Theory, 2002. doi:10.1007/3-540-45435-7_6
Markdown
[Takimoto and Warmuth. "Path Kernels and Multiplicative Updates." Annual Conference on Computational Learning Theory, 2002.](https://mlanthology.org/colt/2002/takimoto2002colt-path/) doi:10.1007/3-540-45435-7_6
BibTeX
@inproceedings{takimoto2002colt-path,
title = {{Path Kernels and Multiplicative Updates}},
author = {Takimoto, Eiji and Warmuth, Manfred K.},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2002},
pages = {74--89},
doi = {10.1007/3-540-45435-7_6},
url = {https://mlanthology.org/colt/2002/takimoto2002colt-path/}
}