Efficient Convolution Kernels for Dependency and Constituent Syntactic Trees
Abstract
In this paper, we provide a study on the use of tree kernels to encode syntactic parsing information in natural language learning. In particular, we propose a new convolution kernel, namely the Partial Tree (PT) kernel, to fully exploit dependency trees. We also propose an efficient algorithm for its computation, which is furthermore sped up by selecting only the tree node pairs with a non-null kernel. The experiments with Support Vector Machines on the tasks of semantic role labeling and question classification show that (a) the kernel running time is linear in the average case and (b) the PT kernel improves on the other tree kernels when applied to the appropriate parsing paradigm.
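As background for the abstract's mention of convolution tree kernels and the non-null pair selection speed-up, the sketch below shows a minimal version of the classic subset-tree convolution kernel (Collins & Duffy), which the PT kernel generalizes. The tree representation, the decay parameter `lam`, and all node labels are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of a convolution tree kernel (classic subset-tree style),
# illustrating the recursion that the PT kernel generalizes to partial trees.

class Node:
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

def collect(n):
    # Return all nodes of the tree rooted at n.
    out = [n]
    for c in n.children:
        out.extend(collect(c))
    return out

def delta(n1, n2, lam=0.4):
    # Weighted count of common tree fragments rooted at n1 and n2;
    # lam is a decay factor that down-weights large fragments.
    if n1.label != n2.label:
        return 0.0
    if not n1.children and not n2.children:  # matching leaves
        return lam
    if [c.label for c in n1.children] != [c.label for c in n2.children]:
        return 0.0  # productions differ
    prod = lam
    for c1, c2 in zip(n1.children, n2.children):
        prod *= 1.0 + delta(c1, c2, lam)
    return prod

def tree_kernel(t1, t2, lam=0.4):
    # Sum delta over node pairs; indexing nodes by label so that only
    # pairs with a non-null delta are evaluated mirrors the node-pair
    # selection speed-up the abstract refers to.
    by_label = {}
    for n in collect(t2):
        by_label.setdefault(n.label, []).append(n)
    return sum(delta(n1, n2, lam)
               for n1 in collect(t1)
               for n2 in by_label.get(n1.label, []))
```

For example, `tree_kernel(t, t)` on a small parse tree counts all shared fragments of the tree with itself, weighted by `lam`; grouping candidate nodes by label keeps the pairwise loop close to linear on typical parse trees.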
Cite
Text
Moschitti. "Efficient Convolution Kernels for Dependency and Constituent Syntactic Trees." European Conference on Machine Learning, 2006. doi:10.1007/11871842_32
Markdown
[Moschitti. "Efficient Convolution Kernels for Dependency and Constituent Syntactic Trees." European Conference on Machine Learning, 2006.](https://mlanthology.org/ecmlpkdd/2006/moschitti2006ecml-efficient/) doi:10.1007/11871842_32
BibTeX
@inproceedings{moschitti2006ecml-efficient,
title = {{Efficient Convolution Kernels for Dependency and Constituent Syntactic Trees}},
author = {Moschitti, Alessandro},
booktitle = {European Conference on Machine Learning},
year = {2006},
pages = {318-329},
doi = {10.1007/11871842_32},
url = {https://mlanthology.org/ecmlpkdd/2006/moschitti2006ecml-efficient/}
}