A Simple and Effective Method for Incorporating Advice into Kernel Methods
Abstract
We propose a simple mechanism for incorporating advice (prior knowledge), in the form of simple rules, into support-vector methods for both classification and regression. Our approach is based on introducing inequality constraints associated with datapoints that match the advice. These constrained datapoints can be standard examples in the training set, but can also be unlabeled data in a semi-supervised, advice-taking approach. Our new approach is simpler to implement and more efficiently solved than the knowledge-based support vector classification methods of Fung, Mangasarian, and Shavlik (2002; 2003) and the knowledge-based support vector regression method of Mangasarian, Shavlik, and Wild (2004), while performing approximately as well as these more complex approaches. Experiments using our new approach on a synthetic task and a reinforcement learning problem within the RoboCup soccer simulator show that our advice-taking method can significantly outperform a method without advice and perform similarly to prior advice-taking, support-vector machines.
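As a rough illustration of the mechanism the abstract describes, the sketch below trains a linear classifier by subgradient descent on a hinge loss, adding an extra hinge penalty that pushes datapoints matching an advice rule toward the advised label. The synthetic data, the hypothetical rule ("if x1 >= 1.5 then label = +1"), and the parameter names (`lam`, `mu`, `lr`) are all illustrative assumptions; the paper itself solves a constrained support-vector program rather than this gradient-based surrogate.

```python
import numpy as np

# Toy 2-D data: one positive and one negative cluster (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal( 2.0, 0.5, (20, 2)),
               rng.normal(-2.0, 0.5, (20, 2))])
y = np.hstack([np.ones(20), -np.ones(20)])

# Hypothetical advice rule: "if x1 >= 1.5 then label = +1".
# The constrained datapoints are simply those matching the rule's antecedent.
A = X[X[:, 0] >= 1.5]
z = 1.0                          # label the advice asserts for those points

w, b = np.zeros(2), 0.0
lam, mu, lr, n = 0.01, 1.0, 0.05, len(y)   # regularizer, advice weight, step size
for _ in range(1000):
    viol = y * (X @ w + b) < 1   # margin violations on labeled examples
    av = z * (A @ w + b) < 1     # violations of the (soft) advice constraints
    # Subgradient of: 0.5*lam*||w||^2 + mean hinge loss + mu * mean advice hinge
    gw = (lam * w
          - (y[viol, None] * X[viol]).sum(0) / n
          - mu * (z * A[av]).sum(0) / max(len(A), 1))
    gb = -y[viol].sum() / n - mu * z * av.sum() / max(len(A), 1)
    w = w - lr * gw
    b = b - lr * gb

acc = (np.sign(X @ w + b) == y).mean()
```

Because the advice term only needs the rule's antecedent to select points, the same penalty applies unchanged to unlabeled datapoints, which mirrors the semi-supervised use mentioned in the abstract.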
Cite

Text:
Maclin et al. "A Simple and Effective Method for Incorporating Advice into Kernel Methods." AAAI Conference on Artificial Intelligence, 2006.

Markdown:
[Maclin et al. "A Simple and Effective Method for Incorporating Advice into Kernel Methods." AAAI Conference on Artificial Intelligence, 2006.](https://mlanthology.org/aaai/2006/maclin2006aaai-simple/)

BibTeX:
@inproceedings{maclin2006aaai-simple,
  title     = {{A Simple and Effective Method for Incorporating Advice into Kernel Methods}},
  author    = {Maclin, Richard and Shavlik, Jude W. and Walker, Trevor and Torrey, Lisa},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2006},
  pages     = {427--432},
  url       = {https://mlanthology.org/aaai/2006/maclin2006aaai-simple/}
}