Learning from Measurements in Exponential Families
Abstract
Given a model family and a set of unlabeled examples, one could either label specific examples or state general constraints---both provide information about the desired model. In general, what is the most cost-effective way to learn? To address this question, we introduce measurements, a general class of mechanisms for providing information about a target model. We present a Bayesian decision-theoretic framework, which allows us to both integrate diverse measurements and choose new measurements to make. We use a variational inference algorithm, which exploits exponential family duality. The merits of our approach are demonstrated on two sequence labeling tasks.
Cite
Text
Liang et al. "Learning from Measurements in Exponential Families." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553457
Markdown
[Liang et al. "Learning from Measurements in Exponential Families." International Conference on Machine Learning, 2009.](https://mlanthology.org/icml/2009/liang2009icml-learning/) doi:10.1145/1553374.1553457
BibTeX
@inproceedings{liang2009icml-learning,
title = {{Learning from Measurements in Exponential Families}},
author = {Liang, Percy and Jordan, Michael I. and Klein, Dan},
booktitle = {International Conference on Machine Learning},
year = {2009},
pages = {641--648},
doi = {10.1145/1553374.1553457},
url = {https://mlanthology.org/icml/2009/liang2009icml-learning/}
}