Cross-Domain Multitask Learning with Latent Probit Models

Abstract

Learning multiple tasks across heterogeneous domains is a challenging problem, since the feature spaces of the tasks may differ. We assume the data in the multiple tasks are generated from a latent common domain via sparse domain transforms, and propose a latent probit model (LPM) to jointly learn the domain transforms and a probit classifier shared in the common domain. To learn meaningful task relatedness and avoid overfitting in classification, we introduce sparsity in the domain-transform matrices as well as in the common classifier parameters. We derive theoretical bounds on the estimation error of the classifier parameters in terms of the sparsity of the domain-transform matrices. An expectation-maximization (EM) algorithm is derived for learning the LPM. The effectiveness of the approach is demonstrated on several real datasets.
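The generative assumption in the abstract can be illustrated with a minimal sketch: each task's features arise from a shared latent domain through a task-specific sparse transform, while labels follow a single probit classifier in that latent domain. All dimensions, sparsity levels, and noise scales below are hypothetical choices for illustration, not values from the paper.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Hypothetical setup: latent common domain of dimension d,
# two tasks with differing feature dimensions (assumed values).
d = 5
feat_dims = [8, 6]     # per-task observed feature dimensions
n_samples = [100, 120]  # per-task sample counts

# Probit classifier shared in the common latent domain.
w = rng.normal(size=d)

def probit(t):
    """Standard normal CDF, Phi(t) = P(y = 1 | latent score t)."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

for pk, nk in zip(feat_dims, n_samples):
    # Sparse domain transform for task k: latent domain -> task feature space.
    # Sparsity here is simulated by zeroing ~70% of entries (assumed level).
    A = rng.normal(size=(pk, d)) * (rng.random((pk, d)) < 0.3)
    Z = rng.normal(size=(nk, d))                     # latent representations
    X = Z @ A.T + 0.1 * rng.normal(size=(nk, pk))    # observed task features
    # Labels from the shared probit model on the latent representation.
    probs = np.array([probit(t) for t in Z @ w])
    y = (rng.random(nk) < probs).astype(int)
```

In this sketch the latent representations `Z` are drawn directly; in the LPM they are unobserved, which is why the paper learns the transforms and classifier jointly via EM rather than by direct regression.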

Cite

Text

Han et al. "Cross-Domain Multitask Learning with Latent Probit Models." International Conference on Machine Learning, 2012.

Markdown

[Han et al. "Cross-Domain Multitask Learning with Latent Probit Models." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/han2012icml-cross/)

BibTeX

@inproceedings{han2012icml-cross,
  title     = {{Cross-Domain Multitask Learning with Latent Probit Models}},
  author    = {Han, Shaobo and Liao, Xuejun and Carin, Lawrence},
  booktitle = {International Conference on Machine Learning},
  year      = {2012},
  url       = {https://mlanthology.org/icml/2012/han2012icml-cross/}
}