Unsupervised Domain Adaptation with Feature Embeddings

Abstract

Representation learning is the dominant technique for unsupervised domain adaptation, but existing approaches often require the specification of "pivot features" that generalize across domains, which are selected by task-specific heuristics. We show that a novel but simple feature embedding approach provides better performance, by exploiting the feature template structure common in NLP problems.
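.
The abstract only summarizes the approach, so the following is a rough, hedged illustration of the general idea rather than the paper's exact model: learn an embedding for each template-instantiated feature (e.g., current word, previous word, suffix) by predicting, skip-gram style with negative sampling, the features that fire in the other templates at the same token, using unlabeled text pooled from both domains. The learned embeddings can then be concatenated onto a tagger's sparse feature vector. The templates, toy data, and hyperparameters below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def token_features(sent, i):
    """Instantiate a few typical feature templates for token i (assumed templates)."""
    w = sent[i]
    prev_w = sent[i - 1] if i > 0 else "<s>"
    return [f"word={w}", f"prev_word={prev_w}", f"suffix2={w[-2:]}"]

# Unlabeled sentences pooled from source and target domains (toy data).
corpus = [
    "the band played a short set".split(),
    "lol that show was so good".split(),
    "the critic praised the new album".split(),
]

# Vocabulary of (template, value) features observed in the unlabeled corpus.
instances = [token_features(s, i) for s in corpus for i in range(len(s))]
vocab = sorted({f for feats in instances for f in feats})
idx = {f: j for j, f in enumerate(vocab)}

dim, lr, neg, epochs = 16, 0.05, 3, 50
U = 0.1 * rng.standard_normal((len(vocab), dim))  # "input" feature embeddings
V = 0.1 * rng.standard_normal((len(vocab), dim))  # "output" (context) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for feats in instances:
        ids = [idx[f] for f in feats]
        for a in ids:                      # each active feature predicts...
            for b in ids:                  # ...the features from the other templates
                if a == b:
                    continue
                # one positive pair plus a few uniformly drawn negative samples
                targets = [(b, 1.0)] + [(int(rng.integers(len(vocab))), 0.0)
                                        for _ in range(neg)]
                u_old = U[a].copy()
                for t, label in targets:
                    grad = label - sigmoid(u_old @ V[t])
                    U[a] += lr * grad * V[t]
                    V[t] += lr * grad * u_old

# Dense token representation: concatenate the embeddings of its active features.
def embed_token(sent, i):
    return np.concatenate([U[idx[f]] for f in token_features(sent, i)])

print(embed_token(corpus[0], 1).shape)  # (3 * dim,) = (48,)

In practice one would train on far more unlabeled data and feed these dense vectors, alongside the original sparse features, into a supervised model trained on the source domain; the sketch above is only meant to make the "feature embedding over templates" idea concrete.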

Cite

Text

Yang and Eisenstein. "Unsupervised Domain Adaptation with Feature Embeddings." International Conference on Learning Representations, 2015.

Markdown

[Yang and Eisenstein. "Unsupervised Domain Adaptation with Feature Embeddings." International Conference on Learning Representations, 2015.](https://mlanthology.org/iclr/2015/yang2015iclr-unsupervised/)

BibTeX

@inproceedings{yang2015iclr-unsupervised,
  title     = {{Unsupervised Domain Adaptation with Feature Embeddings}},
  author    = {Yang, Yi and Eisenstein, Jacob},
  booktitle = {International Conference on Learning Representations},
  year      = {2015},
  url       = {https://mlanthology.org/iclr/2015/yang2015iclr-unsupervised/}
}