Importance Weighted Generative Networks
Abstract
Deep generative networks can simulate from a complex target distribution by minimizing a loss with respect to samples from that distribution. However, we often lack direct access to the target distribution: our data may be subject to sample selection bias, or may come from a different but related distribution. We present methods based on importance weighting that can estimate the loss with respect to a target distribution, even when that distribution cannot be accessed directly, in a variety of settings. These estimators, which differentially weight each datum's contribution to the loss function, offer both theoretical guarantees and strong empirical performance.
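The core idea can be illustrated with a minimal sketch (not the paper's actual estimator, which operates inside a generative-network loss): given samples from an accessible proposal distribution p and a known density ratio q/p, a self-normalized importance-weighted average estimates the expected loss under the target q. The distributions, loss function, and sample sizes below are illustrative assumptions.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of a univariate Gaussian N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_weighted_loss(samples, loss_fn, p_pdf, q_pdf):
    """Self-normalized importance-weighted estimate of E_q[loss],
    using samples drawn from p and weights w_i = q(x_i) / p(x_i)."""
    weights = [q_pdf(x) / p_pdf(x) for x in samples]
    total = sum(weights)
    return sum(w * loss_fn(x) for w, x in zip(weights, samples)) / total

random.seed(0)
# Observed (biased) data come from p = N(0, 1); the target is q = N(1, 1).
samples = [random.gauss(0.0, 1.0) for _ in range(200_000)]
est = importance_weighted_loss(
    samples,
    loss_fn=lambda x: x,                      # estimating E_q[x]; true value is 1.0
    p_pdf=lambda x: normal_pdf(x, 0.0, 1.0),
    q_pdf=lambda x: normal_pdf(x, 1.0, 1.0),
)
```

Although every sample is drawn from the biased distribution p, the weighted estimate converges to the target expectation under q; the same reweighting principle lets a generative model's loss be evaluated as if the training data had come from the target distribution.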
Cite
Text
Diesendruck et al. "Importance Weighted Generative Networks." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2019. doi:10.1007/978-3-030-46147-8_15
Markdown
[Diesendruck et al. "Importance Weighted Generative Networks." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2019.](https://mlanthology.org/ecmlpkdd/2019/diesendruck2019ecmlpkdd-importance/) doi:10.1007/978-3-030-46147-8_15
BibTeX
@inproceedings{diesendruck2019ecmlpkdd-importance,
title = {{Importance Weighted Generative Networks}},
author = {Diesendruck, Maurice and Elenberg, Ethan R. and Sen, Rajat and Cole, Guy W. and Shakkottai, Sanjay and Williamson, Sinead A.},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2019},
pages = {249--265},
doi = {10.1007/978-3-030-46147-8_15},
url = {https://mlanthology.org/ecmlpkdd/2019/diesendruck2019ecmlpkdd-importance/}
}