Learning with Missing Features
Abstract
We introduce new online and batch algorithms that are robust to data with missing features, a situation that arises in many practical applications. In the online setup, we allow for the comparison hypothesis to change as a function of the subset of features that is observed on any given round, extending the standard setting where the comparison hypothesis is fixed throughout. In the batch setup, we present a convex relaxation of a non-convex problem to jointly estimate an imputation function, used to fill in the values of missing features, along with the classification hypothesis. We prove regret bounds in the online setting and Rademacher complexity bounds for the batch i.i.d. setting. The algorithms are tested on several UCI datasets, showing superior performance over baseline imputation methods.
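As context for the baseline imputation methods the abstract mentions, a simple fill-in strategy can be sketched as column-wise mean imputation: each missing feature is replaced by the average of its observed values before a classifier is trained. This is an illustrative sketch (function name and data are hypothetical), not the paper's joint estimation method.

```python
import numpy as np

def mean_impute(X):
    """Replace NaN entries with the column-wise mean of the observed values.

    This is the kind of baseline imputation the paper compares against;
    the joint approach instead learns the imputation function together
    with the classification hypothesis.
    """
    X = np.asarray(X, dtype=float)
    col_means = np.nanmean(X, axis=0)          # mean over observed entries only
    return np.where(np.isnan(X), col_means, X)  # fill missing cells

# Example: two features, some entries missing (NaN).
X = np.array([[1.0, np.nan],
              [3.0, 4.0],
              [np.nan, 8.0]])
print(mean_impute(X))  # missing cells become the column means (2.0 and 6.0)
```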
Cite
Text
Rostamizadeh et al. "Learning with Missing Features." Conference on Uncertainty in Artificial Intelligence, 2011.

Markdown

[Rostamizadeh et al. "Learning with Missing Features." Conference on Uncertainty in Artificial Intelligence, 2011.](https://mlanthology.org/uai/2011/rostamizadeh2011uai-learning/)

BibTeX
@inproceedings{rostamizadeh2011uai-learning,
title = {{Learning with Missing Features}},
author = {Rostamizadeh, Afshin and Agarwal, Alekh and Bartlett, Peter L.},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2011},
  pages = {635--642},
url = {https://mlanthology.org/uai/2011/rostamizadeh2011uai-learning/}
}