Nearest Neighbor and Kernel Survival Analysis: Nonasymptotic Error Bounds and Strong Consistency Rates
Abstract
We establish the first nonasymptotic error bounds for Kaplan-Meier-based nearest neighbor and kernel survival probability estimators where feature vectors reside in metric spaces. Our bounds imply rates of strong consistency for these nonparametric estimators and, up to a log factor, match an existing lower bound for conditional CDF estimation. Our proof strategy also yields nonasymptotic guarantees for nearest neighbor and kernel variants of the Nelson-Aalen cumulative hazards estimator. We experimentally compare these methods on four datasets. We find that for the kernel survival estimator, a good choice of kernel is one learned using random survival forests.
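As a rough illustration of the kernel survival estimator the abstract refers to (not the author's code), the sketch below computes a Beran-style conditional Kaplan-Meier curve with NumPy. The function name `kernel_kaplan_meier`, the Euclidean box kernel, and the tie-breaking convention are all assumptions made for this sketch.

```python
import numpy as np

def kernel_kaplan_meier(x, X, Y, delta, bandwidth=1.0):
    """Beran-style conditional Kaplan-Meier estimate of S(t | x) -- a sketch.

    X:     (n, d) training feature vectors
    Y:     (n,) observed times, min(survival time, censoring time)
    delta: (n,) event indicators (1 = death observed, 0 = censored)
    Returns the sorted observed times and the estimated survival curve.
    """
    # Kernel weights of training points relative to the query point x.
    # A box kernel on Euclidean distance is used purely for illustration.
    dist = np.linalg.norm(X - x, axis=1)
    w = (dist <= bandwidth).astype(float)

    # Sort by time, with deaths before censorings at tied times
    # (the usual Kaplan-Meier tie-breaking convention).
    order = np.lexsort((-delta, Y))
    Y, delta, w = Y[order], delta[order], w[order]

    # Weighted size of the at-risk set {j : Y_j >= Y_i}: a suffix sum.
    at_risk = np.cumsum(w[::-1])[::-1]

    # Each uncensored point with positive weight multiplies the curve by
    # (1 - w_i / at_risk_i); censored points contribute a factor of 1.
    safe = np.where(at_risk > 0.0, at_risk, 1.0)  # avoid divide-by-zero
    factors = np.where((delta == 1) & (at_risk > 0.0), 1.0 - w / safe, 1.0)
    return Y, np.cumprod(factors)
```

Under the same assumptions, replacing the box-kernel weights with indicators of the k nearest neighbors of x gives the nearest neighbor variant, and accumulating w_i * delta_i / at_risk_i as a running sum (instead of the running product) gives the kernel analogue of the Nelson-Aalen cumulative hazard estimator mentioned in the abstract.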
Cite
Text
Chen. "Nearest Neighbor and Kernel Survival Analysis: Nonasymptotic Error Bounds and Strong Consistency Rates." International Conference on Machine Learning, 2019.Markdown
[Chen. "Nearest Neighbor and Kernel Survival Analysis: Nonasymptotic Error Bounds and Strong Consistency Rates." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/chen2019icml-nearest/)BibTeX
@inproceedings{chen2019icml-nearest,
title = {{Nearest Neighbor and Kernel Survival Analysis: Nonasymptotic Error Bounds and Strong Consistency Rates}},
author = {Chen, George},
booktitle = {International Conference on Machine Learning},
year = {2019},
pages = {1001--1010},
volume = {97},
url = {https://mlanthology.org/icml/2019/chen2019icml-nearest/}
}