Kernel Change-Point Detection with Auxiliary Deep Generative Models
Abstract
Detecting abrupt changes in the properties of a time series is a challenging problem. The kernel two-sample test has been studied for this task because it makes fewer assumptions about the underlying distributions than traditional parametric approaches. However, selecting a kernel is non-trivial in practice. Although kernel selection for the two-sample test has been studied, the scarcity of samples in change-point detection (CPD) problems hinders the success of existing kernel selection algorithms. In this paper, we propose KL-CPD, a novel kernel learning framework for time series CPD that optimizes a lower bound of the test power via an auxiliary generative model. With a deep kernel parameterization, KL-CPD equips the kernel two-sample test with a data-driven kernel for detecting different types of change-points in real-world applications. The proposed approach significantly outperforms other state-of-the-art methods in our comparative evaluation on benchmark datasets and in simulation studies.
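To illustrate the kernel two-sample-test idea the paper builds on (not the KL-CPD method itself, which learns a deep kernel), here is a minimal sketch of sliding-window change-point scoring with a fixed RBF kernel: at each time step, the squared Maximum Mean Discrepancy (MMD) is computed between the window before and the window after that point, and scores peak near a distribution change. The function names, window size, and median-heuristic bandwidth are illustrative assumptions, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, bandwidth):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * bandwidth**2))

def mmd2(X, Y, bandwidth):
    # Biased estimate of the squared Maximum Mean Discrepancy.
    return (rbf_kernel(X, X, bandwidth).mean()
            + rbf_kernel(Y, Y, bandwidth).mean()
            - 2 * rbf_kernel(X, Y, bandwidth).mean())

def cpd_scores(series, window=30):
    # Score each time step by the MMD between the preceding and
    # following windows; higher score = more likely change point.
    x = np.asarray(series, dtype=float).reshape(-1, 1)
    bw = np.median(np.abs(x - x.T)) + 1e-12  # median-heuristic bandwidth (assumed choice)
    scores = np.full(len(x), np.nan)
    for t in range(window, len(x) - window):
        scores[t] = mmd2(x[t - window:t], x[t:t + window], bw)
    return scores

# Toy series with a mean shift at t = 100.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
scores = cpd_scores(series)
print(int(np.nanargmax(scores)))  # peaks near the true change point at t = 100
```

A fixed kernel like this works when the change is large relative to the kernel bandwidth; the paper's contribution is learning the kernel from data so that subtler, domain-specific changes are also detectable.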
Cite
Text
Chang et al. "Kernel Change-Point Detection with Auxiliary Deep Generative Models." International Conference on Learning Representations, 2019.
Markdown
[Chang et al. "Kernel Change-Point Detection with Auxiliary Deep Generative Models." International Conference on Learning Representations, 2019.](https://mlanthology.org/iclr/2019/chang2019iclr-kernel/)
BibTeX
@inproceedings{chang2019iclr-kernel,
title = {{Kernel Change-Point Detection with Auxiliary Deep Generative Models}},
author = {Chang, Wei-Cheng and Li, Chun-Liang and Yang, Yiming and Póczos, Barnabás},
booktitle = {International Conference on Learning Representations},
year = {2019},
url = {https://mlanthology.org/iclr/2019/chang2019iclr-kernel/}
}