Non-Stationary Dynamic Bayesian Networks
Abstract
A principled mechanism for identifying conditional dependencies in time-series data is provided through structure learning of dynamic Bayesian networks (DBNs). An important assumption of DBN structure learning is that the data are generated by a stationary process, an assumption that is not true in many important settings. In this paper, we introduce a new class of graphical models called non-stationary dynamic Bayesian networks, in which the conditional dependence structure of the underlying data-generation process is permitted to change over time. Non-stationary dynamic Bayesian networks represent a new framework for studying problems in which the structure of a network is evolving over time. We define the non-stationary DBN model, present an MCMC sampling algorithm for learning the structure of the model from time-series data under different assumptions, and demonstrate the effectiveness of the algorithm on both simulated and biological data.
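
To make the setting concrete, below is a minimal sketch (not the paper's algorithm) of a Metropolis-Hastings sampler over the structure of a non-stationary DBN on binary data. It assumes a single changepoint splitting the series into two segments, first-order time lags, Bernoulli-Dirichlet (Beta(1,1)) local scores, and simple symmetric proposal moves; every function name and parameter here is illustrative.

# Illustrative MH sampler for a two-segment non-stationary DBN on binary data.
# Simplifying assumptions: one changepoint, parents come from the previous
# time slice only (so no acyclicity check is needed), Beta(1,1) local scores.
# This is a toy sketch, not the authors' published algorithm.
import numpy as np
from math import lgamma
from itertools import product

rng = np.random.default_rng(0)

def log_marginal(child, parents, data):
    """Log marginal likelihood of one binary child given binary parents
    taken from the previous time slice, with a Beta(1,1) prior per config."""
    n = data.shape[0]
    x_child = data[1:, child]            # child values at t = 1..n-1
    x_par = data[:-1, parents]           # parent values at t = 0..n-2
    score = 0.0
    for config in product([0, 1], repeat=len(parents)):
        mask = np.all(x_par == np.array(config), axis=1) if parents else np.ones(n - 1, bool)
        n1 = int(x_child[mask].sum())
        n0 = int(mask.sum()) - n1
        score += (lgamma(2) - lgamma(2 + n0 + n1)
                  + lgamma(1 + n1) - lgamma(1)
                  + lgamma(1 + n0) - lgamma(1))
    return score

def segment_score(adj, data):
    """Sum of local scores for one segment; adj[i, j] = 1 means i -> j."""
    return sum(log_marginal(j, list(np.flatnonzero(adj[:, j])), data)
               for j in range(adj.shape[1]))

def total_score(adjs, cp, data):
    return segment_score(adjs[0], data[:cp]) + segment_score(adjs[1], data[cp:])

# Toy data: 4 binary variables, 200 time points (replace with a real series).
n_vars, T = 4, 200
data = rng.integers(0, 2, size=(T, n_vars))

adjs = [np.zeros((n_vars, n_vars), int) for _ in range(2)]  # one structure per segment
cp = T // 2                                                 # initial changepoint
score = total_score(adjs, cp, data)

for step in range(2000):
    if rng.random() < 0.8:
        # Symmetric move: toggle one edge in one segment's structure.
        s, i, j = rng.integers(2), rng.integers(n_vars), rng.integers(n_vars)
        new_adjs = [a.copy() for a in adjs]
        new_adjs[s][i, j] ^= 1
        new_cp = cp
    else:
        # Symmetric move: redraw the changepoint uniformly from the allowed range.
        new_adjs = adjs
        new_cp = int(rng.integers(10, T - 10))
    new_score = total_score(new_adjs, new_cp, data)
    if np.log(rng.random()) < new_score - score:  # Metropolis-Hastings acceptance
        adjs, cp, score = new_adjs, new_cp, new_score

print("final changepoint:", cp, "log score:", round(score, 2))

Because the proposals are symmetric, the acceptance ratio reduces to the score difference; a full treatment would also place priors on the number and positions of changepoints and on edge counts.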
Cite
Text
Robinson and Hartemink. "Non-Stationary Dynamic Bayesian Networks." Neural Information Processing Systems, 2008.

Markdown
[Robinson and Hartemink. "Non-Stationary Dynamic Bayesian Networks." Neural Information Processing Systems, 2008.](https://mlanthology.org/neurips/2008/robinson2008neurips-nonstationary/)

BibTeX
@inproceedings{robinson2008neurips-nonstationary,
  title     = {{Non-Stationary Dynamic Bayesian Networks}},
  author    = {Robinson, Joshua W. and Hartemink, Alexander J.},
  booktitle = {Neural Information Processing Systems},
  year      = {2008},
  pages     = {1369--1376},
  url       = {https://mlanthology.org/neurips/2008/robinson2008neurips-nonstationary/}
}