Temporal Belief Memory: Imputing Missing Data During RNN Training

Abstract

We propose a bio-inspired approach named Temporal Belief Memory (TBM) for handling missing data with recurrent neural networks (RNNs). When modeling irregularly observed temporal sequences, conventional RNNs generally ignore the real-time intervals between consecutive observations. TBM is a missing-value imputation method that accounts for time continuity and captures latent missing patterns based on the irregular real-time intervals of the inputs. We evaluate TBM on real-world electronic health records (EHRs) consisting of 52,919 visits and 4,224,567 events, on the task of early prediction of septic shock. We compare TBM against multiple baselines, including both domain experts' rules and a state-of-the-art missing-data handling approach, using both vanilla RNNs and long short-term memory (LSTM) networks. The experimental results show that TBM outperforms all competitive baselines on the septic shock early prediction task.
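The core idea described above, imputing a missing value from the last observation with a belief that fades as real time elapses, can be sketched roughly as follows. This is a minimal illustration under assumed choices (exponential decay with constant `tau`, fallback toward the series mean), not the authors' exact belief mechanism:

```python
import numpy as np

def impute_with_temporal_belief(values, times, mean_value, tau=1.0):
    """Impute NaNs in an irregularly sampled series.

    Belief in the last observed value decays exponentially with the
    real time elapsed since it was seen; as belief fades, the imputed
    value falls back toward the series mean. The exponential form and
    `tau` are illustrative assumptions, not the paper's formulation.
    """
    imputed = []
    last_val, last_t = None, None
    for v, t in zip(values, times):
        if not np.isnan(v):
            last_val, last_t = v, t   # fresh observation: full belief
            imputed.append(v)
        elif last_val is None:
            imputed.append(mean_value)  # nothing observed yet
        else:
            # belief shrinks as the gap since the last observation grows
            belief = np.exp(-(t - last_t) / tau)
            imputed.append(belief * last_val + (1.0 - belief) * mean_value)
    return np.array(imputed)
```

For example, a value observed at time 0 is trusted almost fully shortly afterward, but an imputation made several `tau` intervals later is dominated by the population mean, mirroring the intuition that stale measurements carry less information.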

Cite

Text

Kim and Chi. "Temporal Belief Memory: Imputing Missing Data During RNN Training." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/322

Markdown

[Kim and Chi. "Temporal Belief Memory: Imputing Missing Data During RNN Training." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/kim2018ijcai-temporal/) doi:10.24963/IJCAI.2018/322

BibTeX

@inproceedings{kim2018ijcai-temporal,
  title     = {{Temporal Belief Memory: Imputing Missing Data During RNN Training}},
  author    = {Kim, Yeo-Jin and Chi, Min},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {2326--2332},
  doi       = {10.24963/IJCAI.2018/322},
  url       = {https://mlanthology.org/ijcai/2018/kim2018ijcai-temporal/}
}