A Variational Autoencoder for Neural Temporal Point Processes with Dynamic Latent Graphs
Abstract
Continuously observed event occurrences often exhibit self- and mutually exciting effects, which can be well modeled using temporal point processes. Beyond that, the event dynamics may also change over time, exhibiting certain periodic trends. We propose a novel variational autoencoder to capture such a mixture of temporal dynamics. More specifically, the whole time interval of the input sequence is partitioned into a set of subintervals. The event dynamics are assumed to be stationary within each subinterval, but may change across subintervals. In particular, we use a sequential latent variable model to learn a dependency graph between the observed dimensions for each subinterval. The model predicts future event times by using the learned dependency graph to remove the non-contributing influences of past events. As a result, the proposed model achieves higher accuracy in predicting inter-event times and event types on several real-world event sequences, compared with existing state-of-the-art neural point processes.
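To illustrate the idea of gating past-event influences with a subinterval-specific dependency graph, here is a minimal NumPy sketch of a multivariate exponential-kernel point process whose excitation terms are masked by a per-subinterval graph. This is not the authors' VAE model; all names and parameters (`intensity`, `graphs`, `boundaries`, `mu`, `alpha`, `beta`) are hypothetical and chosen only to make the masking mechanism concrete.

```python
# A minimal sketch (not the paper's implementation) of masking past-event
# influences with a per-subinterval dependency graph in a multivariate
# exponential-kernel point process. All parameter names are hypothetical.
import numpy as np

def intensity(t, history, graphs, boundaries, mu, alpha, beta):
    """Conditional intensity of each event type at time t.

    history:    list of (time, type) pairs with time < t
    graphs:     list of binary (num_types x num_types) matrices, one per
                subinterval; graphs[k][i, j] = 1 keeps the influence of
                type-j events on type i within subinterval k
    boundaries: increasing subinterval end points used to locate each event
    mu, alpha:  base rates (num_types,) and excitation weights
                (num_types x num_types); beta is the exponential decay rate
    """
    lam = mu.copy()
    for (t_j, type_j) in history:
        k = np.searchsorted(boundaries, t_j)   # subinterval of the past event
        mask = graphs[k][:, type_j]            # which targets it may still excite
        lam += mask * alpha[:, type_j] * np.exp(-beta * (t - t_j))
    return lam

# Toy usage with two event types and two subintervals.
mu = np.array([0.1, 0.2])
alpha = np.array([[0.5, 0.0], [0.3, 0.4]])
graphs = [np.ones((2, 2)), np.eye(2)]          # dependency graph changes over time
history = [(0.4, 0), (1.1, 1)]
print(intensity(2.0, history, graphs, boundaries=np.array([1.0, 3.0]),
                mu=mu, alpha=alpha, beta=1.0))
```

In the paper, the per-subinterval graphs are latent variables inferred by a sequential latent variable model within a variational autoencoder, rather than fixed matrices as in this toy example.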
Cite
Text
Yang and Zha. "A Variational Autoencoder for Neural Temporal Point Processes with Dynamic Latent Graphs." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I15.29570Markdown
[Yang and Zha. "A Variational Autoencoder for Neural Temporal Point Processes with Dynamic Latent Graphs." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/yang2024aaai-variational/) doi:10.1609/AAAI.V38I15.29570BibTeX
@inproceedings{yang2024aaai-variational,
title = {{A Variational Autoencoder for Neural Temporal Point Processes with Dynamic Latent Graphs}},
author = {Yang, Sikun and Zha, Hongyuan},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2024},
pages = {16343-16351},
doi = {10.1609/AAAI.V38I15.29570},
url = {https://mlanthology.org/aaai/2024/yang2024aaai-variational/}
}