Conditional Loss and Deep Euler Scheme for Time Series Generation
Abstract
We introduce three new generative models for time series that are based on Euler discretization of Stochastic Differential Equations (SDEs) and Wasserstein metrics. Two of these methods rely on the adaptation of generative adversarial networks (GANs) to time series. The third algorithm, called Conditional Euler Generator (CEGEN), minimizes a dedicated distance between the transition probability distributions over all time steps. In the context of Itô processes, we provide theoretical guarantees that minimizing this criterion implies accurate estimation of the drift and volatility parameters. Empirically, CEGEN outperforms state-of-the-art methods and GANs on both marginal and temporal dynamics metrics. Moreover, correlation structures are accurately identified in high dimension. When only a few real data points are available, we verify the effectiveness of combining CEGEN with transfer learning on model-based simulations. Finally, we illustrate the robustness of our methods on various real-world data sets.
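To make the deep Euler scheme idea concrete, here is a minimal, illustrative sketch (not the authors' implementation): a neural network parameterizes the drift and volatility of an SDE, and paths are produced by Euler-Maruyama simulation. The class and parameter names (`EulerGenerator`, `hidden_dim`, `n_steps`) are assumptions for illustration only; the conditional Wasserstein-type loss over transition distributions described in the paper would be added on top of this sampling routine during training.

```python
# Sketch of a deep Euler scheme generator (illustrative, not the paper's code):
# a network outputs drift b_theta and diagonal volatility s_theta, and paths follow
# X_{t+dt} = X_t + b_theta(t, X_t) dt + s_theta(t, X_t) sqrt(dt) * N(0, I).
import torch
import torch.nn as nn


class EulerGenerator(nn.Module):
    def __init__(self, dim: int = 1, hidden_dim: int = 64):
        super().__init__()
        # One network maps (t, x) to both drift (dim) and volatility (dim).
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 2 * dim),
        )
        self.dim = dim

    def step(self, t: torch.Tensor, x: torch.Tensor, dt: float) -> torch.Tensor:
        # One Euler-Maruyama update; softplus keeps the volatility positive.
        out = self.net(torch.cat([t.expand(x.shape[0], 1), x], dim=1))
        drift = out[:, : self.dim]
        vol = nn.functional.softplus(out[:, self.dim:])
        noise = torch.randn_like(x)
        return x + drift * dt + vol * (dt ** 0.5) * noise

    def sample(self, x0: torch.Tensor, n_steps: int, dt: float) -> torch.Tensor:
        # Simulate full paths from x0; returns a tensor of shape (batch, n_steps + 1, dim).
        path, x = [x0], x0
        for k in range(n_steps):
            t = torch.tensor([[k * dt]], dtype=x.dtype)
            x = self.step(t, x, dt)
            path.append(x)
        return torch.stack(path, dim=1)


# Usage: generate 128 one-dimensional paths over 30 steps of size dt = 1/30.
gen = EulerGenerator(dim=1)
x0 = torch.zeros(128, 1)
fake_paths = gen.sample(x0, n_steps=30, dt=1.0 / 30)
print(fake_paths.shape)  # torch.Size([128, 31, 1])
```

In this sketch the training objective is left out: the paper's CEGEN variant compares, at each time step, the conditional (transition) distributions of real and generated increments, whereas the two GAN variants train the same kind of generator adversarially.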
Cite
Text
Remlinger et al. "Conditional Loss and Deep Euler Scheme for Time Series Generation." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I7.20782
Markdown
[Remlinger et al. "Conditional Loss and Deep Euler Scheme for Time Series Generation." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/remlinger2022aaai-conditional/) doi:10.1609/AAAI.V36I7.20782
BibTeX
@inproceedings{remlinger2022aaai-conditional,
title = {{Conditional Loss and Deep Euler Scheme for Time Series Generation}},
author = {Remlinger, Carl and Mikael, Joseph and Elie, Romuald},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {8098-8105},
doi = {10.1609/AAAI.V36I7.20782},
url = {https://mlanthology.org/aaai/2022/remlinger2022aaai-conditional/}
}