Learning Compact Representations of Time-Varying Processes
Abstract
We seek informative representations of the processes underlying time series data. As a first step, we address problems in which these processes can be approximated by linear models that vary smoothly over time. To facilitate estimation of these linear models, we introduce a method of dimension reduction which significantly reduces error when models are estimated locally for each point in time. This improvement is gained by performing dimension reduction implicitly through the model parameters rather than directly in the observation space.
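The idea in the abstract can be illustrated with a small sketch: fit a local (kernel-weighted) linear model at each time point, then reduce dimension over the resulting parameter trajectory rather than over the raw observations. This is an assumption-laden toy example, not the paper's actual algorithm; the data-generating process, bandwidth, and PCA step are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 200, 3

# Hypothetical smoothly time-varying linear model: y_t = w(t).x_t + noise.
# The particular parameter curve below is invented for illustration.
t_grid = np.linspace(0, 1, T)
W = np.stack([np.sin(2 * np.pi * t_grid),
              np.cos(2 * np.pi * t_grid),
              0.5 * t_grid], axis=1)           # (T, d) true parameters
X = rng.normal(size=(T, d))
y = np.sum(W * X, axis=1) + 0.05 * rng.normal(size=T)

def local_weights(t0, bandwidth=0.05):
    """Gaussian kernel weights over time, for estimating a model locally."""
    return np.exp(-0.5 * ((t_grid - t0) / bandwidth) ** 2)

# Locally weighted least squares at each time point -> parameter trajectory.
W_hat = np.empty((T, d))
for i, t0 in enumerate(t_grid):
    w = local_weights(t0)
    Xw = X * w[:, None]
    W_hat[i] = np.linalg.solve(Xw.T @ X + 1e-6 * np.eye(d), Xw.T @ y)

# Dimension reduction "through the model parameters": PCA (via SVD) on the
# estimated parameter vectors instead of on the observations themselves.
W_centered = W_hat - W_hat.mean(axis=0)
U, S, Vt = np.linalg.svd(W_centered, full_matrices=False)
k = 2
Z = W_centered @ Vt[:k].T   # compact (T, k) representation of the process
print(Z.shape)
```

In this toy setting the parameter curve lives near a low-dimensional subspace, so a couple of principal components of the parameter trajectory summarize the process compactly; the paper's contribution concerns doing such reduction implicitly rather than as a post-hoc PCA step as here.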
Cite
Text
Bachman and Precup. "Learning Compact Representations of Time-Varying Processes." AAAI Conference on Artificial Intelligence, 2011. doi:10.1609/AAAI.V25I1.8061
Markdown
[Bachman and Precup. "Learning Compact Representations of Time-Varying Processes." AAAI Conference on Artificial Intelligence, 2011.](https://mlanthology.org/aaai/2011/bachman2011aaai-learning/) doi:10.1609/AAAI.V25I1.8061
BibTeX
@inproceedings{bachman2011aaai-learning,
title = {{Learning Compact Representations of Time-Varying Processes}},
author = {Bachman, Philip and Precup, Doina},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2011},
pages = {1748--1749},
doi = {10.1609/AAAI.V25I1.8061},
url = {https://mlanthology.org/aaai/2011/bachman2011aaai-learning/}
}