Kernel Representation Learning for Time Sequence: Algorithm, Theory, and Applications

Abstract

Time sequences are essential in fields such as finance, healthcare, and environmental science, where understanding temporal dependencies and making accurate predictions are crucial. These sequences often exhibit complexities like nonlinearity, noise, and concept drift. Traditional models struggle to capture the intricate dynamics of multivariate and co-evolving sequences, particularly in contexts where relationships between variables shift unpredictably. This thesis introduces a range of Kernel Representation Learning (KRL) methodologies to address these challenges. We develop kernel self-representation learning to capture the temporal dependencies and hidden structures, while identifying concept drift in co-evolving sequences. Additionally, we explore theoretical connections between KRL and advanced deep-learning models. The proposed methods are validated through real-world applications, showing improvements in predictive accuracy, interpretability, and robustness.
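The abstract mentions kernel self-representation learning without detailing the formulation. A common closed-form instance of self-representation in a reproducing kernel Hilbert space minimizes ||Φ(X) − Φ(X)C||² + λ||C||², giving C = (K + λI)⁻¹K for Gram matrix K. The sketch below illustrates that generic formulation only; the RBF kernel, the ridge penalty, and all parameter names are assumptions for illustration, not the thesis's actual method.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2) over time steps (rows of X).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_self_representation(X, gamma=1.0, lam=0.1):
    # Generic kernel self-representation (illustrative, not the thesis's algorithm):
    # minimize ||Phi(X) - Phi(X) C||_F^2 + lam * ||C||_F^2
    # Closed-form solution: C = (K + lam * I)^{-1} K.
    K = rbf_kernel(X, gamma)
    T = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(T), K)

# Toy co-evolving sequences: two coupled channels over 50 time steps.
t = np.linspace(0, 4 * np.pi, 50)
X = np.stack([np.sin(t), np.cos(t)], axis=1)
C = kernel_self_representation(X, gamma=0.5, lam=0.1)
# Row C[i] weights the time steps that best reconstruct step i; shifts in its
# block structure over time can hint at concept drift between regimes.
```

Because C is a function of the symmetric matrix K alone, it is itself symmetric, and its off-diagonal mass indicates which time steps share a latent structure.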

Cite

Text

Kunpeng Xu. "Kernel Representation Learning for Time Sequence: Algorithm, Theory, and Applications." AAAI Conference on Artificial Intelligence, 2025, pp. 29309-29310. doi:10.1609/AAAI.V39I28.35232

Markdown

[Kunpeng Xu. "Kernel Representation Learning for Time Sequence: Algorithm, Theory, and Applications." AAAI Conference on Artificial Intelligence, 2025, pp. 29309-29310.](https://mlanthology.org/aaai/2025/xu2025aaai-kernel/) doi:10.1609/AAAI.V39I28.35232

BibTeX

@inproceedings{xu2025aaai-kernel,
  title     = {{Kernel Representation Learning for Time Sequence: Algorithm, Theory, and Applications}},
  author    = {Xu, Kunpeng},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {29309--29310},
  doi       = {10.1609/AAAI.V39I28.35232},
  url       = {https://mlanthology.org/aaai/2025/xu2025aaai-kernel/}
}