G-GLformer: Transformer with GRU Embedding and Global-Local Attention for Multivariate Time Series Forecasting

Cite

Text

Yu et al. "G-GLformer: Transformer with GRU Embedding and Global-Local Attention for Multivariate Time Series Forecasting." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2025. doi:10.1007/978-3-662-72243-5_3

Markdown

[Yu et al. "G-GLformer: Transformer with GRU Embedding and Global-Local Attention for Multivariate Time Series Forecasting." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2025.](https://mlanthology.org/ecmlpkdd/2025/yu2025ecmlpkdd-gglformer/) doi:10.1007/978-3-662-72243-5_3

BibTeX

@inproceedings{yu2025ecmlpkdd-gglformer,
  title     = {{G-GLformer: Transformer with GRU Embedding and Global-Local Attention for Multivariate Time Series Forecasting}},
  author    = {Yu, Wenjun and Li, Jiyanglin and Gao, Wentao and Zhuang, Niangxi and Li, Wen and Du, Shouguo},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2025},
  pages     = {41--56},
  doi       = {10.1007/978-3-662-72243-5_3},
  url       = {https://mlanthology.org/ecmlpkdd/2025/yu2025ecmlpkdd-gglformer/}
}