AdaMixT: Adaptive Weighted Mixture of Multi-Scale Expert Transformers for Time Series Forecasting

Abstract

Multivariate time series forecasting predicts future values from historical observations. Existing approaches, however, rely primarily on predefined single-scale patches or lack effective mechanisms for multi-scale feature fusion. These limitations prevent them from fully capturing the complex patterns inherent in time series, constraining both performance and generalizability. To address these challenges, we propose a novel architecture named Adaptive Weighted Mixture of Multi-Scale Expert Transformers (AdaMixT). Specifically, AdaMixT introduces patches of multiple scales and leverages both General Pre-trained Models (GPM) and Domain-specific Models (DSM) for multi-scale feature extraction. To accommodate the heterogeneity of temporal features, AdaMixT incorporates a gating network that dynamically allocates weights among the experts, enabling more accurate predictions through adaptive multi-scale fusion. Comprehensive experiments on eight widely used benchmarks (Weather, Traffic, Electricity, ILI, and the four ETT datasets) consistently demonstrate the effectiveness of AdaMixT in real-world scenarios.
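
The core mechanism the abstract describes (scale-specific experts whose forecasts are combined by a learned gate) can be sketched compactly. The PyTorch sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the linear experts stand in for the GPM/DSM transformer experts, the mean-pooled patching, module names, and sizes are all hypothetical, and per-channel handling is omitted.

import torch
import torch.nn as nn

class GatedMultiScaleMixture(nn.Module):
    """Minimal sketch of gated multi-scale expert fusion.

    Each stand-in expert sees the series coarsened at a different
    patch scale; a gating network assigns per-sample softmax weights
    to the expert forecasts. Linear experts replace the paper's
    GPM/DSM transformer experts (a hypothetical simplification).
    """

    def __init__(self, lookback: int, horizon: int, patch_sizes=(8, 16, 32)):
        super().__init__()
        self.patch_sizes = patch_sizes
        # One expert per scale, fed the patch-pooled view of the input.
        self.experts = nn.ModuleList(
            nn.Linear(lookback // p, horizon) for p in patch_sizes
        )
        # Gating network: raw window -> one logit per expert.
        self.gate = nn.Linear(lookback, len(patch_sizes))

    @staticmethod
    def _patch_pool(x: torch.Tensor, p: int) -> torch.Tensor:
        # Coarsen by averaging non-overlapping patches of length p.
        b, L = x.shape
        return x[:, : L - L % p].reshape(b, -1, p).mean(dim=-1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, lookback) -- a single variate for simplicity.
        preds = torch.stack(
            [e(self._patch_pool(x, p))
             for e, p in zip(self.experts, self.patch_sizes)],
            dim=1,
        )  # (batch, n_experts, horizon)
        weights = torch.softmax(self.gate(x), dim=-1)  # (batch, n_experts)
        # Adaptive fusion: weighted sum of the expert forecasts.
        return (weights.unsqueeze(-1) * preds).sum(dim=1)

# Smoke test with hypothetical sizes.
model = GatedMultiScaleMixture(lookback=96, horizon=24)
print(model(torch.randn(4, 96)).shape)  # torch.Size([4, 24])

In the full model, the gate would weight transformer experts operating on genuinely different patch tokenizations; this sketch captures only the adaptive weighted-fusion shape of the computation.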

Cite

Text

Zhang et al. "AdaMixT: Adaptive Weighted Mixture of Multi-Scale Expert Transformers for Time Series Forecasting." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/404

Markdown

[Zhang et al. "AdaMixT: Adaptive Weighted Mixture of Multi-Scale Expert Transformers for Time Series Forecasting." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/zhang2025ijcai-adamixt/) doi:10.24963/IJCAI.2025/404

BibTeX

@inproceedings{zhang2025ijcai-adamixt,
  title     = {{AdaMixT: Adaptive Weighted Mixture of Multi-Scale Expert Transformers for Time Series Forecasting}},
  author    = {Zhang, Huanyao and Lin, Jiaye and Zhang, Wentao and Yuan, Haitao and Li, Guoliang},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {3633--3641},
  doi       = {10.24963/IJCAI.2025/404},
  url       = {https://mlanthology.org/ijcai/2025/zhang2025ijcai-adamixt/}
}