VCformer: Variable Correlation Transformer with Inherent Lagged Correlation for Multivariate Time Series Forecasting
Abstract
Federated reinforcement learning (FRL) methods usually share encrypted local state or policy information, helping each client learn from others while preserving everyone's privacy. In this work, we propose that sharing an approximated behavior metric-based state projection function is a promising way to enhance the performance of FRL while also providing effective protection of sensitive information. We introduce FedRAG, an FRL framework that learns a computationally practical state projection function for each client and aggregates the parameters of the projection functions at a central server. The FedRAG approach shares no sensitive task-specific information, yet provides an information gain for each client. We conduct extensive experiments on the DeepMind Control Suite to demonstrate insightful results.
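The abstract describes a server-side step that aggregates the parameters of the clients' projection functions. A minimal sketch of what such an aggregation could look like, assuming a FedAvg-style element-wise average over flat parameter vectors (the names `ProjectionParams` and `aggregate` are illustrative, not the paper's actual API):

```python
# Hypothetical sketch of the aggregation step described in the abstract:
# each client trains a local state-projection function, and a central
# server averages the clients' parameters (FedAvg-style). Only the
# projection-function parameters are shared, so no raw states, rewards,
# or policies leave a client.
from dataclasses import dataclass
from typing import List

@dataclass
class ProjectionParams:
    """Flat parameter vector of one client's state-projection function."""
    weights: List[float]

def aggregate(client_params: List[ProjectionParams]) -> ProjectionParams:
    """Server-side step: element-wise average of client parameter vectors."""
    n_clients = len(client_params)
    n_weights = len(client_params[0].weights)
    averaged = [
        sum(p.weights[i] for p in client_params) / n_clients
        for i in range(n_weights)
    ]
    return ProjectionParams(weights=averaged)

# Example: three clients, each with a 2-parameter projection function.
clients = [
    ProjectionParams([1.0, 2.0]),
    ProjectionParams([3.0, 4.0]),
    ProjectionParams([5.0, 6.0]),
]
global_params = aggregate(clients)
print(global_params.weights)  # [3.0, 4.0]
```

The averaged parameters would then be broadcast back to clients as the shared projection function for the next round; the actual behavior metric-based training of each local projection function is beyond this sketch.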
Cite
Text
Yang et al. "VCformer: Variable Correlation Transformer with Inherent Lagged Correlation for Multivariate Time Series Forecasting." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/590
Markdown
[Yang et al. "VCformer: Variable Correlation Transformer with Inherent Lagged Correlation for Multivariate Time Series Forecasting." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/yang2024ijcai-vcformer/) doi:10.24963/ijcai.2024/590
BibTeX
@inproceedings{yang2024ijcai-vcformer,
title = {{VCformer: Variable Correlation Transformer with Inherent Lagged Correlation for Multivariate Time Series Forecasting}},
author = {Yang, Yingnan and Zhu, Qingling and Chen, Jianyong},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {5335--5343},
doi = {10.24963/ijcai.2024/590},
url = {https://mlanthology.org/ijcai/2024/yang2024ijcai-vcformer/}
}