Deep Frequency Derivative Learning for Non-Stationary Time Series Forecasting
Cite
Text
Fan et al. "Deep Frequency Derivative Learning for Non-Stationary Time Series Forecasting." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/436
Markdown
[Fan et al. "Deep Frequency Derivative Learning for Non-Stationary Time Series Forecasting." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/fan2024ijcai-deep/) doi:10.24963/ijcai.2024/436
BibTeX
@inproceedings{fan2024ijcai-deep,
title = {{Deep Frequency Derivative Learning for Non-Stationary Time Series Forecasting}},
author = {Fan, Wei and Yi, Kun and Ye, Hangting and Ning, Zhiyuan and Zhang, Qi and An, Ning},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {3944--3952},
doi = {10.24963/ijcai.2024/436},
url = {https://mlanthology.org/ijcai/2024/fan2024ijcai-deep/}
}