Quantitative Claim-Centric Reasoning in Logic-Based Argumentation
Abstract
Large Language Models (LLMs) have recently shown promise in Time Series Forecasting (TSF) by effectively capturing intricate time-domain dependencies. However, our preliminary experiments reveal that standard LLM-based approaches often fail to capture global correlations, limiting predictive performance. We found that embedding frequency-domain signals smooths weight distributions and enhances structured correlations by clearly separating global trends (low-frequency components) from local variations (high-frequency components). Building on these insights, we propose FreqLLM, a novel framework that integrates frequency-domain semantic alignment into LLMs to refine prompts for improved time series analysis. By bridging the gap between frequency signals and textual embeddings, FreqLLM effectively captures multi-scale temporal patterns and provides more robust forecasting results. Extensive experiments on benchmark datasets demonstrate that FreqLLM outperforms state-of-the-art TSF methods in both accuracy and generalization. The code is available at https://github.com/biya0105/FreqLLM.
Cite
Text
Hecher et al. "Quantitative Claim-Centric Reasoning in Logic-Based Argumentation." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/377
Markdown
[Hecher et al. "Quantitative Claim-Centric Reasoning in Logic-Based Argumentation." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/hecher2024ijcai-quantitative/) doi:10.24963/ijcai.2024/377
BibTeX
@inproceedings{hecher2024ijcai-quantitative,
title = {{Quantitative Claim-Centric Reasoning in Logic-Based Argumentation}},
author = {Hecher, Markus and Mahmood, Yasir and Meier, Arne and Schmidt, Johannes},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {3404--3412},
doi = {10.24963/ijcai.2024/377},
url = {https://mlanthology.org/ijcai/2024/hecher2024ijcai-quantitative/}
}