LLMForecaster: Improving Seasonal Event Forecasts with Unstructured Textual Data
Abstract
Modern time-series forecasting models often fail to make full use of rich unstructured information about the time series themselves. This lack of proper conditioning can lead to “obvious” model failures; for example, models may be unaware of the details of a particular product, and hence fail to anticipate seasonal surges in customer demand for clearly relevant products in the lead-up to major exogenous events such as holidays. To address this shortcoming, this paper introduces a novel forecast post-processor — which we call LLMForecaster — that fine-tunes large language models (LLMs) to incorporate unstructured semantic and contextual information, together with historical data, to improve the forecasts from an existing demand forecasting pipeline. In an industry-scale retail application, we demonstrate that our technique yields statistically significant forecast improvements across several sets of products subject to holiday-driven demand surges.
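The abstract describes LLMForecaster as a post-processor: an LLM conditioned on unstructured product text and historical demand adjusts the output of an existing forecasting pipeline. The sketch below illustrates that interface only; it is not the authors' implementation, and all names (`ProductContext`, `predict_uplift`, the keyword-based stand-in for the fine-tuned LLM, and the multiplicative-uplift form of the adjustment) are illustrative assumptions.

```python
# Minimal sketch of a forecast post-processor in the spirit of LLMForecaster.
# A fine-tuned LLM would replace `predict_uplift`; here a keyword heuristic
# stands in so the example is self-contained and runnable.

from dataclasses import dataclass
from typing import Callable, Sequence


@dataclass
class ProductContext:
    """Inputs the post-processor could condition on (hypothetical schema)."""
    description: str            # unstructured product text
    history: Sequence[float]    # recent demand observations
    upcoming_event: str         # e.g. "Christmas"


def postprocess_forecast(
    baseline: Sequence[float],
    context: ProductContext,
    predict_uplift: Callable[[ProductContext], float],
) -> list[float]:
    """Rescale a baseline forecast by an event-driven uplift factor.

    `predict_uplift` returns a multiplicative adjustment (1.0 = no change);
    in the paper's setting this role is played by a fine-tuned LLM.
    """
    uplift = predict_uplift(context)
    return [value * uplift for value in baseline]


if __name__ == "__main__":
    # Toy stand-in for the LLM: flag holiday-relevant products from keywords.
    def keyword_uplift(ctx: ProductContext) -> float:
        holiday_terms = {"ornament", "wreath", "advent", "stocking"}
        relevant = any(term in ctx.description.lower() for term in holiday_terms)
        return 1.8 if relevant and ctx.upcoming_event == "Christmas" else 1.0

    ctx = ProductContext(
        description="LED Christmas wreath with timer",
        history=[12, 14, 15, 18],
        upcoming_event="Christmas",
    )
    print(postprocess_forecast([20, 22, 25], ctx, keyword_uplift))
```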
Cite
Text
Zhang et al. "LLMForecaster: Improving Seasonal Event Forecasts with Unstructured Textual Data." NeurIPS 2024 Workshops: TSALM, 2024.
Markdown
[Zhang et al. "LLMForecaster: Improving Seasonal Event Forecasts with Unstructured Textual Data." NeurIPS 2024 Workshops: TSALM, 2024.](https://mlanthology.org/neuripsw/2024/zhang2024neuripsw-llmforecaster/)
BibTeX
@inproceedings{zhang2024neuripsw-llmforecaster,
  title = {{LLMForecaster: Improving Seasonal Event Forecasts with Unstructured Textual Data}},
  author = {Zhang, Hanyu and Arvin, Chuck and Efimov, Dmitry and Mahoney, Michael W. and Perrault-Joncas, Dominique and Ramasubramanian, Shankar and Wilson, Andrew Gordon and Wolff, Malcolm},
  booktitle = {NeurIPS 2024 Workshops: TSALM},
  year = {2024},
  url = {https://mlanthology.org/neuripsw/2024/zhang2024neuripsw-llmforecaster/}
}