A Decoder-only Foundation Model For Time-series Forecasting

Abhimanyu Das, Weihao Kong, Rajat Sen, Yichen Zhou. arXiv 2023 – 41 citations

[Paper]
Tags: Compositional Generalization, Datasets, Interdisciplinary Approaches, Multimodal Semantic Representation, Time Series

Motivated by recent advances in large language models for Natural Language Processing (NLP), we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models for each individual dataset. Our model is based on pretraining a patched-decoder style attention model on a large time-series corpus, and can work well across different forecasting history lengths, prediction lengths, and temporal granularities.
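
The patched-decoder idea in the abstract can be made concrete with a short sketch. The following PyTorch snippet is a hypothetical illustration, not the paper's released implementation: the input history is split into fixed-length patches, each patch is linearly embedded into a token, a causally masked self-attention stack processes the token sequence, and each position is projected to a longer output patch so one forward pass emits a multi-step forecast. All names and hyperparameters here (`PatchedDecoderForecaster`, `input_patch_len=32`, `output_patch_len=128`, model sizes) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class PatchedDecoderForecaster(nn.Module):
    """Minimal sketch of a patched decoder-only forecaster.

    Hypothetical illustration of the general technique: patch the input
    series, embed patches as tokens, run a causal attention stack, and
    project each token to an output patch of future values.
    """

    def __init__(self, input_patch_len=32, output_patch_len=128,
                 d_model=256, n_heads=4, n_layers=2, max_patches=64):
        super().__init__()
        self.input_patch_len = input_patch_len
        self.embed = nn.Linear(input_patch_len, d_model)   # patch -> token
        self.pos = nn.Parameter(torch.zeros(1, max_patches, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        # An encoder stack used with a causal mask behaves as a decoder-only model.
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, output_patch_len)   # token -> forecast patch

    def forward(self, series):
        # series: (batch, context_len); context_len divisible by input_patch_len.
        b, t = series.shape
        n = t // self.input_patch_len
        patches = series.reshape(b, n, self.input_patch_len)
        tokens = self.embed(patches) + self.pos[:, :n]
        # Boolean causal mask: True entries are positions a token may NOT attend to,
        # so each patch token only sees itself and earlier patches.
        mask = torch.triu(
            torch.ones(n, n, dtype=torch.bool, device=series.device), diagonal=1)
        hidden = self.blocks(tokens, mask=mask)
        # The last token's projection is the forecast for the next horizon.
        return self.head(hidden[:, -1])


model = PatchedDecoderForecaster()
context = torch.randn(8, 512)    # 8 series, 512 past points (16 input patches)
forecast = model(context)        # (8, 128) future points in one forward pass
print(forecast.shape)
```

Making the output patch longer than the input patch, as in this sketch, is one plausible way to support long prediction horizons with fewer autoregressive decoding steps; varying the number of input patches at inference time is what would let such a model handle different history lengths.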

Similar Work