1. Gruver, N., et al. (2023). "Large Language Models are Zero-Shot Time Series Forecasters." Advances in Neural Information Processing Systems 36 (NeurIPS 2023). Section 1, paragraph 2, discusses the limitations of existing methods and the potential for large pre-trained models (i.e., foundational models) to be adapted for time-series forecasting tasks.
2. Zhou, T., et al. (2023). "One Fits All: A Generative Foundation Model for Time Series Analysis." Proceedings of the 40th International Conference on Machine Learning, PMLR 202, pp. 41933-41951. The abstract and introduction explicitly propose a "foundation model for time series analysis" to handle diverse datasets and tasks, including forecasting.
3. Rasul, K., et al. (2024). "Chronos: Learning the Language of Time Series." Amazon Science. This official publication introduces Chronos, a family of pre-trained foundational models designed specifically for time-series forecasting, demonstrating a direct industry application of this concept. Available at: https://www.amazon.science/publications/chronos-learning-the-language-of-time-series
4. Vaswani, A., et al. (2017). "Attention is All You Need." Advances in Neural Information Processing Systems 30 (NIPS 2017). This paper introduced the Transformer architecture, which is the core technology behind most modern foundational models and has been successfully adapted from its original NLP context to time-series forecasting.