Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
7-2024
Abstract
Deep learning for time series forecasting has traditionally operated within a one-model-per-dataset framework, limiting its potential to leverage the game-changing impact of large pre-trained models. The concept of universal forecasting, emerging from pre-training on a vast collection of time series datasets, envisions a single Large Time Series Model capable of addressing diverse downstream forecasting tasks. However, constructing such a model poses unique challenges specific to time series data: i) cross-frequency learning, ii) accommodating an arbitrary number of variates for multivariate time series, and iii) addressing the varying distributional properties inherent in large-scale data. To address these challenges, we present novel enhancements to the conventional time series Transformer architecture, resulting in our proposed Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai). Trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains, Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
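The masked-encoder idea named in the abstract can be illustrated with a minimal sketch. The PyTorch snippet below is an assumption for illustration only, not the paper's implementation: the class name MaskedEncoderSketch, the patch size, and the simple point-forecast head are hypothetical choices (the paper's Moirai additionally handles arbitrary numbers of variates, cross-frequency learning, and predicts a flexible mixture distribution rather than point values). It shows only the core mechanism: context patches are embedded as tokens, forecast-horizon positions are filled with a learnable mask token, and a Transformer encoder predicts the masked patches.

```python
# Minimal sketch (NOT the authors' implementation) of a masked-encoder
# forecaster: embed context patches, append learnable [mask] tokens for the
# horizon, encode jointly, and decode the masked positions into forecasts.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn


class MaskedEncoderSketch(nn.Module):
    def __init__(self, patch_size: int = 16, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        self.patch_size = patch_size
        self.embed = nn.Linear(patch_size, d_model)           # patch -> token
        self.mask_token = nn.Parameter(torch.zeros(d_model))  # learnable [mask]
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, patch_size)            # token -> patch

    def forward(self, context: torch.Tensor, horizon_patches: int) -> torch.Tensor:
        # context: (batch, context_len), context_len divisible by patch_size
        b, t = context.shape
        tokens = self.embed(context.view(b, t // self.patch_size, self.patch_size))
        masks = self.mask_token.expand(b, horizon_patches, -1)
        hidden = self.encoder(torch.cat([tokens, masks], dim=1))
        # decode only the masked (future) positions into forecast patches
        return self.head(hidden[:, -horizon_patches:]).reshape(b, -1)


model = MaskedEncoderSketch()
forecast = model(torch.randn(8, 64), horizon_patches=2)  # -> shape (8, 32)
```

In this toy version the head emits point forecasts for brevity; replacing it with a head that parameterizes a mixture distribution would be the natural step toward the varying distributional properties the abstract highlights.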
Keywords
Time series forecasting, Deep learning, Time series transformer
Discipline
Artificial Intelligence and Robotics
Publication
Proceedings of the 41st International Conference on Machine Learning (ICML 2024): Vienna, Austria, July 21-27
Volume
235
First Page
53140
Last Page
53164
Publisher
PMLR
City or Country
Vienna, Austria
Citation
WOO, Gerald; LIU, Chenghao; KUMAR, Akshat; XIONG, Caiming; SAVARESE, Silvio; and SAHOO, Doyen.
Unified training of universal time series forecasting transformers. (2024). Proceedings of the 41st International Conference on Machine Learning (ICML 2024): Vienna, Austria, July 21-27. 235, 53140-53164.
Available at: https://ink.library.smu.edu.sg/sis_research/9906
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.