Time Series Foundation Model

This repo tracks progress on Time Series Foundation Models (TSFMs).

Survey & Benchmark

2024

  • A Survey of Deep Learning and Foundation Models for Time Series Forecasting. John A. Miller (University of Georgia), Mohammed Aldosari, Farah Saeed, Nasid Habib Barna, Subas Rana, I. Budak Arpinar, and Ninghao Liu. link 🔗18
  • Foundation Models for Time Series Analysis: A Tutorial and Survey. Yuxuan Liang (The Hong Kong University of Science and Technology (Guangzhou)), Haomin Wen (Beijing Jiaotong University). link 🔗26

Work

2024

  • Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts. Xiaoming Shi (Princeton University), Shiyu Wang, Yuqi Nie, Dianqi Li, Zhou Ye, Qingsong Wen, and Ming Jin. link 🔗0 code
  • Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. Vijay Ekambaram (IBM Research), Arindam Jati, Nam H. Nguyen, Pankaj Dayama, Chandra Reddy, Wesley M. Gifford, and Jayant Kalagnanam. link 🔗7 code
  • Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting. Kashif Rasul, Arjun Ashok, Andrew Robert Williams, Hena Ghonia, Rishika Bhagwatkar, Arian Khorasani, Mohammad Javad Darvishi Bayazi, et al. link 🔗15 code
  • Unified Training of Universal Time Series Forecasting Transformers. Gerald Woo (Salesforce AI Research), Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, and Doyen Sahoo. link 🔗31 code
  • Chronos: Learning the Language of Time Series. Abdul Fatir Ansari (Amazon), Lorenzo Stella, Caner Turkmen, Xiyuan Zhang, et al. link 🔗46 code (see the zero-shot usage sketch after this list)
  • Moment: A Family of Open Time-Series Foundation Models. Mononito Goswami (Carnegie Mellon University), Konrad Szafer, Arjun Choudhry, Yifu Cai, Shuo Li, and Artur Dubrawski. link 🔗22 code
  • Timer: Generative Pre-trained Transformers Are Large Time Series Models. Yong Liu (Tsinghua University), Haoran Zhang, Chenyu Li, Xiangdong Huang, Jianmin Wang, and Mingsheng Long. link 🔗4 code
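
Several of the pretrained models above ship checkpoints that can be used for zero-shot forecasting out of the box. The snippet below is a minimal sketch using Chronos; it assumes the chronos-forecasting package, its ChronosPipeline API, and the amazon/chronos-t5-small checkpoint, so check the linked code repositories for the exact, current interfaces.

```python
# Minimal zero-shot forecasting sketch (assumes: pip install chronos-forecasting).
import torch
from chronos import ChronosPipeline

# Load a pretrained checkpoint (name assumed from the Chronos release).
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",
    torch_dtype=torch.float32,
)

# Historical context: a 1-D tensor of past observations.
context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])

# Sample probabilistic forecasts for the next 12 steps.
# Output shape: [num_series, num_samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=12)

# Point forecast and uncertainty band taken over the sample dimension.
low, median, high = torch.quantile(
    forecast[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0
)
print(median)
```

Other entries (e.g., Moirai, Lag-Llama, TimesFM) expose similar predict-from-context interfaces in their own packages; the overall pattern of "load checkpoint, pass context, sample a forecast" is the same.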

2023

  • Large Language Models Are Zero-Shot Time Series Forecasters. Nate Gruver (NYU), Marc Finzi (CMU), Shikai Qiu (NYU), and Andrew G. Wilson (NYU). link 🔗174 code (see the encoding sketch after this list)
  • A decoder-only foundation model for time-series forecasting. Abhimanyu Das (Google Research), Weihao Kong, Rajat Sen, and Yichen Zhou. link 🔗55 code
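
The Gruver et al. paper frames forecasting as next-token prediction over a text rendering of the series: values are rescaled, truncated to fixed precision, and written as digit strings that a general-purpose LLM can continue. The helper below is an illustrative sketch of that kind of encoding; the function names and the scaling choice are assumptions, not the paper's exact scheme.

```python
# Illustrative LLMTime-style encoding: turn a numeric series into a digit
# string an LLM can continue. Names and scaling are assumptions, not the
# paper's exact implementation.
from typing import List

def encode_series(values: List[float], precision: int = 2) -> str:
    """Rescale to a convenient range and render as comma-separated digit groups."""
    scale = max(abs(v) for v in values) or 1.0
    tokens = []
    for v in values:
        # Fixed-point representation without a decimal separator, with digits
        # spaced out so common tokenizers split them into single-digit tokens.
        fixed = round(v / scale * (10 ** precision))
        tokens.append(" ".join(str(abs(fixed))))
    return " , ".join(tokens)

def decode_value(token: str, scale: float, precision: int = 2) -> float:
    """Invert the encoding for a single generated token."""
    digits = token.replace(" ", "")
    return int(digits) / (10 ** precision) * scale

series = [0.61, 0.72, 0.84, 0.97, 1.10]
prompt = encode_series(series)
print(prompt)                                  # "5 5 , 6 5 , 7 6 , 8 8 , 1 0 0"
print(decode_value("1 0 5", scale=1.10))       # a continuation mapped back to 1.155
```

The prompt string would then be fed to an LLM, and the generated continuation decoded back into numeric forecasts.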

Dataset
