Vortex-AI-Group/Awesome-Time-Series-Papers


# Awesome-Time-Series-Papers

🥰 A curated collection of awesome papers for time series analysis.

## Time Series Deep Learning Models

| Title | Paper | Code |
| ----- | ----- | ---- |
| Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting | [paper] | [code] |
| Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting | [paper] | [code] |
| Pyraformer: Low-complexity Pyramidal Attention for Long-range Time Series Modeling and Forecasting | [paper] | [code] |
| FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting | [paper] | [code] |
| Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting | [paper] | [code] |
| ESTformer: Transformer Utilizing Spatiotemporal Dependencies for Electroencephalogram Super-resolution | [paper] | [code] |
| Less Is More: Fast Multivariate Time Series Forecasting with Light Sampling-oriented MLP Structures | [paper] | [code] |
| FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting | [paper] | [code] |
| Long-term Forecasting with TiDE: Time-series Dense Encoder | [paper] | [code] |
| A Time Series is Worth 64 Words: Long-term Forecasting with Transformers | [paper] | [code] |
| Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting | [paper] | [code] |
| Are Transformers Effective for Time Series Forecasting? | [paper] | [code] |
| TSMixer: An All-MLP Architecture for Time Series Forecasting | [paper] | [code] |
| iTransformer: Inverted Transformers Are Effective for Time Series Forecasting | [paper] | [code] |
| TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting | [paper] | [code] |
| SAMformer: Unlocking the Potential of Transformers in Time Series Forecasting with Sharpness-Aware Minimization and Channel-Wise Attention | [paper] | [code] |
| TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables | [paper] | [code] |

## Time Series Multi-Task Models

| Title | Paper | Code |
| ----- | ----- | ---- |
| TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis | [paper] | [code] |
| One Fits All: Power General Time Series Analysis by Pretrained LM | [paper] | [code] |
| Peri-midFormer: Periodic Pyramid Transformer for Time Series Analysis | [paper] | [code] |
| UniTS: A Unified Multi-Task Time Series Model | [paper] | [code] |
| Mitigating Data Scarcity in Time Series Analysis: A Foundation Model with Series-Symbol Data Generation | [paper] | [code] |

## Large Language Models for Time Series Analysis

| Title | Paper | Code |
| ----- | ----- | ---- |
| One Fits All: Power General Time Series Analysis by Pretrained LM | [paper] | [code] |
| Time-LLM: Time Series Forecasting by Reprogramming Large Language Models | [paper] | [code] |
| S2IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting | [paper] | [code] |
| A decoder-only foundation model for time-series forecasting | [paper] | [code] |

## Large Time Series Foundation Models

| Title | Paper | Code |
| ----- | ----- | ---- |
| TimeGPT-1 | [paper] | [code] |
| Chronos: Learning the Language of Time Series | [paper] | [code] |
| Unified Training of Universal Time Series Forecasting Transformers | [paper] | [code] |
| Timer: Generative Pre-trained Transformers Are Large Time Series Models | [paper] | [code] |
| MOMENT: A Family of Open Time-series Foundation Models | [paper] | [code] |
| Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts | [paper] | [code] |
| VisionTS: Visual Masked Autoencoders Are Free-Lunch Zero-Shot Time Series Forecasters | [paper] | [code] |
| Towards Neural Scaling Laws for Time Series Foundation Models | [paper] | [code] |
| Sundial: A Family of Highly Capable Time Series Foundation Models | [paper] | [code] |
