
Time Series Library (TSlib)

TSlib is an open-source library for deep learning researchers, especially deep time series analysis.

We provide a neat code base to evaluate advanced deep time series models or develop your own model, which covers five mainstream tasks: long- and short-term forecasting, imputation, anomaly detection, and classification.

Leaderboard for Time Series Analysis

As of February 2023, the top three models for the five tasks are:

| Model Ranking | Long-term Forecasting | Short-term Forecasting | Imputation | Anomaly Detection | Classification |
| --- | --- | --- | --- | --- | --- |
| 🥇 1st | TimesNet | TimesNet | TimesNet | TimesNet | TimesNet |
| 🥈 2nd | DLinear | Non-stationary Transformer | Non-stationary Transformer | Non-stationary Transformer | FEDformer |
| 🥉 3rd | Non-stationary Transformer | FEDformer | Autoformer | Informer | Autoformer |

Note: We will keep updating this leaderboard. If you have proposed an advanced model, you are welcome to send us your paper/code link or raise a pull request. We will add it to this repo and update the leaderboard as soon as possible.

Models compared in this leaderboard. ☑ means that its code is already included in this repo.

  • TimesNet - TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [ICLR 2023] [Code]
  • DLinear - Are Transformers Effective for Time Series Forecasting? [AAAI 2023] [Code]
  • LightTS - Less Is More: Fast Multivariate Time Series Forecasting with Light Sampling-oriented MLP Structures [arXiv 2022] [Code]
  • ETSformer - ETSformer: Exponential Smoothing Transformers for Time-series Forecasting [arXiv 2022] [Code]
  • Non-stationary Transformer - Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting [NeurIPS 2022] [Code]
  • FEDformer - FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting [ICML 2022] [Code]
  • Pyraformer - Pyraformer: Low-complexity Pyramidal Attention for Long-range Time Series Modeling and Forecasting [ICLR 2022] [Code]
  • Autoformer - Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [NeurIPS 2021] [Code]
  • Informer - Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting [AAAI 2021] [Code]
  • Reformer - Reformer: The Efficient Transformer [ICLR 2020] [Code]
  • Transformer - Attention is All You Need [NeurIPS 2017] [Code]

See our latest paper [TimesNet] for the comprehensive benchmark. We will release a real-time updated online version in March.

Newly added baselines. We will add them to the leaderboard after a comprehensive evaluation.

  • PatchTST - A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. [ICLR 2023] [Code]

Usage

  1. Install Python 3.8. For convenience, execute the following command:
pip install -r requirements.txt
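
If you prefer an isolated environment, a minimal sketch is shown below; the environment name tslib and the use of conda are illustrative assumptions, and any Python 3.8 environment works:

```bash
# create and activate a dedicated Python 3.8 environment (assumes conda is installed)
conda create -n tslib python=3.8 -y
conda activate tslib

# install the pinned dependencies of this repo
pip install -r requirements.txt
```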
  2. Prepare data. You can obtain the well-preprocessed datasets from [Google Drive] or [Tsinghua Cloud]. Then place the downloaded data under the folder ./dataset; a sketch of the expected layout is shown below.
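
The folder and file names in this sketch are inferred from the script paths used in the next step; the downloaded archives define the exact contents:

```text
dataset/
├── ETT-small/
│   ├── ETTh1.csv
│   └── ...            # other ETT files from the archive
├── m4/                 # short-term forecasting (M4) data
├── PSM/                # anomaly detection data
└── ...                 # remaining benchmark folders from the download
```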

  3. Train and evaluate models. We provide the experiment scripts for all benchmarks under the folder ./scripts/. You can reproduce the experiment results as in the following examples (a sketch of the underlying run.py call follows the list of commands):
# long-term forecast
bash ./scripts/long_term_forecast/ETT_script/TimesNet_ETTh1.sh
# short-term forecast
bash ./scripts/short_term_forecast/TimesNet_M4.sh
# imputation
bash ./scripts/imputation/ETT_script/TimesNet_ETTh1.sh
# anomaly detection
bash ./scripts/anomaly_detection/PSM/TimesNet.sh
# classification
bash ./scripts/classification/TimesNet.sh
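
Each of these scripts is a thin wrapper around run.py. A representative invocation is sketched below; the argument names follow the bundled scripts, but the concrete values are illustrative only, so consult the corresponding script (e.g. ./scripts/long_term_forecast/ETT_script/TimesNet_ETTh1.sh) for the exact settings:

```bash
# sketch: long-term forecasting on ETTh1 with TimesNet (hyper-parameter values are illustrative)
python -u run.py \
  --task_name long_term_forecast \
  --is_training 1 \
  --root_path ./dataset/ETT-small/ \
  --data_path ETTh1.csv \
  --model_id ETTh1_96_96 \
  --model TimesNet \
  --data ETTh1 \
  --features M \
  --seq_len 96 \
  --label_len 48 \
  --pred_len 96 \
  --enc_in 7 \
  --dec_in 7 \
  --c_out 7 \
  --des Exp \
  --itr 1
```
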
  4. Develop your own model.
  • Add the model file to the folder ./models. You can follow ./models/Transformer.py as a template; a minimal sketch is given after this list.
  • Include the newly added model in the Exp_Basic.model_dict of ./exp/exp_basic.py.
  • Create the corresponding scripts under the folder ./scripts.
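
As a sketch of what such a model file might look like: the class name Model and the forward signature below mirror the bundled models (see ./models/Transformer.py for the exact interface), while the file name MyModel.py and the toy linear backbone are purely illustrative assumptions:

```python
# ./models/MyModel.py -- hypothetical example; the file name and backbone are illustrative
import torch.nn as nn


class Model(nn.Module):
    """TSlib-style model: the class is named Model and is built from the parsed experiment configs."""

    def __init__(self, configs):
        super().__init__()
        self.task_name = configs.task_name
        self.pred_len = configs.pred_len
        # toy backbone: map the input window to the prediction window, channel by channel
        self.projection = nn.Linear(configs.seq_len, configs.pred_len)

    def forward(self, x_enc, x_mark_enc, x_dec, x_mark_dec, mask=None):
        # x_enc: [batch, seq_len, channels]
        if self.task_name in ("long_term_forecast", "short_term_forecast"):
            out = self.projection(x_enc.permute(0, 2, 1)).permute(0, 2, 1)
            return out  # [batch, pred_len, channels]
        raise NotImplementedError(f"task {self.task_name} is not implemented in this sketch")
```

The new file then gets an entry in Exp_Basic.model_dict (which maps the --model name to the model module), after which the scripts from the previous step can select it via --model.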

Citation

If you find this repo useful, please cite our paper.

@inproceedings{wu2023timesnet,
  title={TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis},
  author={Haixu Wu and Tengge Hu and Yong Liu and Hang Zhou and Jianmin Wang and Mingsheng Long},
  booktitle={International Conference on Learning Representations},
  year={2023},
}

Contact

If you have any questions or suggestions, feel free to contact us or describe them in Issues.

Acknowledgement

This library is built on the following repos:

All the experiment datasets are public and we obtain them from the following links:
