This is the original PyTorch implementation of Informer in the following paper:
Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Special thanks to Jieqi Peng
(@cookieminions) for building this repo.
Figure 1. The architecture of Informer.
- Python 3.6
- matplotlib == 3.1.1
- numpy == 1.19.4
- pandas == 0.25.1
- scikit_learn == 0.21.3
- torch == 1.4.0
Dependencies can be installed using the following command:
pip install -r requirements.txt
The ETT dataset used in the paper can be downloaded from the repo ETDataset.
The required data files should be put into the data/ETT/ folder. A demo slice of the ETT data is illustrated in the following figure. Note that the input of each dataset is zero-mean normalized in this implementation.
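The zero-mean normalization mentioned above can be sketched as follows. This is an illustrative standard-scaling helper, not necessarily the exact code used in this repo; the class and toy data are assumptions for demonstration.

```python
import numpy as np

class StandardScaler:
    """Illustrative scaler: normalize each feature (column) to zero mean
    and unit variance, as the model inputs are normalized here."""
    def fit(self, data):
        self.mean = data.mean(axis=0)
        self.std = data.std(axis=0)
        return self

    def transform(self, data):
        return (data - self.mean) / self.std

    def inverse_transform(self, data):
        # Map model outputs back to the original scale for evaluation.
        return data * self.std + self.mean

# Toy slice shaped like ETT data: (timesteps, features)
raw = np.array([[5.0, 2.0],
                [7.0, 4.0],
                [9.0, 6.0]])
scaler = StandardScaler().fit(raw)
normed = scaler.transform(raw)
print(normed.mean(axis=0))  # each feature now has mean ~0
print(normed.std(axis=0))   # and standard deviation ~1
```

`inverse_transform` is the usual counterpart for reporting forecasts in the original units.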
Figure 2. A demo of the ETT data.
Commands for training and testing the model with ProbSparse self-attention on the datasets ETTh1, ETTh2, and ETTm1, respectively:
# ETTh1
python -u main_informer.py --model informer --data ETTh1 --attn prob
# ETTh2
python -u main_informer.py --model informer --data ETTh2 --attn prob
# ETTm1
python -u main_informer.py --model informer --data ETTm1 --attn prob
For more information on the parameters, please refer to main_informer.py.
Figure 3. Univariate forecasting results.
Figure 4. Multivariate forecasting results.
If you find this repository useful in your research, please consider citing the following paper:
@inproceedings{haoyietal-informer-2021,
author = {Haoyi Zhou and
Shanghang Zhang and
Jieqi Peng and
Shuai Zhang and
Jianxin Li and
Hui Xiong and
Wancai Zhang},
title = {Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting},
booktitle = {The Thirty-Fifth {AAAI} Conference on Artificial Intelligence, {AAAI} 2021},
pages = {online},
publisher = {{AAAI} Press},
year = {2021},
}