unit8co / darts

A python library for user-friendly forecasting and anomaly detection on time series.

Home Page: https://unit8co.github.io/darts/

License: Apache License 2.0

Python 99.98% Dockerfile 0.02% Shell 0.01%
python time-series forecasting machine-learning deep-learning anomaly-detection data-science

darts's Introduction

Time Series Made Easy in Python


Darts is a Python library for user-friendly forecasting and anomaly detection on time series. It contains a variety of models, from classics such as ARIMA to deep neural networks. The forecasting models can all be used in the same way, using fit() and predict() functions, similar to scikit-learn. The library also makes it easy to backtest models, combine the predictions of several models, and take external data into account. Darts supports both univariate and multivariate time series and models. The ML-based models can be trained on potentially large datasets containing multiple time series, and some of the models offer rich support for probabilistic forecasting.

Darts also offers extensive anomaly detection capabilities. For instance, it is trivial to apply PyOD models on time series to obtain anomaly scores, or to wrap any of Darts' forecasting or filtering models to obtain fully fledged anomaly detection models.

Documentation

High Level Introductions
Articles on Selected Topics

Quick Install

We recommend first setting up a clean Python environment for your project with Python 3.8+ using your favorite tool (conda, venv, virtualenv with or without virtualenvwrapper).

Once your environment is set up you can install darts using pip:

pip install darts

For more details you can refer to our installation instructions.

Example Usage

Forecasting

Create a TimeSeries object from a Pandas DataFrame, and split it in train/validation series:

import pandas as pd
from darts import TimeSeries

# Read a pandas DataFrame
df = pd.read_csv("AirPassengers.csv", delimiter=",")

# Create a TimeSeries, specifying the time and value columns
series = TimeSeries.from_dataframe(df, "Month", "#Passengers")

# Set aside the last 36 months as a validation series
train, val = series[:-36], series[-36:]

Fit an exponential smoothing model, and make a (probabilistic) prediction over the validation series' duration:

from darts.models import ExponentialSmoothing

model = ExponentialSmoothing()
model.fit(train)
prediction = model.predict(len(val), num_samples=1000)

Plot the median, 5th and 95th percentiles:

import matplotlib.pyplot as plt

series.plot()
prediction.plot(label="forecast", low_quantile=0.05, high_quantile=0.95)
plt.legend()
darts forecast example
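
To quantify the forecast, one of Darts' metrics can be applied directly to the two series. A minimal sketch (mape() lives in darts.metrics; probabilistic forecasts are reduced to point forecasts internally):

from darts.metrics import mape

# Compare the probabilistic forecast against the held-out validation series
print(f"MAPE = {mape(val, prediction):.2f}%")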

Anomaly Detection

Load a multivariate series, trim it, keep two components, and split it into train and validation sets:

from darts.datasets import ETTh2Dataset

series = ETTh2Dataset().load()[:10000][["MUFL", "LULL"]]
train, val = series.split_before(0.6)

Build a k-means anomaly scorer, train it on the train set and use it on the validation set to get anomaly scores:

from darts.ad import KMeansScorer

scorer = KMeansScorer(k=2, window=5)
scorer.fit(train)
anom_score = scorer.score(val)

Build a binary anomaly detector and train it on the train scores, then use it on the validation scores to get a binary anomaly classification:

from darts.ad import QuantileDetector

detector = QuantileDetector(high_quantile=0.99)
detector.fit(scorer.score(train))
binary_anom = detector.detect(anom_score)

Plot (shifting and scaling some of the series to make everything appear on the same figure):

import matplotlib.pyplot as plt

series.plot()
(anom_score / 2. - 100).plot(label="computed anomaly score", c="orangered", lw=3)
(binary_anom * 45 - 150).plot(label="detected binary anomaly", lw=4)
darts anomaly detection example

Features

  • Forecasting Models: A large collection of forecasting models; from statistical models (such as ARIMA) to deep learning models (such as N-BEATS). See table of models below.

  • Anomaly Detection The darts.ad module contains a collection of anomaly scorers, detectors and aggregators, which can all be combined to detect anomalies in time series. It is easy to wrap any of Darts forecasting or filtering models to build a fully fledged anomaly detection model that compares predictions with actuals. The PyODScorer makes it trivial to use PyOD detectors on time series.

  • Multivariate Support: TimeSeries can be multivariate - i.e., contain multiple time-varying dimensions instead of a single scalar value. Many models can consume and produce multivariate series.

  • Multiple series training (global models): All machine learning based models (incl. all neural networks) support being trained on multiple (potentially multivariate) series. This can scale to large datasets too.

  • Probabilistic Support: TimeSeries objects can (optionally) represent stochastic time series; this can for instance be used to get confidence intervals, and many models support different flavours of probabilistic forecasting (such as estimating parametric distributions or quantiles). Some anomaly detection scorers are also able to exploit these predictive distributions.

  • Past and Future Covariates support: Many models in Darts support past-observed and/or future-known covariate (external data) time series as inputs for producing forecasts.

  • Static Covariates support: In addition to time-dependent data, TimeSeries can also contain static data for each dimension, which can be exploited by some models.

  • Hierarchical Reconciliation: Darts offers transformers to perform reconciliation. These can make the forecasts add up in a way that respects the underlying hierarchy.

  • Regression Models: It is possible to plug-in any scikit-learn compatible model to obtain forecasts as functions of lagged values of the target series and covariates.

  • Explainability: Darts has the ability to explain some forecasting models using Shap values.

  • Data processing: Tools to easily apply (and revert) common transformations on time series data (scaling, filling missing values, differencing, boxcox, ...)

  • Metrics: A variety of metrics for evaluating time series' goodness of fit; from R2-scores to Mean Absolute Scaled Error.

  • Backtesting: Utilities for simulating historical forecasts, using moving time windows.

  • PyTorch Lightning Support: All deep learning models are implemented using PyTorch Lightning, supporting among other things custom callbacks, GPUs/TPUs training and custom trainers.

  • Filtering Models: Darts offers three filtering models: KalmanFilter, GaussianProcessFilter, and MovingAverageFilter, which allow to filter time series, and in some cases obtain probabilistic inferences of the underlying states/values.

  • Datasets The darts.datasets submodule contains some popular time series datasets for rapid and reproducible experimentation.

Forecasting Models

Here's a breakdown of the forecasting models currently implemented in Darts. We are constantly working on bringing more models and features. (An example of training a single global model on multiple series follows the table.)

| Model | Sources | Target Series Support: Univariate / Multivariate | Covariates Support: Past-observed / Future-known / Static | Probabilistic Forecasting: Sampled / Distribution Parameters | Training & Forecasting on Multiple Series |
|---|---|---|---|---|---|
| Baseline Models (LocalForecastingModel) | | | | | |
| NaiveMean | | 🟩 🟩 | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| NaiveSeasonal | | 🟩 🟩 | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| NaiveDrift | | 🟩 🟩 | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| NaiveMovingAverage | | 🟩 🟩 | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| Statistical / Classic Models (LocalForecastingModel) | | | | | |
| ARIMA | | 🟩 πŸŸ₯ | πŸŸ₯ 🟩 πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| VARIMA | | πŸŸ₯ 🟩 | πŸŸ₯ 🟩 πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| AutoARIMA | | 🟩 πŸŸ₯ | πŸŸ₯ 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| StatsForecastAutoArima (faster AutoARIMA) | Nixtla's statsforecast | 🟩 πŸŸ₯ | πŸŸ₯ 🟩 πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| ExponentialSmoothing | | 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| StatsforecastAutoETS | Nixtla's statsforecast | 🟩 πŸŸ₯ | πŸŸ₯ 🟩 πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| StatsforecastAutoCES | Nixtla's statsforecast | 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| BATS and TBATS | TBATS paper | 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| Theta and FourTheta | Theta & 4 Theta | 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| StatsForecastAutoTheta | Nixtla's statsforecast | 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| Prophet | Prophet repo | 🟩 πŸŸ₯ | πŸŸ₯ 🟩 πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| FFT (Fast Fourier Transform) | | 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| KalmanForecaster, using the Kalman filter and N4SID for system identification | N4SID paper | 🟩 🟩 | πŸŸ₯ 🟩 πŸŸ₯ | 🟩 πŸŸ₯ | πŸŸ₯ |
| Croston method | | 🟩 πŸŸ₯ | πŸŸ₯ 🟩 πŸŸ₯ | πŸŸ₯ πŸŸ₯ | πŸŸ₯ |
| Global Baseline Models (GlobalForecastingModel) | | | | | |
| GlobalNaiveAggregate | | 🟩 🟩 | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | 🟩 |
| GlobalNaiveDrift | | 🟩 🟩 | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | 🟩 |
| GlobalNaiveSeasonal | | 🟩 🟩 | πŸŸ₯ πŸŸ₯ πŸŸ₯ | πŸŸ₯ πŸŸ₯ | 🟩 |
| Regression Models (GlobalForecastingModel) | | | | | |
| RegressionModel: generic wrapper around any sklearn regression model | | 🟩 🟩 | 🟩 🟩 🟩 | πŸŸ₯ πŸŸ₯ | 🟩 |
| LinearRegressionModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| RandomForest | | 🟩 🟩 | 🟩 🟩 🟩 | πŸŸ₯ πŸŸ₯ | 🟩 |
| LightGBMModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| XGBModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| CatBoostModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| PyTorch (Lightning)-based Models (GlobalForecastingModel) | | | | | |
| RNNModel (incl. LSTM and GRU); equivalent to DeepAR in its probabilistic version | DeepAR paper | 🟩 🟩 | πŸŸ₯ 🟩 πŸŸ₯ | 🟩 🟩 | 🟩 |
| BlockRNNModel (incl. LSTM and GRU) | | 🟩 🟩 | 🟩 πŸŸ₯ πŸŸ₯ | 🟩 🟩 | 🟩 |
| NBEATSModel | N-BEATS paper | 🟩 🟩 | 🟩 πŸŸ₯ πŸŸ₯ | 🟩 🟩 | 🟩 |
| NHiTSModel | N-HiTS paper | 🟩 🟩 | 🟩 πŸŸ₯ πŸŸ₯ | 🟩 🟩 | 🟩 |
| TCNModel | TCN paper, DeepTCN paper, blog post | 🟩 🟩 | 🟩 πŸŸ₯ πŸŸ₯ | 🟩 🟩 | 🟩 |
| TransformerModel | | 🟩 🟩 | 🟩 πŸŸ₯ πŸŸ₯ | 🟩 🟩 | 🟩 |
| TFTModel (Temporal Fusion Transformer) | TFT paper, PyTorch Forecasting | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| DLinearModel | DLinear paper | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| NLinearModel | NLinear paper | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| TiDEModel | TiDE paper | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| TSMixerModel | TSMixer paper, PyTorch Implementation | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| Ensemble Models (GlobalForecastingModel): model support depends on the ensembled forecasting models and the ensemble model itself | | | | | |
| NaiveEnsembleModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
| RegressionEnsembleModel | | 🟩 🟩 | 🟩 🟩 🟩 | 🟩 🟩 | 🟩 |
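
To illustrate the "Training & Forecasting on Multiple Series" column, global models accept lists of TimeSeries in fit(). A minimal sketch following the documented fit()/predict() pattern (the two datasets are arbitrary stand-ins for a real multi-series use case):

from darts.datasets import AirPassengersDataset, MonthlyMilkDataset
from darts.models import LinearRegressionModel

# Two unrelated univariate series, used here only to illustrate the API
air = AirPassengersDataset().load()
milk = MonthlyMilkDataset().load()

# One global model trained jointly on both series
model = LinearRegressionModel(lags=24)
model.fit([air, milk])

# When a model was fit on several series, predict() needs to be told which series to continue
forecast_air = model.predict(n=12, series=air)
forecast_milk = model.predict(n=12, series=milk)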

Community & Contact

Anyone is welcome to join our Gitter room to ask questions, make proposals, discuss use-cases, and more. If you spot a bug or have suggestions, GitHub issues are also welcome.

If what you want to tell us is not suitable for Gitter or GitHub, feel free to send us an email at [email protected] for darts-related matters or [email protected] for any other inquiries.

Contribute

The development is ongoing, and we welcome suggestions, pull requests and issues on GitHub. All contributors will be acknowledged on the change log page.

Before working on a contribution (a new feature or a fix), check our contribution guidelines.

Citation

If you are using Darts in your scientific work, we would appreciate citations to the following JMLR paper.

Darts: User-Friendly Modern Machine Learning for Time Series

Bibtex entry:

@article{JMLR:v23:21-1177,
  author  = {Julien Herzen and Francesco Lässig and Samuele Giuliano Piazzetta and Thomas Neuer and Léo Tafti and Guillaume Raille and Tomas Van Pottelbergh and Marek Pasieka and Andrzej Skrodzki and Nicolas Huguenin and Maxime Dumonal and Jan Kościsz and Dennis Bader and Frédérick Gusset and Mounir Benheddi and Camila Williamson and Michal Kosinski and Matej Petrik and Gaël Grosch},
  title   = {Darts: User-Friendly Modern Machine Learning for Time Series},
  journal = {Journal of Machine Learning Research},
  year    = {2022},
  volume  = {23},
  number  = {124},
  pages   = {1-6},
  url     = {http://jmlr.org/papers/v23/21-1177.html}
}

darts's People

Contributors

borda, brunnedu, camilaagw, davidkleindienst, dennisbader, droxef, dumjax, eliane-maalouf, endrjuskr, felixdivo, gdevos010, gnwhr, guillaumeraille, hrzn, incubatorshokuhou, inverniz, kostiiii, leotafti, mabilton, madtoinou, mkos, mounben, pennfranc, piaz97, radujica, rijkvandermeulen, themp, tneuer, tomasvanpottelbergh, xfiderek


darts's Issues

ARIMA / AutoARIMA : Support exogenous variables

Is your feature request related to a current problem? Please describe.
Nope

Describe proposed solution
I LOVE this package and think it's very well-written! I'd love to add covariates for forecasting like in some other packages to go from ARIMA to ARIMAX. I'm assuming this would be an easy addition since I've seen it done elsewhere. Thanks!


Transformer Forecasting Model

With the recent popularity of GPT-3, it got me thinking about how a Transformer model could be applied to time-series forecasting.
Could this sort of model be put on the 'wishlist' for Darts?

There was a discussion on Kaggle recently about this and it links to more information about the subject.
https://www.kaggle.com/c/m5-forecasting-accuracy/discussion/142833

I believe the attention mechanism of the model is capable of discovering certain patterns that other models struggle with, and this could be a better model for certain domains of time-series forecasting.

There is a pytorch implementation here:
https://github.com/maxjcohen/transformer

a recent paper on the subject:
Deep Transformer Models for Time Series Forecasting:The Influenza Prevalence Case
https://arxiv.org/pdf/2001.08317

`backtest()`: Retrain every n steps

Recently found your library and it is really great and handy!
The evaluation of the forecasting performance is best done using a cross-validation approach. The backtest_forecasting() function does that - although it currently iterates and re-trains the model at every single time step. In my application, I am training tens of thousands of different time series, and it becomes computationally unfeasible to retrain at every time step. Another approach already implemented in scikit-learn is TimeSeriesSplit(), which generates a k-fold cross-validation split designed for time series.
https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.TimeSeriesSplit.html

One solution to the issue would be to add support for different cross-validation techniques implemented in scikit-learn (such as TimeSeriesSplit()). Another solution I see could be to add a stride parameter in backtest_forecasting() that specifies how many time steps it should skip while backtesting.
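
For reference, recent Darts versions address this through the stride (and retrain) arguments of historical_forecasts()/backtest(); a minimal sketch under that assumption:

from darts.datasets import AirPassengersDataset
from darts.models import ExponentialSmoothing

series = AirPassengersDataset().load()
model = ExponentialSmoothing()

# Move the forecasting point forward by 12 steps between iterations,
# instead of re-training and forecasting at every single time step
historical = model.historical_forecasts(
    series, start=0.75, forecast_horizon=3, stride=12, retrain=True
)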

ForecastingModel.backtest() fails if forecast_horizon isn't provided

if hasattr(n, 'forecast_horizon'):
    raise_if_not(n.forecast_horizon > 0, 'The provided forecasting horizon must be a positive integer.', logger)
# check start parameter
if hasattr(n, 'start'):
    if isinstance(n.start, float):
        raise_if_not(n.start >= 0.0 and n.start < 1.0, '`start` should be between 0.0 and 1.0.', logger)
    elif isinstance(n.start, pd.Timestamp):
        if (hasattr(n, 'trim_to_series') and n.trim_to_series) or not hasattr(n, 'trim_to_series'):
            raise_if_not(n.start + training_series.freq() * n.forecast_horizon in training_series,

n.forecast_horizon at line 157 fails when hasattr(n, 'forecast_horizon') == False, i.e. every time no explicit value is passed when calling backtest().

Error produced:

AttributeError: 'types.SimpleNamespace' object has no attribute 'forecast_horizon'

Update model?

Is your feature request related to a current problem? Please describe.
Is it possible to update an existing model after it has been trained? For example, a TCN model?

Additional context
As new training data arrives, I would like to be able to pickup the training from the last time.

Thanks for any advice

BTW, this package is really nice! So easy to use and intuitive :)

Latest pmdarima (1.7.1) requires statsmodels (>=0.11,<0.12)

Latest pmdarima (1.7.1) requires statsmodels >=0.11,<0.12, apparently because the latest statsmodels (0.12) has bugs.

If users follow the official instructions for setting up Docker, they will end up with the latest statsmodels (0.12) installed.
So, should we fix requirements/main.txt like this?

- statsmodels>=0.11.1
+ statsmodels>=0.11.1, <0.12 

https://github.com/unit8co/darts/blob/master/requirements/main.txt

I would like to get your opinion. Thanks!

Conda-forge recipe request

Is your feature request related to a current problem? Please describe.
I'd like to be able to conda install darts

Describe proposed solution
Create a conda-forge recipe and link it to the PyPi package or GitHub release.

Describe potential alternatives
Pip installing to a conda environment works but PyPi and conda packages don't always play nice together when upgrading.

[BUG] Unittest may fail depending on the version of pmdarima and statsmodels

Describe the bug

Unittest may fail depending on the version of pmdarima and statsmodels.

System (please complete the following information)

  • Python version : 3.8.2
  • darts version : 0.2.3
  • OS : Ubuntu 20.04.1 LTS (Focal Fossa)

To Reproduce

I tried 3 combinations of the following.

| pmdarima | statsmodels | results |
|---|---|---|
| 1.6.1 (not latest) | 0.11.1 (not latest) | OK |
| 1.6.1 (not latest) | 0.12.0 (latest) | test_autoregression_models.py fails because of Exponential smoothing model |
| 1.7.1 (latest) | 0.11.1 (not latest) | test_autoregression_models.py fails because of Auto-ARIMA model |

I didn't try the case that both libraries are latest version (pmdarima 1.7.1, statsmodels 0.12) because pmdarima 1.7.1 requires statsmodels<0.12. Please see also #194.

Suggestion & Question

This error is likely caused by the threshold in this section.
Does anyone know how this threshold is determined?
Also, do you think it is reasonable to change the thresholds so that no errors occur?
I would like to get your opinions.

log details

pmdarima==1.6.1(not latest) statsmodels==0.11.1(not latest)

MY_DIR/darts$ python3 -m pip install -q pmdarima==1.6.1 statsmodels==0.11.1
MY_DIR/darts$ python3 -m unittest
./home/norihitoishida/.local/lib/python3.8/site-packages/torch/storage.py:34: FutureWarning: pickle support for Storage will be removed in 1.5. Use `torch.save` instead
  warnings.warn("pickle support for Storage will be removed in 1.5. Use `torch.save` instead", FutureWarning)
loading checkpoint_3.pth.tar
....loading checkpoint_3.pth.tar
......../home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters.py:743: ConvergenceWarning: Optimization failed to converge. Check mle_retvals.
  warn("Optimization failed to converge. Check mle_retvals.",
/home/norihitoishida/.local/lib/python3.8/site-packages/Cython/Build/Inline.py:4: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
  import imp
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/statespace/sarimax.py:975: UserWarning: Non-invertible starting MA parameters found. Using zeros as starting parameters.
  warn('Non-invertible starting MA parameters found.'
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/stattools.py:568: FutureWarning: fft=True will become the default in a future version of statsmodels. To suppress this warning, explicitly set fft=False.
  warnings.warn(
./home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters.py:743: ConvergenceWarning: Optimization failed to converge. Check mle_retvals.
  warn("Optimization failed to converge. Check mle_retvals.",
/home/norihitoishida/git/darts/darts/models/theta.py:108: RuntimeWarning: invalid value encountered in double_scalars
  drift = self.coef * np.array([i + (1 - (1 - self.alpha) ** self.length) / self.alpha for i in range(0, n)])
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/stattools.py:568: FutureWarning: fft=True will become the default in a future version of statsmodels. To suppress this warning, explicitly set fft=False.
  warnings.warn(
./home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters.py:743: ConvergenceWarning: Optimization failed to converge. Check mle_retvals.
  warn("Optimization failed to converge. Check mle_retvals.",
/home/norihitoishida/.local/lib/python3.8/site-packages/torch/storage.py:34: FutureWarning: pickle support for Storage will be removed in 1.5. Use `torch.save` instead
  warnings.warn("pickle support for Storage will be removed in 1.5. Use `torch.save` instead", FutureWarning)
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 17/17 [00:00<00:00, 201.50it/s]
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 17/17 [00:00<00:00, 179.40it/s]
.../home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/stattools.py:568: FutureWarning: fft=True will become the default in a future version of statsmodels. To suppress this warning, explicitly set fft=False.
  warnings.warn(
................................................/home/norihitoishida/git/darts/darts/tests/test_timeseries.py:410: DeprecationWarning: The default dtype for empty Series will be 'object' instead of 'float64' in a future version. Specify a dtype explicitly to silence this warning.
  TimeSeries.from_series(pd.Series(), freq='D')
................................................
----------------------------------------------------------------------
Ran 118 tests in 37.855s

OK

pmdarima==1.6.1(not latest) statsmodels==0.12.0(latest)

MY_DIR/darts$ python3 -m pip install -q pmdarima==1.6.1 statsmodels==0.12.0
MY_DIR/darts$ python3 -m unittest
./home/norihitoishida/.local/lib/python3.8/site-packages/torch/storage.py:34: FutureWarning: pickle support for Storage will be removed in 1.5. Use `torch.save` instead
  warnings.warn("pickle support for Storage will be removed in 1.5. Use `torch.save` instead", FutureWarning)
loading checkpoint_3.pth.tar
....loading checkpoint_3.pth.tar
......../home/norihitoishida/git/darts/darts/models/exponential_smoothing.py:62: FutureWarning: the 'damped'' keyword is deprecated, use 'damped_trend' instead
  hw_model = hw.ExponentialSmoothing(series.values(),
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters/model.py:427: FutureWarning: After 0.13 initialization must be handled at model creation
  warnings.warn(
F/home/norihitoishida/.local/lib/python3.8/site-packages/Cython/Build/Inline.py:4: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
  import imp
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/arima_model.py:472: FutureWarning: 
statsmodels.tsa.arima_model.ARMA and statsmodels.tsa.arima_model.ARIMA have
been deprecated in favor of statsmodels.tsa.arima.model.ARIMA (note the .
between arima and model) and
statsmodels.tsa.SARIMAX. These will be removed after the 0.12 release.

statsmodels.tsa.arima.model.ARIMA makes use of the statespace framework and
is both well tested and maintained.

To silence this warning and continue using ARMA and ARIMA until they are
removed, use:

import warnings
warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARMA',
                        FutureWarning)
warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARIMA',
                        FutureWarning)

  warnings.warn(ARIMA_DEPRECATION_WARN, FutureWarning)
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters/model.py:427: FutureWarning: After 0.13 initialization must be handled at model creation
  warnings.warn(
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/stattools.py:662: FutureWarning: fft=True will become the default after the release of the 0.12 release of statsmodels. To suppress this warning, explicitly set fft=False.
  warnings.warn(
./home/norihitoishida/git/darts/darts/models/exponential_smoothing.py:62: FutureWarning: the 'damped'' keyword is deprecated, use 'damped_trend' instead
  hw_model = hw.ExponentialSmoothing(series.values(),
/home/norihitoishida/.local/lib/python3.8/site-packages/torch/storage.py:34: FutureWarning: pickle support for Storage will be removed in 1.5. Use `torch.save` instead
  warnings.warn("pickle support for Storage will be removed in 1.5. Use `torch.save` instead", FutureWarning)
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 17/17 [00:00<00:00, 199.28it/s]
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 17/17 [00:00<00:00, 196.29it/s]
.../home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/stattools.py:662: FutureWarning: fft=True will become the default after the release of the 0.12 release of statsmodels. To suppress this warning, explicitly set fft=False.
  warnings.warn(
................................................/home/norihitoishida/git/darts/darts/tests/test_timeseries.py:410: DeprecationWarning: The default dtype for empty Series will be 'object' instead of 'float64' in a future version. Specify a dtype explicitly to silence this warning.
  TimeSeries.from_series(pd.Series(), freq='D')
................................................
======================================================================
FAIL: test_models_performance (darts.tests.test_autoregression_models.AutoregressionModelsTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/norihitoishida/git/darts/darts/tests/test_autoregression_models.py", line 59, in test_models_performance
    self.assertTrue(current_mape < max_mape, "{} model exceeded the maximum MAPE of {}."
AssertionError: False is not true : Exponential smoothing model exceeded the maximum MAPE of 4.8.with a MAPE of 5.5900253625464575

----------------------------------------------------------------------
Ran 118 tests in 39.151s

FAILED (failures=1)

pmdarima==1.7.1(latest) statsmodels==0.11.1(not latest)

MY_DIR/darts$ python3 -m pip install -q pmdarima==1.7.1 statsmodels==0.11.1
MY_DIR/darts$ python3 -m unittest
./home/norihitoishida/.local/lib/python3.8/site-packages/torch/storage.py:34: FutureWarning: pickle support for Storage will be removed in 1.5. Use `torch.save` instead
  warnings.warn("pickle support for Storage will be removed in 1.5. Use `torch.save` instead", FutureWarning)
loading checkpoint_3.pth.tar
....loading checkpoint_3.pth.tar
......../home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters.py:743: ConvergenceWarning: Optimization failed to converge. Check mle_retvals.
  warn("Optimization failed to converge. Check mle_retvals.",
/home/norihitoishida/.local/lib/python3.8/site-packages/Cython/Build/Inline.py:4: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
  import imp
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/base/model.py:567: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
  warn("Maximum Likelihood optimization failed to converge. "
F/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters.py:743: ConvergenceWarning: Optimization failed to converge. Check mle_retvals.
  warn("Optimization failed to converge. Check mle_retvals.",
/home/norihitoishida/git/darts/darts/models/theta.py:108: RuntimeWarning: invalid value encountered in double_scalars
  drift = self.coef * np.array([i + (1 - (1 - self.alpha) ** self.length) / self.alpha for i in range(0, n)])
/home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/stattools.py:568: FutureWarning: fft=True will become the default in a future version of statsmodels. To suppress this warning, explicitly set fft=False.
  warnings.warn(
./home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/holtwinters.py:743: ConvergenceWarning: Optimization failed to converge. Check mle_retvals.
  warn("Optimization failed to converge. Check mle_retvals.",
/home/norihitoishida/.local/lib/python3.8/site-packages/torch/storage.py:34: FutureWarning: pickle support for Storage will be removed in 1.5. Use `torch.save` instead
  warnings.warn("pickle support for Storage will be removed in 1.5. Use `torch.save` instead", FutureWarning)
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 17/17 [00:00<00:00, 200.28it/s]
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 17/17 [00:00<00:00, 198.92it/s]
.../home/norihitoishida/.local/lib/python3.8/site-packages/statsmodels/tsa/stattools.py:568: FutureWarning: fft=True will become the default in a future version of statsmodels. To suppress this warning, explicitly set fft=False.
  warnings.warn(
................................................/home/norihitoishida/git/darts/darts/tests/test_timeseries.py:410: DeprecationWarning: The default dtype for empty Series will be 'object' instead of 'float64' in a future version. Specify a dtype explicitly to silence this warning.
  TimeSeries.from_series(pd.Series(), freq='D')
................................................
======================================================================
FAIL: test_models_performance (darts.tests.test_autoregression_models.AutoregressionModelsTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/norihitoishida/git/darts/darts/tests/test_autoregression_models.py", line 59, in test_models_performance
    self.assertTrue(current_mape < max_mape, "{} model exceeded the maximum MAPE of {}."
AssertionError: False is not true : Auto-ARIMA model exceeded the maximum MAPE of 13.7.with a MAPE of 30.32851040186142

----------------------------------------------------------------------
Ran 118 tests in 41.443s

FAILED (failures=1)

backtests' example: AttributeError: 'numpy.float64' object has no attribute 'width'

When I was trying to follow the article backtests' example

https://medium.com/unit8-machine-learning-publication/darts-time-series-made-easy-in-python-5ac2947a8878

I got this error

AttributeError                            Traceback (most recent call last)
<ipython-input-14-9213dab113d8> in <module>
      2 series.plot(label='data')
      3 for i, m in enumerate(models):
----> 4     err = mape(backtests[i], series)
      5     backtests[i].plot(lw=3, label='{}, MAPE={:.2f}%'.format(m, err))
      6 plt.title('Backtests with 3-months forecast horizon')

~/.pyenv/versions/3.8.3/envs/keras/lib/python3.8/site-packages/darts/metrics/metrics.py in wrapper_multivariate_support(*args, **kwargs)
     31         series2 = kwargs['series2'] if 'series2' in kwargs else args[0] if 'series1' in kwargs else args[1]
     32 
---> 33         raise_if_not(series1.width == series2.width, "The two TimeSeries instances must have the same width.", logger)
     34 
     35         num_series_in_args = int('series1' not in kwargs) + int('series2' not in kwargs)

AttributeError: 'numpy.float64' object has no attribute 'width'

To Reproduce

I used this part of the code

import pandas as pd
import matplotlib.pyplot as plt

from darts.models import ExponentialSmoothing, Prophet

models = [ExponentialSmoothing(), Prophet()]
backtests = [model.backtest(series,
                            start=pd.Timestamp('19550101'),
                            forecast_horizon=3)
             for model in models]

from darts.metrics import mape
series.plot(label='data')
for i, m in enumerate(models):
    err = mape(backtests[i], series)
    backtests[i].plot(lw=3, label='{}, MAPE={:.2f}%'.format(m, err))
plt.title('Backtests with 3-months forecast horizon')
plt.legend()

Expected behavior
I would like to get the article's plot and results.

System (please complete the following information):
u8darts==0.5.0
fbprophet==0.6
Python 3.8.3

Additional context

I don't know if this message is expected but in this part of the code

backtests = [model.backtest(series,
                            start=pd.Timestamp('19550101'),
                            forecast_horizon=3)
             for model in models]

I get this message many times

INFO:fbprophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.

Implement `map` method for TimeSeries class

Missing functionality
Although adding or multiplying a scalar value to every entry in a TimeSeries instance is very easy, when it comes to more complex element-wise operations (e.g. taking the sine function value of every entry in the series) there is no easy way to create a new TimeSeries instance, given a TimeSeries instance and operation.

Proposed solution
The most straightforward way to implement this functionality would be to implement a map function that takes as arguments a TimeSeries and a function, and returns a new TimeSeries instance (pretty much like the built-in Python function).
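
For reference, TimeSeries.map() was added in later Darts releases and covers this request; a minimal usage sketch:

import numpy as np
from darts.datasets import AirPassengersDataset

series = AirPassengersDataset().load()

# Apply an arbitrary element-wise function to every entry of the series
sine_series = series.map(np.sin)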

seq2seq

Add seq2seq model to the library.

Installation failed: MAX_TORCH_SEED_VALUE too high and specific versions for pytorch and fbprophet needed

Install darts on windows

  1. Following the install guide leads to "ERROR: No matching distribution found for torch==1.5.1; extra == "all" (from u8darts[all])"
    You need to specify the versions:
    conda install -c conda-forge -c pytorch pip fbprophet==0.6 pytorch==1.5.1 cpuonly

  2. When fitting the model in "TCN-examples.ipynb", the following error pops up:

File "C:\Users\JB\miniconda3\envs\darts2\lib\site-packages\darts\utils\torch.py", line 62, in decorator
    manual_seed(self._random_instance.randint(0, high=MAX_TORCH_SEED_VALUE))
  File "mtrand.pyx", line 743, in numpy.random.mtrand.RandomState.randint
  File "_bounded_integers.pyx", line 1343, in numpy.random._bounded_integers._rand_int32
ValueError: high is out of bounds for int32

I changed the code in "...\site-packages\darts\utils\torch.py" at line 17:

#MAX_TORCH_SEED_VALUE = (1 << 63) - 1
MAX_TORCH_SEED_VALUE = 2147483647

After this change it works. It seems that the results of model training are not affected.

System (please complete the following information):

  • Win 10
  • Python version: 3.7
  • darts version 0.4.0

Make setup lighter

Is your feature request related to a current problem? Please describe.
Currently the default setup has ~2.4 GB of dependencies. This makes it unusable for AWS lambdas or other resource-constrained environments.

Describe proposed solution
Change the setup so that the heaviest dependencies can be optionally installed:
Eg:

pip install u8darts[core] # Bare minimum
pip install u8darts[torch] # ~1.5G (1,4G torch +74M caffe2 + (14+3.4)M tensorboard)
pip install u8darts[fbprophet] # ~500M (31M  fbprophet + 474M pystan)
pip install u8darts[statsmodels] # 42M 

[BUG] TqdmDeprecationWarning

When fitting an LSTM model I get the following warning, which I guess can be overcome by an update of the library. Nothing serious, as everything runs so far. It just could become an issue in the 5.0 release of tqdm.

lib/python3.6/site-packages/darts/utils/utils.py:74: TqdmDeprecationWarning:

This function will be removed in tqdm==5.0.0
Please use tqdm.notebook.tqdm instead of tqdm.tqdm_notebook

  • Python version: [e.g. 3.6]
  • darts version [e.g. 0.2.1]

Accept Prophet with `growth='logistic'`

Problem
To use Prophet with growth='logistic', the dataframe provided must have a column named 'cap'.

Describe proposed solution
Accept 'cap' and 'floor' as arrays provided as parameters on the model initialization.

Describe potential alternatives
Do not consider TimeSeries to be always multivariate and rename the columns in order.

u8darts on Torch v1.7

Is it possible to use u8darts with torch v1.7?

I have the latest torch installed, and when I tried to install 'u8darts[all]', the error message shows:

ERROR: Could not find a version that satisfies the requirement torch==1.5.1;

Regards

Plot in-sample forecast?

Hello, is there a way to plot in-sample forecast after having fitted a model?
The backtest() and historical_forecasts() are of course more rigorous for proper backtesting, but I would sometimes simply like to see how the model performs in-sample before moving on to more sophisticated analysis.
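
One workaround with the current API (a hedged sketch; it only works for models that support retrain=False, such as regression or neural-network models): fit the model once, then call historical_forecasts() with retrain=False over the training range to obtain quasi in-sample forecasts.

from darts.datasets import AirPassengersDataset
from darts.models import LinearRegressionModel

series = AirPassengersDataset().load()
model = LinearRegressionModel(lags=12)
model.fit(series)

# 1-step-ahead forecasts over the training range, without re-fitting the model
in_sample = model.historical_forecasts(series, start=0.1, forecast_horizon=1, retrain=False)

series.plot(label="actual")
in_sample.plot(label="in-sample 1-step forecasts")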

Outlier detection

Problem.
Currently there is no functionality to spot outliers of specific entries within a given TimeSeries instance.

Proposed solution
One way to spot outliers could be to use one or more of the existing forecasting models (possibly as an ensemble) to produce a historic forecast, and look at the deviations of the forecasts from the actual data. Large deviations might be indicators for outliers.
Inspiration: https://towardsdatascience.com/anomaly-detection-with-time-series-forecasting-c34c6d04b24a

Use on data with regular breaks?

Hello - I recently discovered your library and it is very intriguing!

I frequently work with time series data which has regular breaks, during which there are no observed measurements (e.g., during the winter). However, when I try the darts library on this data, I get NaN MAPE error results, presumably due to these irregular intervals, as the forecasting methods don't know what to do with the gap periods.

I've tried using auto_fillna to fill the gaps for these time periods, which has successfully gotten the package to work, but I'm concerned that the errors and calibrations are biased, since the models are now being fit on imputed data, which makes up ~50% of observations in many cases. So when I try to compare models, half the resulting MAPE score is a function of which model is best at handling the imputed data.

Do you have a recommended workaround here? Any way to get the models to not score the time periods without actual observed data?

Thanks!

Prophet return every prediction value

Is your feature request related to a current problem? Please describe.
In order to create outlier detection with Prophet, I need the full dataframe that Prophet returns.

Describe proposed solution
Remove the hardcoded ["yhat"] from Prophet.predict and add a variable asking whether to return just yhat or all the predictions: 'yhat_lower', 'yhat_upper', etc.
https://unit8co.github.io/darts/_modules/darts/models/prophet.html#Prophet.predict

Thanks very much! very useful framework

Documentation: comparison to sktime and tslearn

Is your feature request related to a current problem? Please describe.
There are many libraries for time-series forecasting. They have strong and weak points. It's hard to understand the strong and weak points of Darts if you are not a contributor of Darts.

Describe proposed solution
It would be good if the authors described use-cases where this library is the best choice. It would be good to compare the strong and weak points of Darts with two popular Python libraries for time-series forecasting: https://github.com/alan-turing-institute/sktime and https://github.com/tslearn-team/tslearn (around 1.5k stars each).

`inverse_transform` and multivariate time series

DataTransformer.inverse_transform isn’t easy to use with multivariate time series in a β€œreal” use case situation. I think a very common use case would be to:

  1. transform the multivariate time series (e.g. re-scale using scaler)
  2. make a prediction for a single main target variable: say forecast holds that prediction
  3. inverse transform back to the original scale

For 3. I'd like to be able to call scaler.inverse_transform(forecast), but there's a dimension mismatch: inverse_transform() expects a multivariate time series here (in particular a multivariate time series of the same width it was fitted with). The scaler cannot know which variable(s) the forecast corresponds to.

There are workarounds, but I couldn't come up with a clean and easy way… Can someone think of one?

Add an example notebook for the FB Prophet model

Is your feature request related to a current problem? Please describe.
Currently, I couldn't find an example demonstrating fbprophet usage through darts. I am opening this issue to create an example notebook for Prophet usage.

Describe proposed solution
Adding a new jupyter notebook to examples folder to show the usage of fbprophet with different seasonal frequencies & holidays.

Visualisation with plotly / plotly express

Is there any way to visualise a forecast (and the original data) with plotly or plotly express instead of matplotlib?

import pandas as pd
from darts.models import ExponentialSmoothing

# `series` is an existing TimeSeries
train, val = series.split_before(pd.Timestamp('2020-05-01'))
model = ExponentialSmoothing()
model.fit(train)
prediction = model.predict(len(val))

Is there any way to store the forecast values in a dataframe?
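
A hedged sketch building on the snippet above: a TimeSeries can be converted to a pandas DataFrame with pd_dataframe(), after which plotly (or any other plotting library) and regular DataFrame tooling can be used:

import plotly.express as px

# Forecast values as a pandas DataFrame (the time index becomes the DataFrame index)
pred_df = prediction.pd_dataframe()

# Plot with plotly express instead of matplotlib
px.line(pred_df).show()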

[BUG] ForecastingModel.backtest: Can bypass sanity checks

Describe the bug
Invalid optional arguments to ForecastingModel.backtest can bypass sanity checks when provided as positional arguments

To Reproduce

NaiveDrift().backtest(linear_series, None, start=0.7, forecast_horizon=-1)
NaiveDrift().backtest(linear_series, None, 0.7, -1)

Expected behavior
The first line above is caught by the sanity checks and raises ValueError: 'The provided forecasting horizon must be a positive integer.' as expected.

For the second line, however, the negative forecast_horizon param clears the sanity checks and produces an error further down the line:

NaiveDrift().backtest(linear_series, None, 0.7, -1)
  File ".../darts/utils/utils.py", line 120, in sanitized_method
    return method_to_sanitize(self, *args, **kwargs)
  File ".../darts/models/forecasting_model.py", line 217, in backtest
    pred = self.predict(forecast_horizon, **predict_kwargs)
  File ".../darts/models/baselines.py", line 98, in predict
    forecast = np.linspace(last, last_value, num=n)
  File "<__array_function__ internals>", line 5, in linspace
  File ".../lib/python3.8/site-packages/numpy/core/function_base.py", line 115, in linspace
    raise ValueError("Number of samples, %s, must be non-negative." % num)
ValueError: Number of samples, -1, must be non-negative.

System (please complete the following information):

  • Python version: 3.8.5
  • darts version: develop branch, commit #2977f4f

Transform function for TimeSeries

Describe proposed solution
A new function in the TimeSeries class, named transform, taking a function as a parameter and applying it to the TimeSeries itself. For reference, see pandas' pipe().

signature:
transform(self, function, *args, **kwargs) -> TimeSeries

[BUG] Inverse transform does not work on series of length < 3

First off, I want to say that I'm not an ML engineer at all. We have our own custom timeseries forecasting code that our ML engineers wrote. I stumbled upon darts, and we are very excited by the prospect of using it, given that we can use so many different kinds of models to generate our predictions. Our problem statement is as follows: We know the weekly consumption of our widgets by customers, and want to predict how many they will use next week. We only need 1 prediction. To that end, I wrote some code that cleans and preps the data, and is then passed to darts. Everything works as expected, until we hit the inverse_transform() function.

Describe the bug
We only care about 1 prediction step, so we invoke the prediction by saying: pred_series = my_model.predict(n=1).
This works, and I am able to get the scaled version of what the usage is going to be next week. However, I want the human-readable/understandable number, and so I do: print(transformer.inverse_transform(pred_series)). I then get the following error:

  File "train/run_models.py", line 124, in get_model
    print(transformer.inverse_transform(pred_series))
  File "/home/ec2-user/darts/lib/python3.7/site-packages/darts/preprocessing/scaler_wrapper.py", line 102, in inverse_transform
    reshape((-1, series.width))))
  File "/home/ec2-user/darts/lib/python3.7/site-packages/darts/timeseries.py", line 571, in from_times_and_values
    return TimeSeries(df, freq, fill_missing_dates)
  File "/home/ec2-user/darts/lib/python3.7/site-packages/darts/timeseries.py", line 58, in __init__
    'is not passed', logger)
  File "/home/ec2-user/darts/lib/python3.7/site-packages/darts/logging.py", line 54, in raise_if_not
    raise ValueError(message)

I read through the source code and didn't see any place in the inverse_transform() function where the frequency is being passed. Merely:

return TimeSeries.from_times_and_values(series.time_index(),
                                                self.transformer.inverse_transform(series.values().
                                                                                   reshape((-1, series.width))))

I then tried to pass the frequency argument as follows: print(transformer.inverse_transform(pred_series), "W-SUN"), and got this error instead:

  File "train/run_models.py", line 124, in get_model
    print(transformer.inverse_transform(pred_series, "W-SUN"))
TypeError: inverse_transform() takes 2 positional arguments but 3 were given

I put "W" instead of "W-SUN" too, with similar results (same error, that is).

So my question is, how to use darts to get precisely 1 prediction? And for it to be not scaled? I could generate 3, pick the first one and that'd be that, but I'm not sure if that's the best/right approach to this problem.

To Reproduce
Here's part of my code. Prepare any dataset of your choice ahead of this snippet, please.

    ......
    ......

    # Number of previous time stamps taken into account.
    SEQ_LENGTH = 2
    # Number of features in last hidden state
    HIDDEN_SIZE = 15 * SEQ_LENGTH
    # number of output time-steps to predict
    OUTPUT_LEN = 1
    # Number of stacked rnn layers.
    NUM_LAYERS = 2

    my_model = RNNModel(
        model='LSTM',
        output_length=OUTPUT_LEN,
        hidden_size=HIDDEN_SIZE,
        n_rnn_layers=NUM_LAYERS,
        input_length=SEQ_LENGTH,
        batch_size=100,
        n_epochs=150,
        model_name='Air_RNN', log_tensorboard=True
    )

    my_model.fit(train_transformed, val_transformed, verbose=True)

    pred_series = my_model.predict(n=3)

    backtest_series = backtest_forecasting(series_transformed, my_model, pd.Timestamp('20200621'),
                                       fcast_horizon_n=1, verbose=True)

    print('RMSE: {:.4f}'.format(rmse(transformer.inverse_transform(series_transformed),
                                 transformer.inverse_transform(backtest_series))))

    my_model.fit(series_transformed, verbose=True)

    pred_series = my_model.predict(n=1)

    print(pred_series)

    # Error is in this next line. Everything above this works like a charm
    print(transformer.inverse_transform(pred_series))

Expected behavior
I expect to see a single record with an inverse transformed value.

System (please complete the following information):

  • Python version: 3.7.7
  • darts version: 0.2.1

Additional context
Add any other context about the problem here.

In sample predictions

I would like to have in-sample prediction as an option.

This will help collect model errors so users can diagnose models.

Describe proposed solution
An extra method on the model: predict_in_sample

in_sample_prediction = model.predict_in_sample(training_data)

Describe potential alternatives
Modifying predict method to be forecast and generalising predict like in Statsmodels ARIMA

ModuleNotFoundError: No module named 'darts'

I installed all the dependency packages on anaconda (as well as fbprophet and torch), and later I installed
darts with "pip install u8darts", which said that every requirement was already satisfied.

However, when I try to import darts in Python, it doesn't find the module.

Does anybody know what might be the problem?

  • Python version: [e.g. 3.6.5]


Autoregressive forecasting

Nice work, guys!

It's commonplace to use features generated from the time series itself, like lags, rolling statistics, etc.
I've seen some approaches like using sklearn transformers and pipelines for fitting, and forecasting via a recursive procedure (iteratively generating one-step-ahead predictions).

Do you have any ideas about it?
I think it would be a nice feature.

LSTM Autoencoder with attention mechanisms

Problem
Adding LSTM-autoencoder with attention mechanisms for forecasting.

Solution
The solution is an implementation of the following paper. It will be possible to activate and deactivate both attention mechanisms.

LSTM

Add LSTM to the library

[Documentation] TimeSeries parameter: frequency- what is it?

I tried to create a TimeSeries object with either the constructor or the from_dataframe method, but I get:

ValueError: Could not infer explicit frequency. Observed frequencies: {'D', 'B'}. Is Series too short (n=2)?

Could you please describe the frequency parameter in more detail (a small sketch follows these questions):

  1. What are possible values?
  2. How this parameter is used?
  3. Example would be nice.
  4. Is it possible to pass data with irregular time values?
  5. Can I pass [1:N] for a sequence of data points that are not bound to dates? If so, what frequency should I use then?
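
A hedged sketch of the freq parameter, under the assumption of a recent Darts version: it accepts pandas offset aliases (e.g. "D", "B", "MS", "W-SUN"), missing dates can optionally be filled, and data that is not bound to dates can use an integer (range) index instead.

import numpy as np
import pandas as pd
from darts import TimeSeries

df = pd.DataFrame({
    "time": pd.to_datetime(["2020-01-01", "2020-01-02", "2020-01-04"]),
    "value": [1.0, 2.0, 3.0],
})

# Explicit daily frequency; the missing 2020-01-03 is inserted as a NaN entry
series = TimeSeries.from_dataframe(df, "time", "value", freq="D", fill_missing_dates=True)

# A sequence that is not bound to dates: build the series from raw values (integer index)
series_int = TimeSeries.from_values(np.array([1.0, 2.0, 3.0]))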

Unified API : `model.params` attribute

Hello! Firstly, I wanted to thank you for this wonderful library.

A unified API to get a model's params might make darts more convenient.
For example, to get the params of an Exponential Smoothing model, we have to write code like this:

model = ExponentialSmoothing()
model.fit(train)

print(model.model.model.params) # {'smoothing_level': 0.5789473661331209, 'smoothing_slope': ...

So, I want to implement a model.get_params() method like sklearn, and a model.params property like in statsmodels and fbprophet.

I would like to get your opinion before send PR. Thanks!

  • Models
    • abstract params() function in the ForecastingModel superclass
    • ARIMA
    • AutoARIMA
    • Baseline Models
      • NaiveDrift
      • NaiveMean
      • NaiveSeasonal
    • ExponentialSmoothing
    • FFT
    • Prophet
    • StandardRegressionModel
    • TCNModel
    • Theta
    • TorchForecastingModel (RNNModel)
  • Preprocessing
    • Scaler wrapper

ConvergenceWarning: Optimization failed to converge. Check mle_retvals.

Describe the bug
Simple Exponential Smoothing does not converge for the air passenger data as shown in the demo.

To Reproduce
Please check my Colab notebook: https://colab.research.google.com/drive/1H8Dqyx6lfe818PeHy_qdVgmoN3Btoiam?usp=sharing

Expected behavior
Exponential smoothing should converge for this very simple dataset.

System (please complete the following information):

  • Python version: 3.8
  • darts version 0.5.0

Additional context
The autocorr() function doesn't seem to exist.
