torch_arima

ARIMA time series implementation in PyTorch supporting the following model types:

| Model Type | Location | Description |
|------------|----------|-------------|
| ARIMA | `ARIMA.ARIMA` | A `torch.nn.Module` with the ARIMA polynomial coefficients as parameters, a `forward` method that converts observations to innovations, and a `predict` method that converts innovations to observations. |
| VARIMA | `ARIMA.VARIMA` | Same as `ARIMA.ARIMA`, with support for vector innovations and vector observations. |
| Bayesian ARIMA | `ARIMA.BayesianARIMA` | A `pyro.nn.PyroModule` wrapper around `ARIMA.ARIMA`, with support for priors on all polynomial coefficients and innovations distribution parameters. |
| Bayesian VARIMA | `ARIMA.BayesianVARIMA` | Same as `ARIMA.BayesianARIMA`, with support for vector innovations and vector observations. |
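To illustrate the `forward`/`predict` pair described above, here is a minimal, self-contained sketch of the same idea for an ARMA(1,1) recursion in plain Python. This is not the `ARIMA.ARIMA` API; the function names, signatures, and fixed coefficients are illustrative assumptions that mirror the table's description.

```python
# Illustrative ARMA(1,1) maps (hypothetical sketch, not the ARIMA.ARIMA API):
# forward: observations -> innovations, predict: innovations -> observations.

def forward(obs, phi=0.5, theta=0.3):
    """Recover innovations from observations, assuming a zero initial state."""
    eps, x_prev, e_prev = [], 0.0, 0.0
    for x in obs:
        e = x - phi * x_prev - theta * e_prev  # invert the ARMA recursion
        eps.append(e)
        x_prev, e_prev = x, e
    return eps

def predict(eps, phi=0.5, theta=0.3):
    """Rebuild observations from innovations; the inverse of forward."""
    obs, x_prev, e_prev = [], 0.0, 0.0
    for e in eps:
        x = phi * x_prev + theta * e_prev + e  # apply the ARMA recursion
        obs.append(x)
        x_prev, e_prev = x, e
    return obs

obs = [1.0, 0.4, -0.2, 0.9]
rec = predict(forward(obs))  # round trip reproduces obs up to float rounding
```

The round trip works because each map is the exact algebraic inverse of the other, which is the property the package's `forward`/`predict` pair relies on.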

Installation

To install the package locally in editable mode, so that changes to the package source take effect without reinstalling, run

pip install -e .

Tests

Run the test suite by executing

python -m ARIMA

Examples

All the examples can be run at once by executing

python -m ARIMA.examples

This will create additional comparisons between the median predictions and confidence intervals of the MLE and Bayesian estimators.

ARIMA Examples

Maximum Likelihood Estimator

Utilizes torch optimizers to find the maximum likelihood estimate. Run by executing

python -m ARIMA.examples.mle

The below graphs will be created.

Bayesian Estimator

Utilizes pyro, which is built on torch, to find the Bayesian posterior. Run by executing

python -m ARIMA.examples.bayesian

The below graphs will be created.

The Bayesian estimator can also estimate missing samples that occur at arbitrary times.

The probability density of observing certain values for the missing samples given the known observed samples, $P(X_{Missing}|X_{Observed})$, can also be calculated using the Bayesian estimator. This probability density can be used as a score in a K-Fold cross validation scheme, where different folds have different missing samples.
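The fold construction in such a scheme can be sketched as follows. This is an illustrative assumption of how folds might be generated, not code from the package; the per-fold score $P(X_{Missing}|X_{Observed})$ would come from the Bayesian estimator itself.

```python
# Hypothetical K-fold splitter for time series cross validation:
# each fold's indices are treated as "missing" once, the rest as "observed".

def kfold_missing(n, k):
    """Partition time indices 0..n-1 into k interleaved folds."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for missing in folds:
        observed = [t for t in range(n) if t not in missing]
        yield missing, observed

splits = list(kfold_missing(10, 5))
# Each (missing, observed) pair would be scored by the estimator's
# density P(X_missing | X_observed), and the fold scores aggregated.
```

Interleaved folds are used here so that every fold's missing samples are surrounded by observed ones; a contiguous-block split would also be valid and tests longer-horizon imputation.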

Comparison Between the Maximum Likelihood and Bayesian Estimators

It can be seen that the two estimators have different median predictions, and that as less observed data is available the MLE estimator becomes more confident in its predictions, whereas the Bayesian estimator becomes less confident in its predictions, especially for the short term predictions.

Bayesian VARIMA Example

The example can be run by executing

python -m ARIMA.examples.mortality

The below graph shows predicted weekly death counts for males and females. The model captures annual periodic changes in mortality and correlations between female and male death counts.

Viewed as a yearly moving sum, the effect of COVID-19 on death counts can be seen more clearly, as annual periodic changes in mortality are averaged out.

The effect of COVID-19 on short term death count predictions can be visualized by comparing the predictions of a model that did not observe death counts during the COVID-19 pandemic (a.k.a. Pre COVID model) with those of a model that observed the most up-to-date data available (a.k.a. Post COVID model).

The importance of using a VARIMA model, rather than a model comprised of two independent ARIMA models (a.k.a. Multiple ARIMA model), can be seen in the graph below: the confidence interval of the VARIMA model is much larger (as it should be) than that of the Multiple ARIMA model, since it correctly captures the correlation between the death counts of females and males.

Design

An ARIMA(p,d,q) time series is defined by the equation (courtesy of Wikipedia)

$$\left(1 - \sum_{k=1}^{p} \phi_k L^k\right)(1 - L)^d X_i = \left(1 + \sum_{k=1}^{q} \theta_k L^k\right)\epsilon_i$$

with $X_i$ being the observations, $\epsilon_i$ being the innovations, and $L$ being the lag operator.

The determinant of the Jacobian of the transformation from innovations to observations is equal to one since

$$\begin{align} \frac{\partial X_i}{\partial \epsilon_i} &= 1 \text{ for all } i \\\ \frac{\partial X_i}{\partial \epsilon_j} &= 0 \text{ for all } j > i \end{align}$$

This means that the ARIMA transformation can be viewed as a change of random variable from innovations to observations, under which the probability density of the innovations equals the probability density of the observations. This change of variable is how the core of the ARIMA module is implemented in Transform.py.
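The unit-determinant property can be checked numerically on a small example. The sketch below (an illustrative ARMA(1,1) recursion in plain Python, not the package's `Transform.py` code) builds the Jacobian of the innovations-to-observations map by finite differences and confirms it is lower triangular with ones on the diagonal, so its determinant is one.

```python
# Numerically verify det(dX/d_eps) = 1 for a toy ARMA(1,1)
# innovations -> observations map (illustrative, not Transform.py).

def predict(eps, phi=0.5, theta=0.3):
    """Map innovations to observations via the ARMA(1,1) recursion."""
    obs, x_prev, e_prev = [], 0.0, 0.0
    for e in eps:
        x = phi * x_prev + theta * e_prev + e
        obs.append(x)
        x_prev, e_prev = x, e
    return obs

def jacobian(f, eps, h=1e-6):
    """Finite-difference Jacobian J[i][j] = d f(eps)[i] / d eps[j]."""
    base = f(eps)
    n = len(eps)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        pert = list(eps)
        pert[j] += h
        out = f(pert)
        for i in range(n):
            J[i][j] = (out[i] - base[i]) / h
    return J

J = jacobian(predict, [0.2, -0.1, 0.4])
# J is lower triangular (dX_i/d_eps_j = 0 for j > i) with unit diagonal
# (dX_i/d_eps_i = 1), so det(J) = product of the diagonal = 1.
```

Because the map is causal (each $X_i$ depends only on $\epsilon_j$ with $j \le i$) and $\epsilon_i$ enters $X_i$ with coefficient one, the triangular unit-diagonal structure holds for any ARIMA orders, not just this toy case.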
