
Multiscale Hierarchical Time-Stepper (Multiscale HiTS)

Yuying Liu, J. Nathan Kutz, Steven L. Brunton, University of Washington.

The purpose of this repository is to help users reproduce the results shown in the paper "Hierarchical Deep Learning of Multiscale Differential Equation Time-Steppers".

Introduction

In this work, we consider deep learning models in the context of scientific computing. We train neural networks (NNs) to perform time-stepping. The core idea of the proposed method is to couple NNs trained over various time scales, forming a multiscale hierarchical time-stepper (multiscale HiTS), as shown below.

figure 1: method

Our scheme provides important advantages:

  • Highly accurate: time-steppers with small ∆t produce accurate predictions over short periods, while models with larger ∆t steps are used to 'reset' the predictions over longer periods, preventing error accumulation from the short-time models.
  • Highly efficient: the computation is easy to vectorize.
  • Easier to train: each NN model only needs to be trained over a short period, circumventing the exploding/vanishing gradient problem.
  • Highly flexible: the scheme can be integrated with numerical time-steppers, resulting in hybrid time-steppers. These hybrid time-steppers are parallelizable by nature, in sharp contrast to classical numerical time-stepping algorithms, which are usually serial.

Check the paper for more details.
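The coupling idea can be sketched with ordinary (non-neural) steppers standing in for the trained networks. The following is a hypothetical illustration, not the repository's API: a deliberately error-prone fine stepper (forward Euler) is periodically 'reset' by an accurate coarse stepper (the exact flow map of a harmonic oscillator), mimicking how a large-∆t model corrects a small-∆t model.

```python
import numpy as np

def fine_step(state, dt):
    """Fine stepper: forward Euler for x'' = -x (accumulates error)."""
    x, v = state
    return np.array([x + dt * v, v - dt * x])

def coarse_step(state, dt):
    """Coarse stepper: exact rotation (stands in for a large-dt model)."""
    x, v = state
    c, s = np.cos(dt), np.sin(dt)
    return np.array([c * x + s * v, -s * x + c * v])

def multiscale_predict(state, dt, n_steps, reset_every):
    """Advance n_steps of size dt; every `reset_every` steps, replace the
    fine prediction with one coarse jump from the last checkpoint,
    preventing error accumulation in the fine stepper."""
    traj = [state]
    checkpoint = state
    for i in range(1, n_steps + 1):
        if i % reset_every == 0:
            state = coarse_step(checkpoint, reset_every * dt)
            checkpoint = state
        else:
            state = fine_step(state, dt)
        traj.append(state)
    return np.array(traj)

dt, n = 0.01, 1000
x0 = np.array([1.0, 0.0])
hier = multiscale_predict(x0, dt, n, reset_every=50)
fine_only = multiscale_predict(x0, dt, n, reset_every=n + 1)  # never resets
exact = np.array([coarse_step(x0, i * dt) for i in range(n + 1)])
print(np.abs(hier - exact).max() < np.abs(fine_only - exact).max())  # True
```

With resets every 50 steps, the worst-case error stays bounded by roughly 50 Euler steps from an exact state, while the fine-only trajectory accumulates error over all 1000 steps.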

Structure

multiscale_HiTS/
    |- data/
        |- Linear/
        |- Hyperbolic/
        |- ...
        |- KS/
    |- src/
        |- ResNet.py
        |- utils.py
        |- rnn.py
        |- esn.py
        |- cwrnn.py
    |- scripts/
        |- multiscale_HiTS_exp/
            |- data_generation.ipynb
            |- model_training.ipynb
            |- ...
            |- multiscale_HiTS.ipynb
        |- sequence_generation_exp/
            |- Bach.ipynb
            |- ...
            |- seq_generations.ipynb
        |- others/
            |- motivating_example.ipynb
            |- visualize_increment.ipynb
    |- models/
        |- Linear/
        |- Hyperbolic/
        |- ...
        |- KS/
    |- results/
        |- Bach/
        |- Flower/
        |- Fluid/
        |- KS/
    |- figures/
    |- requirements.txt
    |- README.md
    |- LICENSE

Note 1: The folder names contained in data/ and models/ are the same: Bach/, Cubic/, Flower/, Fluid/, Hopf/, Hyperbolic/, KS/, Linear/, Lorenz/, VanDerPol/.

Note 2: We don't upload the data, models, and results to GitHub, as they are large files. However, you will be able to generate them by following the instructions in the scripts and below.

Getting started

We provide two ways for you to get started with this project. One is to use Google Colab and the other is to clone the repository and play with it locally.

Colab

If you want to quickly experiment with HiTS, we have written a Colab notebook. It outlines the big idea of our proposed multiscale HiTS and doesn't require installing anything or doing intensive training. A linear differential equation for a harmonic oscillator serves as a toy example to walk you through the core of the code. Figure 2 in the paper can be reproduced with it.

Setup

However, for those who want to dig deeper into the project, we suggest cloning the repository and running the experiments locally. This work is mainly built on Python 3.7 and PyTorch. To set it up, we recommend creating a virtual environment using Anaconda. Once Anaconda is correctly installed, run

git clone https://github.com/luckystarufo/multiscale_HiTS.git
conda create -n <ENV_NAME> python=3.7
conda activate <ENV_NAME>
conda install pytorch torchvision -c pytorch
pip install -r requirements.txt

to recreate the environment we used. If you are a Jupyter notebook user, run the following inside the environment:

pip install --user ipykernel
python -m ipykernel install --user --name=<ENV_NAME>

To allow tqdm (the progress bar library) to run in a notebook, you also need:

conda install -c conda-forge ipywidgets

Almost all code is kept in Jupyter Notebook (.ipynb) files to encourage exploration and modification of the work.

Results

Description

There are two key results shown in the paper.

  • The first result is that our multiscale HiTS outperforms any single-time-scale neural network time-stepper in accuracy while maintaining good efficiency, thanks to the vectorized computations of array programming. We also show that a similar coupling technique can be applied to classical numerical time-steppers, resulting in hybrid time-steppers that accelerate classical numerical simulation algorithms. These results are illustrated by the following two figures in the paper.

figure 2: exp11

figure 3: exp12
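The efficiency claim rests on array programming: a single time-stepper can advance an entire batch of states in one vectorized operation rather than looping over trajectories. A minimal NumPy sketch (using the exact flow map of a harmonic oscillator as a stand-in for a trained stepper, not the repository's actual models):

```python
import numpy as np

dt = 0.1
c, s = np.cos(dt), np.sin(dt)
A = np.array([[c, s], [-s, c]])      # exact flow map of x'' = -x over dt

states = np.random.randn(10000, 2)   # 10,000 initial conditions
advanced = states @ A.T              # all advanced in a single matmul
print(advanced.shape)                # (10000, 2)
```

Because the flow map here is a rotation, the batched update also preserves the energy (norm) of every state, which makes the vectorized result easy to sanity-check.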

  • The second result shows that multiscale HiTS outperforms state-of-the-art RNN architectures, including LSTM, ESN, and CW-RNN, on the sequence generation task, as shown by the figure below. There's also a video demo for it. The sequences we explore include a simulated solution of the Kuramoto–Sivashinsky (KS) equation, a music excerpt from Bach's Fugue No. 1 in C Major, BWV 846, a simulation of fluid flow past a circular cylinder at Reynolds number 100, and a video of blooming flowers, which you can find here.

figure 4: exp21

Instructions

All results can be reproduced with the help of the notebooks in scripts/, though you may need a GPU machine for training some of the neural networks.

  • For the first set of experiments, please use the scripts in multiscale_HiTS_exp/. First run data_generation.ipynb to generate the data sets, then train the neural network time-steppers with model_training.ipynb. After that, you can run the other three scripts to reproduce Tables 5-9 and Figures 5, 6, 8, and 9.
  • For the second set of experiments, please refer to sequence_generation_exp/. You will be able to reproduce Table 1 and Figure 7 with the setup and training details documented in the Appendix of the paper.
  • There is also an others/ directory in the scripts/ folder. The two notebooks in it can be used to generate Figures 2 and 10. Note that one of them, motivating_example.ipynb, is largely the same as the notebook provided in Google Colab.

Happy coding :)

License

This project is released under the MIT License. It is 100% open source; feel free to use the code however you like.

Citation

@article{liu2020hierarchical,
  title={Hierarchical Deep Learning of Multiscale Differential Equation Time-Steppers},
  author={Liu, Yuying and Kutz, J Nathan and Brunton, Steven L},
  journal={arXiv preprint arXiv:2008.09768},
  year={2020}
}
