
terrafin / iss-rnns


This project is a fork of wenwei202/iss-rnns.


Sparse Recurrent Neural Networks -- Pruning Connections and Hidden Sizes (TensorFlow)

License: Apache License 2.0

Python 93.58% Shell 1.58% HTML 0.67% Jupyter Notebook 3.89% Starlark 0.28%

iss-rnns's Introduction

To reproduce the results, please use the exact TensorFlow versions mentioned in each example.

About

This is a TensorFlow implementation for training sparse LSTMs and other recurrent neural networks. The related paper was published at ICLR 2018: Learning Intrinsic Sparse Structures within Long Short-term Memory. The code supports both structurally sparse and non-structurally sparse LSTMs. Related work on sparse CNNs is available here. The poster is here.

We use L1-norm regularization to obtain non-structurally sparse LSTMs. Its effect is similar to connection pruning: it can significantly reduce the number of parameters in LSTMs, but the irregular pattern of the remaining non-zero weights may not be friendly to computation efficiency.
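As a minimal NumPy sketch (not the repo's actual training code; the function name, `lam`, and the weight matrix are illustrative), the L1 term added to the training loss is just the scaled sum of absolute weight values:

```python
import numpy as np

def l1_penalty(weights, lam=1e-4):
    """L1-norm regularization term added to the training loss:
    lam * sum_i |w_i|. Minimizing it drives individual weights to
    exactly zero, producing irregular (non-structured) sparsity."""
    return lam * sum(np.abs(w).sum() for w in weights)

# Example: one small weight matrix
W = np.array([[0.5, -0.2],
              [0.0,  0.1]])
penalty = l1_penalty([W], lam=0.1)  # 0.1 * (0.5 + 0.2 + 0.0 + 0.1)
print(penalty)
```

After training, weights whose magnitude falls below a small threshold are pruned to zero.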

We use group Lasso regularization to obtain structurally sparse LSTMs. It both reduces the number of parameters and yields regular non-zero weight patterns that allow fast computation.
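A minimal NumPy sketch of the group Lasso term (the groups here are matrix rows purely for illustration; the paper groups weights by ISS components). The penalty is the sum of unsquared L2 norms of the groups, which drives entire groups to zero together rather than individual weights:

```python
import numpy as np

def group_lasso_penalty(W, lam=1e-4):
    """Group Lasso term: lam * sum over groups g of ||w_g||_2.
    Each row of W is treated as one group here; because the L2 norm
    is not squared, the optimizer zeroes out whole rows at once."""
    return lam * np.sqrt((W ** 2).sum(axis=1)).sum()

W = np.array([[3.0, 4.0],   # row norm 5.0
              [0.0, 0.0]])  # row norm 0.0 (an already-removed group)
print(group_lasso_penalty(W, lam=0.1))  # 0.1 * (5.0 + 0.0)
```

Zeroed groups correspond to entire structures (rows, columns, or ISS components) that can be deleted from the model, leaving dense, regular weight matrices.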

We propose Intrinsic Sparse Structures (ISS) in LSTMs. By removing one ISS component, we simultaneously remove one hidden state, one cell state, one forget gate, one input gate, one output gate, and one input update; the result is still a regular LSTM, but with its hidden size reduced by one. ISS components are learned via group Lasso regularization. The ISS approach also extends to Recurrent Highway Networks, where it learns the number of units per layer.
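To make the grouping concrete, here is a hedged NumPy sketch of collecting ISS groups from a fused LSTM kernel of shape [input_size + hidden_size, 4 * hidden_size]. The gate ordering, function name, and restriction to a single layer's kernel are simplifying assumptions for illustration (the full ISS group in the paper also spans weights feeding the next layer), not the repo's exact code:

```python
import numpy as np

def iss_group_norms(W, input_size, hidden_size):
    """W: fused LSTM kernel of shape
    [input_size + hidden_size, 4 * hidden_size],
    with the four gates concatenated column-wise.
    The ISS group of hidden unit k collects:
      - columns k, h+k, 2h+k, 3h+k (the weights producing unit k
        in every gate), and
      - row input_size + k (unit k's recurrent weights into all gates).
    Zeroing the whole group removes hidden unit k entirely, shrinking
    the layer's hidden size by one while keeping a regular LSTM."""
    h = hidden_size
    norms = np.empty(h)
    for k in range(h):
        mask = np.zeros(W.shape, dtype=bool)
        mask[:, [k, h + k, 2 * h + k, 3 * h + k]] = True  # gate columns
        mask[input_size + k, :] = True                     # recurrent row
        norms[k] = np.linalg.norm(W[mask])
    return norms

# Toy kernel: input_size=1, hidden_size=2 -> shape (3, 8)
W = np.zeros((3, 8))
W[0, 0] = 3.0  # lies in unit 0's gate columns
W[2, 2] = 4.0  # lies in unit 0's columns AND unit 1's recurrent row
print(iss_group_norms(W, input_size=1, hidden_size=2))
```

During training, a group Lasso penalty proportional to the sum of these group norms is added to the loss; hidden units whose group norm is driven below a threshold can then be removed.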

Examples

Stacked LSTMs

The code in ptb implements stacked LSTMs for language modeling on the Penn TreeBank dataset.

Recurrent Highway Networks

The code in rhns applies ISS to Recurrent Highway Networks. ISS was proposed for LSTMs but extends readily to other recurrent neural networks such as Recurrent Highway Networks.

Attention model

The code in bidaf implements an attention + LSTM model for question answering on the SQuAD dataset.

iss-rnns's People

Contributors: seominjoon, wenwei202, jzilly, nealwu, flukeskywalker, shimisalant, aselle, anikem, kepingwang, mostrahmani, itsmeolivia, terrafin, tongda
