This project is forked from liuzhuang13/densenet.


Code for Densely Connected Convolutional Networks (DenseNets)

License: BSD 3-Clause "New" or "Revised" License



Densely Connected Convolutional Networks (DenseNets)

This repository contains the code for the paper Densely Connected Convolutional Networks (to appear at CVPR 2017 as an oral presentation).

The code is based on fb.resnet.torch.

Also, see

  1. Our Caffe Implementation
  2. Our space-efficient Torch Implementation.
  3. Our (much more) space-efficient Caffe Implementation.
  4. PyTorch Implementation (with BC structure) by Andreas Veit.
  5. PyTorch Implementation (with BC structure) by Brandon Amos.
  6. MXNet Implementation by Nicatio.
  7. MXNet Implementation (supporting ImageNet) by Xiong Lin.
  8. Tensorflow Implementation by Yixuan Li.
  9. Tensorflow Implementation by Laurent Mazare.
  10. Tensorflow Implementation (with BC structure) by Illarion Khlestov.
  11. Lasagne Implementation by Jan Schlüter.
  12. Keras Implementation by tdeboissiere.
  13. Keras Implementation by Roberto de Moura Estevão Filho.
  14. Keras Implementation (with BC structure) by Somshubra Majumdar.
  15. Chainer Implementation by Toshinori Hanya.
  16. Chainer Implementation by Yasunori Kudo.
  17. Fully Convolutional DenseNets for segmentation by Simon Jegou.

Note that we did not label all implementations that support the BC structure.

If you find this useful for your research, please consider citing:

```
@article{huang2016densely,
  title={Densely connected convolutional networks},
  author={Huang, Gao and Liu, Zhuang and Weinberger, Kilian Q and van der Maaten, Laurens},
  journal={arXiv preprint arXiv:1608.06993},
  year={2016}
}
```

Table of Contents

  1. Introduction
  2. Results
  3. Usage
  4. Contact

Introduction

DenseNet is a network architecture where each layer is directly connected to every other layer in a feed-forward fashion (within each dense block). For each layer, the feature maps of all preceding layers are treated as separate inputs whereas its own feature maps are passed on as inputs to all subsequent layers. This connectivity pattern yields state-of-the-art accuracies on CIFAR10/100 (with or without data augmentation) and SVHN.
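This connectivity pattern can be sketched framework-agnostically in NumPy. The sketch below is an illustrative toy, not the actual Torch code: the fixed random projection merely stands in for the composite BN-ReLU-Conv function, and all names are ours. With 5 layers and growth rate k = 4 (as in the figure below), an input with k0 = 4 channels grows to k0 + 5k = 24 channels:

```python
import numpy as np

def dense_block(x, num_layers, growth_rate):
    """Toy dense block: every layer receives the concatenation of the
    block input and all preceding layers' outputs along the channel axis.
    A fixed random projection stands in for the BN-ReLU-Conv composite."""
    rng = np.random.default_rng(0)
    features = [x]                                     # x: (channels, H, W)
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=0)         # all preceding maps
        w = rng.standard_normal((growth_rate, inp.shape[0]))
        features.append(np.tensordot(w, inp, axes=1))  # k new feature maps
    return np.concatenate(features, axis=0)

x = np.zeros((4, 8, 8))              # k0 = 4 input channels
out = dense_block(x, num_layers=5, growth_rate=4)
print(out.shape)                     # (24, 8, 8): k0 + 5 * k channels
```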

Figure 1: A dense block with 5 layers and growth rate 4.

Figure 2: A deep DenseNet with three dense blocks.

Results on CIFAR

The table below shows the results of DenseNets on the CIFAR datasets. A "+" suffix denotes standard data augmentation (crop after zero-padding, and horizontal flip). For a DenseNet model, L denotes its depth and k its growth rate. On CIFAR-10 and CIFAR-100 without augmentation, dropout with a drop rate of 0.2 is used.

| Method | Parameters | CIFAR-10 | CIFAR-10+ | CIFAR-100 | CIFAR-100+ |
|---|---|---|---|---|---|
| DenseNet (L=40, k=12) | 1.0M | 7.00 | 5.24 | 27.55 | 24.42 |
| DenseNet (L=100, k=12) | 7.0M | 5.77 | 4.10 | 23.79 | 20.20 |
| DenseNet (L=100, k=24) | 27.2M | 5.83 | 3.74 | 23.42 | 19.25 |
| DenseNet-BC (L=100, k=12) | 0.8M | 5.92 | 4.51 | 24.15 | 22.27 |
| DenseNet-BC (L=250, k=24) | 15.3M | 5.19 | 3.62 | 19.64 | 17.60 |
| DenseNet-BC (L=190, k=40) | 25.6M | - | 3.46 | - | 17.18 |
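As a sanity check on the depths above: with three dense blocks on CIFAR, a depth-L basic DenseNet has n = (L-4)/3 dense layers per block, and a DenseNet-BC has n = (L-4)/6, since each bottleneck layer contains two convolutions. A small sketch of this arithmetic (the helper name is ours):

```python
def layers_per_block(depth, bottleneck=True, num_blocks=3):
    """Number of dense layers per block for a CIFAR DenseNet of depth L.
    The depth counts convolutions: one initial conv, num_blocks dense
    blocks, (num_blocks - 1) transition convs, plus the final classifier
    layer. With bottleneck (DenseNet-BC), each dense layer has two convs."""
    convs_in_blocks = depth - (num_blocks - 1) - 2
    per_layer = 2 if bottleneck else 1
    n, rem = divmod(convs_in_blocks, num_blocks * per_layer)
    assert rem == 0, "depth does not fit this construction"
    return n

print(layers_per_block(40, bottleneck=False))  # 12
print(layers_per_block(100))                   # 16
print(layers_per_block(190))                   # 31
```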

ImageNet and Pretrained Models

Torch

The Torch models were trained under the same settings as in fb.resnet.torch. The error rates shown are 224x224 1-crop test errors.

| Network | Top-1 error | Torch Model |
|---|---|---|
| DenseNet-121 (k=32) | 25.0 | Download (64.5MB) |
| DenseNet-169 (k=32) | 23.6 | Download (114.4MB) |
| DenseNet-201 (k=32) | 22.5 | Download (161.8MB) |
| DenseNet-161 (k=48) | 22.2 | Download (230.8MB) |

Caffe

For ImageNet pretrained Caffe models, please see https://github.com/shicai/DenseNet-Caffe from @shicai. Also, we would like to thank @szq0214 for help on Caffe models.

PyTorch

In PyTorch, ImageNet pretrained models can be directly loaded by

```python
import torchvision.models as models

densenet = models.densenet161(pretrained=True)
```

For ImageNet training, customized models can be constructed by simply calling:

```python
DenseNet(growth_rate=32, block_config=(6, 12, 24, 16),
         num_init_features=64, bn_size=4, drop_rate=0, num_classes=1000)
```

See more details at http://pytorch.org/docs/torchvision/models.html?highlight=densenet and https://github.com/pytorch/vision/blob/master/torchvision/models/densenet.py.

We would like to thank @gpleiss for this nice work in PyTorch.

Usage

For training on CIFAR dataset,

  1. Install Torch ResNet (https://github.com/facebook/fb.resnet.torch) following the instructions there. To reduce memory consumption, we recommend installing the optnet package.
  2. Add the file densenet.lua to the folder models/.
  3. Change the learning rate schedule in train.lua: inside the function `learningRate()`, change lines 171/173 from `decay = epoch >= 122 and 2 or epoch >= 81 and 1 or 0` to `decay = epoch >= 225 and 2 or epoch >= 150 and 1 or 0`.
  4. Train a DenseNet-BC (L=100, k=12) on CIFAR-10+ using

```shell
th main.lua -netType densenet -depth 100 -dataset cifar10 -batchSize 64 -nEpochs 300 -optnet true
```
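The modified decay rule in step 3 drops the learning rate by a factor of 10 at 50% and 75% of the 300 training epochs (epochs 150 and 225). The same schedule expressed in Python (an illustrative sketch; the divide-by-10-per-decay-step convention follows fb.resnet.torch):

```python
def learning_rate(base_lr, epoch, n_epochs=300):
    """LR schedule matching the modified train.lua: divide the base rate
    by 10 at 50% and 75% of training (epochs 150 and 225 for 300 epochs)."""
    decay = 2 if epoch >= 0.75 * n_epochs else 1 if epoch >= 0.5 * n_epochs else 0
    return base_lr * (0.1 ** decay)

print(learning_rate(0.1, 1))    # 0.1
print(learning_rate(0.1, 150))  # 0.01 (approximately)
print(learning_rate(0.1, 225))  # 0.001 (approximately)
```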

The file densenet-imagenet.lua is for training ImageNet models presented in the paper. The usage is very similar. Please refer to fb.resnet.torch for data preparation.

Note

On CIFAR, by default, the growth rate k is set to 12, the bottleneck transformation is used, the compression rate at transition layers is 0.5, and dropout is disabled. On ImageNet, the default model is DenseNet-121. To experiment with other settings, please change densenet.lua accordingly (see the comments in the code).

Updates

12/03/2016:

  1. Add ImageNet results and pretrained models.
  2. Add DenseNet-BC structures.

03/29/2017:

  1. Add the code for ImageNet training.

04/20/2017:

  1. Add usage of models in PyTorch.

Contact

liuzhuangthu at gmail.com
gh349 at cornell.edu
Any discussions, suggestions and questions are welcome!
