
Hierarchical_CNN

Reference paper

Environment

  • torch : 1.8.2+cu111
  • torchvision : 0.9.2+cu111
  • torchmetrics : 0.10.0
  • numpy : 1.21.2
  • wandb : 0.13.2
pip install -r requirements.txt
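
The requirements.txt itself is not reproduced on this page. A version consistent with the list above might look like the following; note that the cu111 builds of torch/torchvision are normally installed from the PyTorch LTS wheel index rather than plain PyPI, so treat the exact pins and the index URL as assumptions:

# requirements.txt (sketch)
torch==1.8.2+cu111
torchvision==0.9.2+cu111
torchmetrics==0.10.0
numpy==1.21.2
wandb==0.13.2

# cu111 wheels usually need the LTS index:
# pip install -r requirements.txt -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html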

Dataset

  • CIFAR-100
    • The 20 superclasses are used as coarse labels
    • The 100 classes are used as fine labels
python preprocess.py --dataset cifar100 --data_dir "data_dir" --types train,test
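
preprocess.py itself is not shown on this page. As a rough sketch of how both label levels can be read from CIFAR-100 (torchvision only exposes the fine labels, so the superclass labels are re-read from the raw batch file), using a hypothetical HierarchicalCIFAR100 wrapper rather than the repo's actual code:

import os
import pickle
from torchvision.datasets import CIFAR100

class HierarchicalCIFAR100(CIFAR100):
    """CIFAR-100 that returns (image, coarse_label, fine_label) triples."""

    def __init__(self, root, train=True, transform=None, download=False):
        super().__init__(root, train=train, transform=transform, download=download)
        # torchvision keeps only the fine labels in self.targets, so the
        # superclass ("coarse") labels are loaded from the raw pickle file.
        file_name = (self.train_list if train else self.test_list)[0][0]
        path = os.path.join(self.root, self.base_folder, file_name)
        with open(path, "rb") as f:
            entry = pickle.load(f, encoding="latin1")
        self.coarse_targets = entry["coarse_labels"]

    def __getitem__(self, index):
        img, fine = super().__getitem__(index)
        return img, self.coarse_targets[index], fine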

Architecture

  • Backbone : Wide_ResNet_50 (WRN50)
    • Coarse branch : WRN50 backbone except for layer4
    • Fine branch : the full WRN50 backbone
  • The main goal is fine-label classification
  • The coarse feature and the fine feature are concatenated
  • A 1x1 conv layer matches the channel count after concatenation
  • A plain FC layer is used as the classifier (see the sketch below)
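
A minimal sketch of this layout, assuming the torchvision wide_resnet50_2 backbone; the class name, the output width of the 1x1 fusion conv, and the use of global average pooling before concatenation are assumptions, not the repo's actual code:

import torch
import torch.nn as nn
from torchvision.models import wide_resnet50_2

class HierarchicalWRN(nn.Module):
    """Sketch of the coarse+fine model described above."""

    def __init__(self, num_coarse=20, num_fine=100, pretrained=False):
        super().__init__()
        wrn = wide_resnet50_2(pretrained=pretrained)
        # Coarse branch: WRN50 up to and including layer3 -> 1024 channels.
        self.coarse_backbone = nn.Sequential(
            wrn.conv1, wrn.bn1, wrn.relu, wrn.maxpool,
            wrn.layer1, wrn.layer2, wrn.layer3,
        )
        # Fine branch continues through layer4 -> 2048 channels.
        self.layer4 = wrn.layer4
        self.pool = nn.AdaptiveAvgPool2d(1)
        # 1x1 conv matches channels after concatenating coarse (1024) and fine (2048).
        self.fuse = nn.Conv2d(1024 + 2048, 2048, kernel_size=1)
        self.coarse_fc = nn.Linear(1024, num_coarse)
        self.fine_fc = nn.Linear(2048, num_fine)

    def forward(self, x):
        coarse_map = self.coarse_backbone(x)           # [B, 1024, H, W]
        fine_map = self.layer4(coarse_map)             # [B, 2048, H/2, W/2]
        coarse_vec = self.pool(coarse_map)             # [B, 1024, 1, 1]
        fine_vec = self.pool(fine_map)                 # [B, 2048, 1, 1]
        fused = self.fuse(torch.cat([coarse_vec, fine_vec], dim=1))
        coarse_logits = self.coarse_fc(coarse_vec.flatten(1))
        fine_logits = self.fine_fc(fused.flatten(1))
        return coarse_logits, fine_logits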

Training (in progress)

The hyperparameters below are not the best ones yet; the model is still being trained.
  • Training w/ wandb.sweep
python train.py --epochs 15 --batch_size 32 --lr 0.008 --backbone wide_resnet_50 --wandb --sweep --sweep_count 5
  • Training w/o wandb.sweep
python train.py --epochs 15 --batch_size 32 --lr 0.008 --backbone wide_resnet_50 --wandb --milestones 7,13
  • To train the only_fine model, add --only_fine to either command above

Default Hyperparameter Setting

  • optimizer : SGD
    • or you can use Adam
  • batch_size : 32
  • loss weights : coarse 1, fine 2
    • the coarse and fine losses are weighted at a 1 : 2 ratio (see the sketch below)
  • epochs : 15
  • lr : 8e-3
  • lr_scheduler : ReduceLROnPlateau
    • or MultiStepLR via the --milestones argument
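
A minimal sketch of the weighted two-head loss and the default optimizer/scheduler setup, reusing the hypothetical HierarchicalWRN from the architecture sketch; the momentum value and the tracked metric are assumptions, and the actual train.py may differ:

import torch
import torch.nn as nn

model = HierarchicalWRN()                 # hypothetical model from the sketch above
criterion = nn.CrossEntropyLoss()
coarse_w, fine_w = 1.0, 2.0               # default loss weights, coarse : fine = 1 : 2

optimizer = torch.optim.SGD(model.parameters(), lr=8e-3, momentum=0.9)  # momentum assumed
# Default scheduler; MultiStepLR(optimizer, milestones=[7, 13]) is the alternative.
# scheduler.step(val_acc) would be called once per epoch.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="max")

def train_step(images, coarse_labels, fine_labels):
    coarse_logits, fine_logits = model(images)
    loss = (coarse_w * criterion(coarse_logits, coarse_labels)
            + fine_w * criterion(fine_logits, fine_labels))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()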

Optimizing the model

  • Using wandb.sweep (a sweep-config sketch follows this list)
  • Process
    1. Optimize the "only fine" model with wandb.sweep
      • batch_size : 32
      • epochs : 15
      • lr : 0.0079
      • milestones : [7, 12]
    2. Train the "coarse+fine" model with the hyperparameters found above
  • Graph : only_fine vs. coarse+fine training curves
  • Accuracy comparison (max)
    • only_fine : 0.787
    • coarse+fine : 0.7832
  • wandb report link
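
A rough sketch of a sweep configuration consistent with the process above; the search method, metric name, and parameter ranges are assumptions, not the repo's actual sweep file:

import wandb

sweep_config = {
    "method": "bayes",                                  # assumed search strategy
    "metric": {"name": "val_acc", "goal": "maximize"},  # assumed metric name
    "parameters": {
        "lr": {"min": 1e-3, "max": 1e-2},
        "batch_size": {"values": [32, 64]},
        "epochs": {"value": 15},
    },
}

sweep_id = wandb.sweep(sweep_config, project="hierarchical_cnn")
# `train_fn` stands in for the training entry point that reads wandb.config:
# wandb.agent(sweep_id, function=train_fn, count=5)    # matches --sweep_count 5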

To do

  • Find the best hyperparameters for the coarse+fine model
    • coarse and fine loss weights
    • lr
    • etc.
  • Change which layers are frozen in the coarse backbone
  • Apply the model to other datasets with hierarchical labels
    • DeepFashion
    • etc.
  • Try other backbones
  • Try a different feature-fusion method instead of concatenation
