This project is forked from zhanghanduo/yolo3_pytorch.


YOLOv3 implementation in PyTorch using the COCO and BDD100K datasets.


YOLOv3 in PyTorch

A full implementation of YOLOv3 in PyTorch, including training, evaluation, and simple deployment (in development).

Overview

YOLOv3: An Incremental Improvement

[Paper]
[Original Implementation]

Motivation

Implement YOLOv3 and Darknet-53 without the original darknet cfg parser,
making it easy to customize the backbone network (e.g. ResNet, DenseNet).

We also plan to develop custom structures, such as a grayscale pretrained model.
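Because the backbone is plain PyTorch rather than a cfg-parsed graph, swapping it means providing a module with the feature-map contract the YOLOv3 heads expect. The sketch below is illustrative (the class name and layer sizes are assumptions, not this repo's API): YOLOv3 detection heads consume three feature maps at strides 8, 16, and 32.

```python
import torch
import torch.nn as nn

class TinyBackbone(nn.Module):
    """Hypothetical drop-in backbone: YOLOv3 heads expect three
    feature maps, at strides 8, 16, and 32 relative to the input."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 32, 3, stride=8, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
        self.stage3 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())

    def forward(self, x):
        c3 = self.stage1(x)   # stride 8
        c4 = self.stage2(c3)  # stride 16
        c5 = self.stage3(c4)  # stride 32
        return c3, c4, c5

feats = TinyBackbone()(torch.randn(1, 3, 416, 416))
print([f.shape[-1] for f in feats])  # [52, 26, 13]
```

Any torchvision backbone can be wrapped the same way, as long as it returns the three pyramid levels.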

Installation

Environment
  • pytorch >= 0.4.0
  • python >= 3.6.0
Get code
git clone https://github.com/zhanghanduo/yolo3_pytorch.git
cd yolo3_pytorch
pip3 install -r requirements.txt --user
Download COCO dataset
cd data/
bash get_coco_dataset.sh
Download BDD dataset

Please visit BDD100K for details.

Training

Download pretrained weights
  1. See the weights readme for details.
  2. Download the pretrained backbone weights from Google Drive or Baidu Drive.
  3. Move the downloaded file darknet53_weights_pytorch.pth to the weights folder in this project.
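Under the hood, loading such a checkpoint is a standard state-dict restore. The snippet below is a stand-in sketch (a tiny module replaces the repo's real backbone class) showing the mechanics, including `map_location="cpu"` for GPU-less machines and `strict=False` for backbone-only checkpoints:

```python
import os
import tempfile

import torch
import torch.nn as nn

# Hypothetical stand-in for the repo's backbone class: we only want to
# demonstrate the state-dict loading mechanics, not the real network.
backbone = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1))
ckpt = os.path.join(tempfile.mkdtemp(), "darknet53_weights_pytorch.pth")
torch.save(backbone.state_dict(), ckpt)

# map_location="cpu" lets the checkpoint load on machines without a GPU;
# strict=False tolerates layers absent from a backbone-only checkpoint.
state = torch.load(ckpt, map_location="cpu")
result = backbone.load_state_dict(state, strict=False)
print(len(state), "tensors; missing keys:", result.missing_keys)
```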
Modify training parameters
  1. Review the config file training/params.py.
  2. Replace YOUR_WORKING_DIR with your working directory, which is used to save models and temporary files.
  3. Set your GPU device(s); see the parallels option.
  4. Adjust other parameters as needed.
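For orientation, a config of this shape might look like the sketch below. The key names here are illustrative assumptions; check training/params.py in the repo for the authoritative schema.

```python
# Illustrative sketch of a training/params.py-style config
# (key names are assumptions; the repo's file is authoritative).
TRAINING_PARAMS = {
    "model_params": {
        "backbone_name": "darknet_53",
        "backbone_pretrained": "weights/darknet53_weights_pytorch.pth",
    },
    "lr": {"base_lr": 0.001, "decay_step": 20, "decay_gamma": 0.1},
    "batch_size": 16,
    "parallels": [0, 1],                 # GPU device ids to train on
    "working_dir": "YOUR_WORKING_DIR",   # models and tmp files land here
}
print(sorted(TRAINING_PARAMS))
```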
Start training
cd training
python training.py params.py
Option: Visualizing training
# please install tensorboard first
python -m tensorboard.main --logdir=YOUR_WORKING_DIR

Evaluate

Download pretrained weights
  1. See the weights readme for details.
  2. Download the full pretrained YOLOv3 weights from Google Drive or Baidu Drive.
  3. Move the downloaded file yolov3_weights_pytorch.pth to the weights folder in this project.
Start evaluation
cd evaluate
python eval.py params.py
Results
Model                           mAP (min. 50 IoU)   Weights file
YOLOv3 (paper)                  57.9                -
YOLOv3 (converted from paper)   58.18               official_yolov3_weights_pytorch.pth
YOLOv3 (best trained model)     59.66               yolov3_weights_pytorch.pth

Roadmap

  • YOLOv3 training
  • YOLOv3 evaluation
  • Add backbone networks other than Darknet
  • Adapt the 3-channel model to 1-channel (grayscale) input
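On the grayscale item: a common way to reuse RGB-pretrained weights for 1-channel input is to average the first convolution's weights over the RGB channel axis. This is a generic technique, not necessarily what this repo will ship; the sketch below uses a randomly initialized conv as a stand-in for the pretrained layer.

```python
import torch
import torch.nn as nn

# Adapt an RGB-pretrained first conv to grayscale input by averaging
# its weights over the channel axis (generic technique; the repo may
# implement the roadmap item differently).
rgb_conv = nn.Conv2d(3, 32, 3, padding=1, bias=False)   # pretrained stand-in
gray_conv = nn.Conv2d(1, 32, 3, padding=1, bias=False)
with torch.no_grad():
    gray_conv.weight.copy_(rgb_conv.weight.mean(dim=1, keepdim=True))

out = gray_conv(torch.randn(1, 1, 416, 416))
print(out.shape)  # torch.Size([1, 32, 416, 416])
```

This preserves the response magnitude the network was trained with, since a grayscale image fed through the averaged kernel approximates the RGB conv applied to a gray-replicated image.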

Credit

@article{yolov3,
	title={YOLOv3: An Incremental Improvement},
	author={Redmon, Joseph and Farhadi, Ali},
	journal = {arXiv},
	year={2018}
}

