
Sparse Graph Tracker (SGT)


Official code for Sparse Graph Tracker (SGT), built on the Detectron2 framework. Please feel free to open an issue or send me an email ([email protected]).

News

  • (2022.10.11) Our paper has been accepted to WACV 2023! (the arXiv paper will be updated soon)
  • (2022.10.06) Code and pretrained weights are released!

Installation

Dataset Setup

Model Zoo

  • Please modify the checkpoint paths in the config files to match your checkpoint directory (an alternative command-line override is sketched below)
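
If editing the yaml files is inconvenient, the checkpoint can likely also be selected per run. This is a minimal sketch assuming train_net.py follows Detectron2's standard trailing KEY VALUE config overrides (the same mechanism as OUTPUT_DIR in the commands below) and Detectron2's standard MODEL.WEIGHTS key; the checkpoint path is a placeholder.

```
# Hypothetical: override the weights path at launch time instead of editing the config
python projects/SGT/train_net.py --config-file projects/SGT/configs/MOT17/sgt_dla34.yaml \
  --data-dir /root/datasets --num-gpus 1 --eval-only \
  MODEL.WEIGHTS /path/to/checkpoints/sgt_mot17.pth
```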

MOT17

| Name | Dataset            | HOTA | MOTA | IDF1 | Download |
|------|--------------------|------|------|------|----------|
| SGT  | MOT17              | 58.2 | 73.2 | 70.2 | model    |
| SGT  | MOT17 + CrowdHuman | 60.8 | 76.4 | 72.8 | model    |

MOT20

| Name | Dataset            | HOTA | MOTA | IDF1 | Download |
|------|--------------------|------|------|------|----------|
| SGT  | MOT20              | 51.6 | 64.5 | 62.7 | model    |
| SGT  | MOT20 + CrowdHuman | 57.0 | 72.8 | 70.6 | model    |

HiEve

| Name | Dataset | MOTA | IDF1 | Download |
|------|---------|------|------|----------|
| SGT  | HiEve   | 47.2 | 53.7 | model    |

How to run?

Train

```
python projects/SGT/train_net.py --config-file projects/SGT/configs/MOT17/sgt_dla34.yaml --data-dir /root/datasets --num-gpus 2 OUTPUT_DIR /root/sgt_output/mot17_val/dla34_mot17-CH
```
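
The trailing OUTPUT_DIR pair is a config override. Assuming the script uses Detectron2's launcher conventions, further KEY VALUE pairs can be appended the same way; in the sketch below, SOLVER.IMS_PER_BATCH is Detectron2's standard total-batch-size key and the value is illustrative only.

```
# Hypothetical: the same training run with an explicit batch-size override appended
python projects/SGT/train_net.py --config-file projects/SGT/configs/MOT17/sgt_dla34.yaml \
  --data-dir /root/datasets --num-gpus 2 \
  OUTPUT_DIR /root/sgt_output/mot17_val/dla34_mot17-CH SOLVER.IMS_PER_BATCH 16
```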

Inference

```
python projects/SGT/train_net.py --config-file projects/SGT/configs/MOT17/sgt_dla34.yaml --data-dir /root/datasets --num-gpus 1 --eval-only OUTPUT_DIR /root/sgt_output/mot17_test/dla34_mot17-CH
```
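
The other benchmarks should run through the same entry point by swapping the config file; the MOT20 config path below is an assumption that mirrors the MOT17 layout shown above.

```
# Hypothetical: test-set inference on MOT20 with the corresponding config
python projects/SGT/train_net.py --config-file projects/SGT/configs/MOT20/sgt_dla34.yaml \
  --data-dir /root/datasets --num-gpus 1 --eval-only \
  OUTPUT_DIR /root/sgt_output/mot20_test/dla34_mot20-CH
```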

Visualization

```
## GT
python projects/Datasets/MOT/vis/vis_gt.py --data-root <$DATA_ROOT> --register-data-name <e.g., mot17_train>
python projects/Datasets/MOT/vis/vis_gt.py --data-root <$DATA_ROOT> --register-data-name <e.g., mix_crowdhuman_train> --no-video-flag

## model output
python projects/Datasets/MOT/vis/vis_seq_from_txt_result.py --data-root <$DATA_ROOT> --result-dir <$OUTPUT_DIR> --data-name {mot17, mot20, hieve, mot17_sub, mot20_sub} --tgt-split {val,test}
```
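
For example, to render the MOT17 test-split results written by the inference command above (paths reuse the placeholder values from the earlier examples):

```
# Visualize tracking results produced by the --eval-only run
python projects/Datasets/MOT/vis/vis_seq_from_txt_result.py --data-root /root/datasets \
  --result-dir /root/sgt_output/mot17_test/dla34_mot17-CH --data-name mot17 --tgt-split test
```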

Motivation

Pipeline

MOT Benchmark Results

Ablation Experiment Results

Visualization

License

The SGT code is licensed under the CC-BY-NC 4.0 license and is free for research and academic purposes. SGT is built on the Detectron2 framework, which is released under the Apache 2.0 license, and the CenterNet detector, which is released under the MIT license. This codebase also provides a Detectron2 version of FairMOT, which is released under the MIT license.

Citation

```
@inproceedings{hyun2023detection,
  title={Detection recovery in online multi-object tracking with sparse graph tracker},
  author={Hyun, Jeongseok and Kang, Myunggu and Wee, Dongyoon and Yeung, Dit-Yan},
  booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  pages={4850--4859},
  year={2023}
}
```
