whynotw / yolo_metric

Calculate mean Average Precision (mAP) and a confusion matrix for object detection models. Bounding box information for ground truth and predictions is given in the YOLO training dataset format.

License: Other

Makefile 0.28% C 74.87% C++ 4.95% Python 8.32% Cuda 11.59%
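
For reference, the YOLO training dataset format stores one .txt label file per image, with one line per object: a zero-based class id followed by the box center and size, all normalized to [0, 1] by the image width and height. Multi-class data simply mixes class ids within the same file. A minimal sketch of parsing such a line back to pixel coordinates (the file contents and the 416x416 image size are illustrative, not taken from the repo):

    # img_0001.txt -- one object per line: <class_id> <x_center> <y_center> <width> <height>
    # 0 0.500 0.400 0.250 0.300
    # 2 0.125 0.750 0.100 0.200

    def yolo_to_pixel_box(line, img_w, img_h):
        """Convert one YOLO label line to (class_id, x_min, y_min, x_max, y_max) in pixels."""
        class_id, xc, yc, w, h = line.split()
        xc, yc = float(xc) * img_w, float(yc) * img_h
        w, h = float(w) * img_w, float(h) * img_h
        return int(class_id), xc - w / 2, yc - h / 2, xc + w / 2, yc + h / 2

    print(yolo_to_pixel_box("0 0.500 0.400 0.250 0.300", 416, 416))
    # (0, 156.0, 104.0, 260.0, 228.8)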


yolo_metric's Issues

yolov4-tiny not detecting objects

Hello! Can you please tell me whether yolov4-tiny or yolov3-tiny are supported in your project? I can't detect any objects. I am using custom configs.

I am getting this error when running on Colab. How can I fix it?

Try to load cfg: /content/gdrive/MyDrive/YOLO_metric/cfg/yolov4-custom.cfg, weights: /content/gdrive/MyDrive/YOLO_metric/data/mydata/darknet3_zoo/darknet_train_recycle09_run00/backup/yolov4-custom_best.weights, clear = 0
compute_capability = 370, cudnn_half = 0
layer filters size input output
0 Couldn't find activation function mish, going with ReLU
conv 32 3 x 3 / 1 416 x 416 x 3 -> 416 x 416 x 32 0.299 BF
1 Couldn't find activation function mish, going with ReLU
conv 64 3 x 3 / 2 416 x 416 x 32 -> 208 x 208 x 64 1.595 BF
[... the "Couldn't find activation function mish, going with ReLU" warning repeats for every backbone layer through layer 104; layers 105-160 build the SPP and upsampling head, and each of the three yolo layers (139, 150, 161) reports Unused field entries for 'scale_x_y', 'iou_thresh', 'cls_normalizer', 'iou_normalizer', 'iou_loss = ciou', 'nms_kind = greedynms', 'beta_nms', and 'max_delta' ...]
161 yolo
Total BFLOPS 59.562
Allocate additional workspace_size = 99.68 MB
Loading weights from /content/gdrive/MyDrive/YOLO_metric/data/mydata/darknet3_zoo/darknet_train_recycle09_run00/backup/yolov4-custom_best.weights...
seen 64
CUDA status Error: file: ./src/convolutional_kernels.cu : () : line: 143 : build time: Sep 11 2021 - 11:21:20
CUDA Error: no kernel image is available for execution on the device
python3: : Unknown error -1373889909
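
For context: "no kernel image is available for execution on the device" generally means the darknet binary was compiled without -gencode flags matching the runtime GPU's architecture (compute_capability = 370 here corresponds to a compute 3.7 device such as Colab's Tesla K80), so rebuilding darknet with a matching ARCH entry in its Makefile is the usual fix; the repeated "Couldn't find activation function mish" warnings also suggest the build predates yolov4's mish support. A quick sketch for confirming the GPU's compute capability before rebuilding (assumes nvidia-smi is on PATH and the driver is recent enough to support the compute_cap query field):

    import subprocess

    # Print each visible GPU's name and compute capability, e.g. "Tesla K80, 3.7".
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,compute_cap", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())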

Inference is taking a long time

Dear Contributor,

I have used your repository to get a confusion matrix for a YOLOv4 object detection model. However, when I execute "python modulized/save_label_as_yolo_format.py", inference takes a long time. Kindly advise what needs to be done to expedite the process.

TypeError: * wants int - metric_module.py

I wanted to use this method by calling 'python3 modulized/compare_simple.py', but I got an error:
'TypeError: * wants int'
in the file 'modulized/module/metric_module.py' at line 234:
"content2 += "%*sPrediction\n"%(12+(len(content2)-10)/2,"")"
The problem is that '12+(len(content2)-10)/2' is not necessarily an integer.
My solution is to rewrite line 234 as:
"content2 += "%*sPrediction\n"%(round(12+(len(content2)-10)/2),"")"

But there could be other solutions as well.
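
For the record, the '%*s' conversion consumes its width argument as an int, and Python 3's '/' always produces a float, which is exactly what triggers this error. A standalone reproduction (the string content is illustrative):

    content2 = "Groundtruth"                  # any string; only its length matters
    width = 12 + (len(content2) - 10) / 2     # a float under Python 3 division

    # content2 += "%*sPrediction\n" % (width, "")       # TypeError: * wants int
    content2 += "%*sPrediction\n" % (int(width), "")    # casting the width fixes it
    print(repr(content2))

int(), round(), or integer division with '//' all work here; the reporter's round() variant serves the same purpose.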

Data setting

Hi @whynotw, can you please explain the data setup in more detail: how should I set things up if I have ground truth and prediction coordinates for two or more classes?

KeyError: b'category_name'

Hi, I tried to use this method but encountered an error while running the modulized/save_label_as_yolo_format.py file.
I get an error at line 128: "KeyError: b'category_name'"
Line 128 is: 'label = DICT_CLASS[result[0]]'
The error occurs because, for some reason, result[0] is a byte string, while the keys of DICT_CLASS are ordinary strings.

The solution is to modify 'label = DICT_CLASS[result[0]]' to 'label = DICT_CLASS[result[0].decode()]'
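
The underlying behaviour is easy to reproduce: in Python 3, bytes and str never compare equal, so a bytes key will not match a str-keyed dict. A minimal sketch (the DICT_CLASS contents are illustrative):

    DICT_CLASS = {"category_name": 0}       # str keys, as in the repo's class map
    result = [b"category_name"]             # bytes, e.g. read from a binary source

    # label = DICT_CLASS[result[0]]         # KeyError: b'category_name'
    label = DICT_CLASS[result[0].decode()]  # decode() yields the matching str key
    print(label)                            # 0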
