mixaill76 / faster_coco_eval

A continuation of the abandoned fast-coco-eval project.

Home Page: https://pypi.org/project/faster-coco-eval/

License: Apache License 2.0


faster_coco_eval's Introduction

Faster-COCO-Eval

Disclaimer

I use this project often, but it was abandoned and had no public repository on GitHub, and parts of it remained unfinished for a long time. I implemented some of the original author's ideas and decided to make the results publicly available.

Install

Basic installation, identical in behavior to pycocotools:

pip install faster-coco-eval

Additional visualization options

Only one additional package is needed: opencv-python-headless.

pip install faster-coco-eval[extra]

Faster-COCO-Eval base

This package wraps Facebook's C++ implementation of the COCO-eval operations found in the pycocotools package. This implementation greatly speeds up the evaluation of COCO's AP metrics, especially when dealing with a high number of instances per image.

Comparison

We benchmarked on a test set of 5000 images from the COCO val dataset. Testing was carried out using the mmdetection framework and its eval_metric.py script; the results are presented below.

A visualization of the benchmark (colab_example.ipynb) is available in the examples/comparison directory.

Summary for 5000 imgs:

| Type | faster-coco-eval | pycocotools | Profit |
|------|-----------------:|------------:|-------:|
| bbox | 5.812            | 22.72       | 3.909  |
| segm | 7.413            | 24.434      | 3.296  |
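The "Profit" column is simply the ratio of the two timings. A quick sanity check, assuming both timing columns are wall-clock measurements of the same workload:

```python
# Reproduce the "Profit" (speedup) column from the timings in the table above.
timings = {"bbox": (5.812, 22.72), "segm": (7.413, 24.434)}
for iou_type, (fast, slow) in timings.items():
    print(f"{iou_type}: {slow / fast:.3f}x")  # bbox: 3.909x, segm: 3.296x
```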

Features

This library provides not only validation functions but also error-visualization functions, including visualization of errors directly on the image.
See the examples and the Wiki for details.

Usage

Code examples for using the library are available on the Wiki

Examples

Update history

Available in history.md.

Star History

Star History Chart

License

The original module was licensed under Apache 2.0, and this project continues with the same license. Distributed under the Apache License, Version 2.0; see the license file for more information.

Citation

If you use this benchmark in your research, please cite this project.

@article{faster-coco-eval,
  title   = {{Faster-COCO-Eval}: Faster interpretation of the original COCOEval},
  author  = {MiXaiLL76},
  year    = {2024}
}

faster_coco_eval's People

Contributors

johannestheo, mixaill76


faster_coco_eval's Issues

Error when importing faster_coco_eval in Python 3.7.5

I am working with Python 3.7.5:

Python 3.7.5
[GCC 7.3.0] :: Anaconda, Inc. on linux

I got an error when importing faster_coco_eval:

>>> import faster_coco_eval
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/MY_PYTHONPATH/lib/python3.7/site-packages/faster_coco_eval/__init__.py", line 1, in <module>
    from .core.faster_eval_api import COCOeval_faster
  File "/MY_PYTHONPATH/lib/python3.7/site-packages/faster_coco_eval/core/faster_eval_api.py", line 9, in <module>
    from .cocoeval import COCOeval
  File "<fstring>", line 1
    (iouType=)
            ^
SyntaxError: invalid syntax

And I can reproduce the error when I try:

>>> print(f"{math.pi=}")
  File "<fstring>", line 1
    (math.pi=)
            ^
SyntaxError: invalid syntax

>>> print(f"{math.pi}")
3.141592653589793

It seems Python 3.7 does not support this kind of f-string (f"{math.pi=}"). Do you have a plan to fix this issue?
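For context, the self-documenting {expr=} specifier was added to f-strings in Python 3.8, which is why the import fails on 3.7. A 3.7-compatible way to produce the same message (a sketch, not the library's actual fix) is to spell out the name explicitly:

```python
import math

# f"{math.pi=}" is valid only on Python >= 3.8 (self-documenting f-strings).
# On Python 3.6/3.7, write the variable name into the literal instead:
msg = f"math.pi={math.pi!r}"
print(msg)  # math.pi=3.141592653589793
```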

@MiXaiLL76

I got an exception

Hi @MiXaiLL76 !

I got an exception on this line:

https://github.com/MiXaiLL76/faster_coco_eval/blob/611749b834f70d0e7969cd2af42e3ed1b76900d2/faster_coco_eval/core/faster_eval_api.py#L140C13-L140C13

self.ground_truth_orig_id = np.array(self.eval['ground_truth_orig_id']).reshape(self.ground_truth_shape)
ValueError: cannot reshape array of size 39720 into shape (20,4,10,newaxis)

After that, the rest was calculated well. Could you explain what this block of code does and what I am doing wrong?

It seems that faster_coco_eval 1.3.1 has a couple of things that must be fixed.

First, in faster_coco_eval/extra/curves.py at https://github.com/MiXaiLL76/faster_coco_eval/blob/9b064a2e79a96d5d1b26c221d3b9b323d5d87d71/faster_coco_eval/extra/curves.py#L363 the "=" should be removed:
f"The calculation may not be accurate. No intersection of classes. {self.useCats=}" ---> f"The calculation may not be accurate. No intersection of classes. {self.useCats}"

Second, in faster_coco_eval/core/cocoeval.py at https://github.com/MiXaiLL76/faster_coco_eval/blob/9b064a2e79a96d5d1b26c221d3b9b323d5d87d71/faster_coco_eval/core/cocoeval.py#L577 the "=" should be removed:
f'{iouType=} not supported' ---> f'{iouType} not supported'

Error when instantiating a Curves object

Describe the bug
I can't instantiate a Curves object.

To Reproduce

from faster_coco_eval import COCO
from faster_coco_eval.extra import Curves

cocoGt = COCO('gt.json')
cocoDt = cocoGt.loadRes('pred.json')

cur = Curves(cocoGt, cocoDt, iou_tresh=0.5, iouType='bbox')

KeyError Traceback (most recent call last)
Cell In[2], line 1
----> 1 cur = Curves(coco_gt, coco_gt, iou_tresh=0.5, iouType='bbox')

File ~/mambaforge/envs/fiftyone/lib/python3.10/site-packages/faster_coco_eval/extra/extra.py:33, in ExtraEval.init(self, cocoGt, cocoDt, iouType, min_score, iou_tresh, recall_count, useCats)
30 assert self.cocoGt is not None, "cocoGt is empty"
32 if (self.cocoGt is not None) and (self.cocoDt is not None):
---> 33 self.evaluate()

File ~/mambaforge/envs/fiftyone/lib/python3.10/site-packages/faster_coco_eval/extra/extra.py:52, in ExtraEval.evaluate(self)
48 cocoEval.params.useCats = int(self.useCats) # Выключение labels
50 self.cocoEval = cocoEval
---> 52 cocoEval.evaluate()
53 cocoEval.accumulate()
55 self.eval = cocoEval.eval

File ~/mambaforge/envs/fiftyone/lib/python3.10/site-packages/faster_coco_eval/core/faster_eval_api.py:54, in COCOeval_faster.evaluate(self)
52 elif p.iouType == "keypoints":
53 computeIoU = self.computeOks
---> 54 self.ious = {
55 (imgId, catId): computeIoU(imgId, catId)
56 for imgId in p.imgIds
57 for catId in catIds
58 } # bottleneck
...
--> 206 inds = np.argsort([-d["score"] for d in dt], kind="mergesort")
207 dt = [dt[i] for i in inds]
208 if len(dt) > p.maxDets[-1]:

KeyError: 'score'

Expected behavior
I expect this command to run without errors.

Additional context
The ground truths and annotation files are the same as in my previous issue (#19).
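For what it's worth, that KeyError suggests the prediction file contains detections without a "score" field, which the pycocotools-style loadRes/evaluate path expects on every detection. A small stdlib-only pre-flight check could catch this before evaluation (missing_scores is a hypothetical helper for illustration, not part of the library):

```python
def missing_scores(detections):
    """Return indices of detections that lack a numeric 'score' field."""
    return [i for i, det in enumerate(detections)
            if not isinstance(det.get("score"), (int, float))]

# Example COCO-style detection results; the second entry has no "score".
preds = [
    {"image_id": 1, "category_id": 2, "bbox": [0, 0, 10, 10], "score": 0.9},
    {"image_id": 1, "category_id": 2, "bbox": [5, 5, 10, 10]},
]
print(missing_scores(preds))  # [1]
```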

Error in accumulate: math_matches error: Assertion error in line 157

Describe the bug
cocoEval.accumulate() is not working. It returns this error:

 math_matches error: 
Traceback (most recent call last):
  File "/home/xxx/mambaforge/envs/fiftyone/lib/python3.10/site-packages/faster_coco_eval/core/faster_eval_api.py", line 157, in accumulate
    assert self.detection_matches.shape[1] == len(self.cocoDt.anns)
AssertionError

To Reproduce

from faster_coco_eval import COCO, COCOeval_faster
coco_gt = COCO('gt.json')
coco_dt = coco_gt.loadRes('preds.json')

cocoEval = COCOeval_faster(coco_gt, coco_dt, iouType='bbox')
cocoEval.evaluate()
cocoEval.accumulate()

Expected behavior
This command should work without errors. It works correctly with the pycocotools implementation.

Additional context
Could this happen because an image contains no ground truths? Just an idea I'm throwing out; I'm not sure it's related.
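If that hypothesis is right, it can be checked without the library: list the image ids that appear in the predictions but have no ground-truth annotations (images_without_gt is a hypothetical helper for illustration):

```python
def images_without_gt(gt_annotations, detections):
    """Image ids that have detections but no ground-truth annotations."""
    gt_images = {ann["image_id"] for ann in gt_annotations}
    return sorted({det["image_id"] for det in detections} - gt_images)

# Toy COCO-style data: image 2 has a detection but no ground truth.
gt_annotations = [{"image_id": 1, "bbox": [0, 0, 5, 5], "category_id": 1}]
detections = [{"image_id": 1, "score": 0.9}, {"image_id": 2, "score": 0.8}]
print(images_without_gt(gt_annotations, detections))  # [2]
```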

Files
gt.json
preds.json
