
Comments (20)

banthungprong commented on May 30, 2024

Maybe you didn't pull the new version first and are still using the old one? Like:
docker compose pull
and then:
docker compose up -d

blakeblackshear commented on May 30, 2024

Make sure you read the release notes. I can see at a glance that your config mapping needs to be updated. https://github.com/blakeblackshear/frigate/releases/tag/v0.13.0
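
As a rough illustration of what that config mapping change means in practice (a sketch only, using paths that appear later in this thread, not text from the release notes): the custom trt-models bind mount is no longer needed in 0.13, because the container generates the model into the /config mount for you:

    volumes:
      - /home/mike/frigate/:/config   # generated model ends up in /config/model_cache/tensorrt/
      # - /home/mike/frigate/trt-models:/trt-models   # old 0.12-style mount, no longer needed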

NickM-27 commented on May 30, 2024

The release notes covered this. You just need to follow the docs and change your config so a new model is generated for you https://docs.frigate.video/configuration/object_detectors#nvidia-tensorrt-detector

NickM-27 commented on May 30, 2024

you'll need to change the env var, the model that is made by default is

model:
  path: /config/model_cache/tensorrt/yolov7-320.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320
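
(As a sketch of the surrounding pieces — the detector section in the Frigate config and the compose environment variable that selects which model gets generated — using the names that appear later in this thread:)

# frigate config.yml
detectors:
  tensorrt:
    type: tensorrt
    device: 0  # first GPU

# docker-compose environment (yolov7-320 is what gets generated if nothing is set)
environment:
  - YOLO_MODELS=yolov7-320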

FigJam23 commented on May 30, 2024

It randomly started generating other YOLO files, which is very strange. I spent hours and hours getting these models to generate. I guess the YOLO file server may have been down.

FigJam23 commented on May 30, 2024

Ah, I don't think I've done the pull before. I'll give it a go tomorrow. Does 13 support TensorRT?

FigJam23 commented on May 30, 2024

Ah right, thanks for letting me know.

FigJam23 commented on May 30, 2024

OK, I can't get past this error now. Could someone point me in the correct direction?

model:
  path: trt-models/yolov7-tiny-416.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 416
  height: 416

version: '3.9'
services:
  frigate:
    container_name: frigate
    privileged: true # this may not be necessary for all setups
    restart: unless-stopped
    image: ghcr.io/blakeblackshear/frigate:stable-tensorrt #ghcr.io/blakeblackshear/frigate:0.13.0-tensorrt #ghcr.io/blakeblackshear/frigate:stable-tensorrt  #ghcr.io/blakeblackshear/frigate:0.12.1 #ghcr.io/blakeblackshear/frigate:stable # blakeblackshear/frigate:stable
    deploy:    # <------------- Add this section
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0'] # this is only needed when using multiple GPUs
 
              capabilities: [gpu]
 # deploy:    # <------------- Add this section
    shm_size: "1000mb" # update for your cameras based on calculation
    devices:
      - /dev/nvidia0:/dev/nvidia0
      - /dev/bus/usb:/dev/bus/usb # passes the USB Coral, needs to be modified for other versions
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /home/mike/frigate/:/config
      - /mnt/hubfiles:/media/frigate:rw
    #  - /home/mike/frigate:/db
      - /home/mike/frigate/trt-models:/trt-models
      - type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
        target: /tmp/cache
        tmpfs:
          size: 2000000000      
      
    ports:
      - "5000:5000"
      - "1935:1935"
      - "8554:8554"
    environment:
     # - FRIGATE_RTSP_PASSWORD: "1234"  #modify to whatever if using rtsp  
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=all

      - PLUS_API_KEY=f9be1e69-c600-4c0b-915b-489bb82cc4d2:c2fea8c97fc3784e6a7491338c21040b14ff9587

2024-02-14 19:23:28.867963808  [INFO] Preparing Frigate...
2024-02-14 19:23:28.883370091  [INFO] Starting Frigate...
2024-02-14 19:23:29.892125929  [2024-02-14 19:23:29] frigate.app                    INFO    : Starting Frigate (0.13.1-34fb1c2)
2024-02-14 19:23:29.979661484  [2024-02-14 19:23:29] peewee_migrate.logs            INFO    : Starting migrations
2024-02-14 19:23:29.982756594  [2024-02-14 19:23:29] peewee_migrate.logs            INFO    : There is nothing to migrate
2024-02-14 19:23:29.989213165  [2024-02-14 19:23:29] frigate.app                    INFO    : Recording process started: 456
2024-02-14 19:23:29.990784511  [2024-02-14 19:23:29] frigate.app                    INFO    : go2rtc process pid: 98
2024-02-14 19:23:30.015064899  [2024-02-14 19:23:30] frigate.app                    INFO    : Output process started: 468
2024-02-14 19:23:30.023115923  [2024-02-14 19:23:30] detector.tensorrt              INFO    : Starting detection process: 466
2024-02-14 19:23:30.092462146  Process detector:tensorrt:
2024-02-14 19:23:30.092949693  Traceback (most recent call last):
2024-02-14 19:23:30.093012611    File "/usr/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
2024-02-14 19:23:30.093014001      self.run()
2024-02-14 19:23:30.093061944    File "/usr/lib/python3.9/multiprocessing/process.py", line 108, in run
2024-02-14 19:23:30.093063251      self._target(*self._args, **self._kwargs)
2024-02-14 19:23:30.093081056    File "/opt/frigate/frigate/object_detection.py", line 102, in run_detector
2024-02-14 19:23:30.093082173      object_detector = LocalObjectDetector(detector_config=detector_config)
2024-02-14 19:23:30.093096096    File "/opt/frigate/frigate/object_detection.py", line 53, in __init__
2024-02-14 19:23:30.093097129      self.detect_api = create_detector(detector_config)
2024-02-14 19:23:30.093143749    File "/opt/frigate/frigate/detectors/__init__.py", line 18, in create_detector
2024-02-14 19:23:30.093144905      return api(detector_config)
2024-02-14 19:23:30.093160095    File "/opt/frigate/frigate/detectors/plugins/tensorrt.py", line 220, in __init__
2024-02-14 19:23:30.093161100      self.engine = self._load_engine(detector_config.model.path)
2024-02-14 19:23:30.093172985    File "/opt/frigate/frigate/detectors/plugins/tensorrt.py", line 88, in _load_engine
2024-02-14 19:23:30.093174011      with open(model_path, "rb") as f, trt.Runtime(self.trt_logger) as runtime:
2024-02-14 19:23:30.093199127  FileNotFoundError: [Errno 2] No such file or directory: 'trt-models/yolov7-tiny-416.trt'
2024-02-14 19:23:30.093262805  Exception ignored in: <function TensorRtDetector.__del__ at 0x7f32447383a0>
2024-02-14 19:23:30.093291673  Traceback (most recent call last):
2024-02-14 19:23:30.093315045    File "/opt/frigate/frigate/detectors/plugins/tensorrt.py", line 239, in __del__
2024-02-14 19:23:30.093448881      if self.outputs is not None:
2024-02-14 19:23:30.093490377  AttributeError: 'TensorRtDetector' object has no attribute 'outputs'
2024-02-14 19:23:30.094976190  [2024-02-14 19:23:30] frigate.app                    INFO    : Camera processor started for driveway_cam: 476
2024-02-14 19:23:30.105362276  [2024-02-14 19:23:30] frigate.app                    INFO    : Camera processor started for front_entry: 477
2024-02-14 19:23:30.105623282  [2024-02-14 19:23:30] frigate.app                    INFO    : Camera processor started for front_deck: 478
2024-02-14 19:23:30.107023823  [2024-02-14 19:23:30] frigate.app                    INFO    : Camera processor started for back_shed: 483
2024-02-14 19:23:30.107027606  [2024-02-14 19:23:30] frigate.app                    INFO    : Camera processor started for sand_pit: 485
2024-02-14 19:23:30.107032833  [2024-02-14 19:23:30] frigate.app                    INFO    : Camera processor started for 360cam: 487
2024-02-14 19:23:30.115488028  [2024-02-14 19:23:30] frigate.app                    INFO    : Capture process started for driveway_cam: 492
2024-02-14 19:23:30.115493331  [2024-02-14 19:23:30] frigate.app                    INFO    : Capture process started for front_entry: 495
2024-02-14 19:23:30.115874027  [2024-02-14 19:23:30] frigate.app                    INFO    : Capture process started for front_deck: 501
2024-02-14 19:23:30.124408907  [2024-02-14 19:23:30] frigate.app                    INFO    : Capture process started for back_shed: 505
2024-02-14 19:23:30.134431675  [2024-02-14 19:23:30] frigate.app                    INFO    : Capture process started for sand_pit: 511
2024-02-14 19:23:30.143170598  [2024-02-14 19:23:30] frigate.app                    INFO    : Capture process started for 360cam: 516
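
(A note on the traceback above: the FileNotFoundError is for the relative path trt-models/yolov7-tiny-416.trt, while the compose file mounts that folder at /trt-models inside the container, so the open() call never finds it. Presumably the model path would need to be absolute, for example the old-style mount or the 0.13 model_cache location — a sketch only:)

model:
  path: /trt-models/yolov7-tiny-416.trt                       # absolute path to the old bind mount
  # path: /config/model_cache/tensorrt/yolov7-tiny-416.trt    # 0.13-style generated location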

FigJam23 commented on May 30, 2024

I've temporarily disabled the models and GPU config and am just using my Coral. It's working well, no errors yet; everything seems fine using just the Coral, but I want to use my GPU, as I believe it has better detection and I don't care about power.
So how should I be updating my model files directory in the config? It looks like my model config is the issue.

FigJam23 commented on May 30, 2024

[Screenshot: Screenshot_20240214_200019_Chrome]

Looks like the Coral and GPU are working together in this image, without the model library config. I didn't think running both the Coral and the NVIDIA GPU worked?

NickM-27 commented on May 30, 2024

You are using the Coral for object detection and the GPU for decoding, which is recommended.
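
(As a rough sketch of that split, assuming Frigate's NVIDIA ffmpeg preset: the Coral handles inference via the edgetpu detector while video decoding is offloaded to the GPU:)

detectors:
  coral1:
    type: edgetpu
    device: usb

ffmpeg:
  hwaccel_args: preset-nvidia-h264   # hardware-accelerated decode on the GPU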

FigJam23 commented on May 30, 2024

I'd prefer to use my GPU for object detection, like I had on 12 before moving to 13, using the TensorRT models, as it has worked better for my setup. The only info I can find after hours going through the docs and the issues forum is that the YOLO models don't work on 13 as-is and you need to make some changes or change a directory in the config; I'm unsure.

FigJam23 commented on May 30, 2024

OK, so I would update my config
from: trt-models/yolov7-tiny-416.trt
to: /config/model_cache/yolov7-tiny-416.trt
If I'm correct, it will automatically generate the yolov7-tiny-416.trt model into the /config/model_cache dir. Is that correct, or do I need to add the YOLO file to that dir?

FigJam23 commented on May 30, 2024

Thank you for your help, really appreciated. Just to be 100 percent sure I do this and understand correctly: if I change /yolov7-320.trt to another model that's available, Frigate will automatically download the model I specified and put it into the /config/model_cache/tensorrt/ folder/dir?

NickM-27 commented on May 30, 2024

You have to change the config and the environment variable
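
(In other words, something along these lines, keeping the model name and resolution consistent in both places — a sketch only:)

# docker-compose environment
  - YOLO_MODELS=yolov7x-640

# frigate config.yml
model:
  path: /config/model_cache/tensorrt/yolov7x-640.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 640
  height: 640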

FigJam23 commented on May 30, 2024

So I've done this:

 environment:
     # - FRIGATE_RTSP_PASSWORD: "1234"  #modify to whatever if using rtsp  
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=all
      - YOLO_MODELS=yolov7x-640

and

model:
  path: /config/model_cache/tensorrt/yolov7x-640.trt     #trt-models/yolov7-tiny-416.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 416
  height: 416

but I'm getting this in the error logs, omg:

frigate$ docker-compose logs
frigate  | s6-rc: info: service s6rc-fdholder: starting
frigate  | s6-rc: info: service s6rc-oneshot-runner: starting
frigate  | s6-rc: info: service s6rc-oneshot-runner successfully started
frigate  | s6-rc: info: service fix-attrs: starting
frigate  | s6-rc: info: service s6rc-fdholder successfully started
frigate  | s6-rc: info: service fix-attrs successfully started
frigate  | s6-rc: info: service legacy-cont-init: starting
frigate  | s6-rc: info: service legacy-cont-init successfully started
frigate  | s6-rc: info: service trt-model-prepare: starting
frigate  | s6-rc: info: service log-prepare: starting
frigate  | s6-rc: info: service log-prepare successfully started
frigate  | Generating the following TRT Models: yolov7x-640
frigate  | Downloading yolo weights
frigate  | s6-rc: info: service nginx-log: starting
frigate  | s6-rc: info: service go2rtc-log: starting
frigate  | s6-rc: info: service frigate-log: starting
frigate  | s6-rc: info: service nginx-log successfully started
frigate  | s6-rc: info: service go2rtc-log successfully started
frigate  | s6-rc: info: service go2rtc: starting
frigate  | s6-rc: info: service frigate-log successfully started
frigate  | s6-rc: info: service go2rtc successfully started
frigate  | s6-rc: info: service go2rtc-healthcheck: starting
frigate  | s6-rc: info: service go2rtc-healthcheck successfully started
frigate  | 2024-02-15 16:15:09.807817155  [INFO] Preparing new go2rtc config...
frigate  | 2024-02-15 16:15:10.031540475  [INFO] Starting go2rtc...
frigate  | 2024-02-15 16:15:10.084627126  16:15:10.084 INF go2rtc version 1.8.4 linux/amd64
frigate  | 2024-02-15 16:15:10.084949847  16:15:10.084 INF [api] listen addr=:1984
frigate  | 2024-02-15 16:15:10.085001997  16:15:10.084 INF [rtsp] listen addr=:8554
frigate  | 2024-02-15 16:15:10.085163211  16:15:10.085 INF [webrtc] listen addr=:8555
frigate  | 2024-02-15 16:15:19.805892905  [INFO] Starting go2rtc healthcheck service...
frigate  | 
frigate  | Creating yolov7x-640.cfg and yolov7x-640.weights
frigate  | 
frigate  | Done.
frigate  | 
frigate  | Generating yolov7x-640.trt. This may take a few minutes.
frigate  | 
frigate  | /usr/local/src/tensorrt_demos/yolo/onnx_to_tensorrt.py:147: DeprecationWarning: Use network created with NetworkDefinitionCreationFlag::EXPLICIT_BATCH flag instead.
frigate  |   builder.max_batch_size = MAX_BATCH_SIZE
frigate  | /usr/local/src/tensorrt_demos/yolo/onnx_to_tensorrt.py:149: DeprecationWarning: Use set_memory_pool_limit instead.
frigate  |   config.max_workspace_size = 1 << 30
frigate  | /usr/local/src/tensorrt_demos/yolo/onnx_to_tensorrt.py:172: DeprecationWarning: Use build_serialized_network instead.
frigate  |   engine = builder.build_engine(network, config)
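
(For what it's worth, the lines above appear to be DeprecationWarnings printed while the engine is being built, rather than fatal errors. Also, the model section above still has width/height 416 while the path points at a 640 model; presumably those would need to match, e.g.:)

  width: 640
  height: 640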

FigJam23 commented on May 30, 2024

Now, because I kept getting this error, I reverted back to my working config, and now Frigate won't start and I still get the error.
So now everything is broken, as Frigate doesn't seem to be able to pull the default model file, and even if I add YOLO_MODELS=""
to the environment, it just hangs. I don't know what to do now, omg. I so hope I don't have to reimage my OS, arrrrrrrrr.

docker-compose logs
frigate  | s6-rc: info: service s6rc-fdholder: starting
frigate  | s6-rc: info: service s6rc-oneshot-runner: starting
frigate  | s6-rc: info: service s6rc-oneshot-runner successfully started
frigate  | s6-rc: info: service fix-attrs: starting
frigate  | s6-rc: info: service s6rc-fdholder successfully started
frigate  | s6-rc: info: service fix-attrs successfully started
frigate  | s6-rc: info: service legacy-cont-init: starting
frigate  | s6-rc: info: service legacy-cont-init successfully started
frigate  | s6-rc: info: service trt-model-prepare: starting
frigate  | s6-rc: info: service log-prepare: starting
frigate  | Generating the following TRT Models: yolov7-320
frigate  | Downloading yolo weights
frigate  | s6-rc: info: service log-prepare successfully started
frigate  | s6-rc: info: service nginx-log: starting
frigate  | s6-rc: info: service go2rtc-log: starting
frigate  | s6-rc: info: service frigate-log: starting
frigate  | s6-rc: info: service nginx-log successfully started
frigate  | s6-rc: info: service go2rtc-log successfully started
frigate  | s6-rc: info: service go2rtc: starting
frigate  | s6-rc: info: service frigate-log successfully started
frigate  | s6-rc: info: service go2rtc successfully started
frigate  | s6-rc: info: service go2rtc-healthcheck: starting
frigate  | s6-rc: info: service go2rtc-healthcheck successfully started
frigate  | 2024-02-15 17:43:37.725544850  [INFO] Preparing new go2rtc config...
frigate  | 2024-02-15 17:43:37.929703130  [INFO] Starting go2rtc...
frigate  | 2024-02-15 17:43:37.981624034  17:43:37.981 INF go2rtc version 1.8.4 linux/amd64
frigate  | 2024-02-15 17:43:37.982020482  17:43:37.981 INF [api] listen addr=:1984
frigate  | 2024-02-15 17:43:37.982022143  17:43:37.981 INF [rtsp] listen addr=:8554
frigate  | 2024-02-15 17:43:37.982160709  17:43:37.982 INF [webrtc] listen addr=:8555
frigate  | 
frigate  | Creating yolov7-320.cfg and yolov7-320.weights
frigate  | 
frigate  | Done.
frigate  | 2024-02-15 17:43:47.720917256  [INFO] Starting go2rtc healthcheck service...
frigate  | 
frigate  | Generating yolov7-320.trt. This may take a few minutes.
frigate  | 
frigate  | /usr/local/src/tensorrt_demos/yolo/onnx_to_tensorrt.py:147: DeprecationWarning: Use network created with NetworkDefinitionCreationFlag::EXPLICIT_BATCH flag instead.
frigate  |   builder.max_batch_size = MAX_BATCH_SIZE
frigate  | /usr/local/src/tensorrt_demos/yolo/onnx_to_tensorrt.py:149: DeprecationWarning: Use set_memory_pool_limit instead.
frigate  |   config.max_workspace_size = 1 << 30
frigate  | /usr/local/src/tensorrt_demos/yolo/onnx_to_tensorrt.py:172: DeprecationWarning: Use build_serialized_network instead.
frigate  |   engine = builder.build_engine(network, config)
mike@mike-System-Product-Name:~/frigate$ 


version: '3.9'
services:
  frigate:
    container_name: frigate
    privileged: true # this may not be necessary for all setups
    restart: unless-stopped
    image: ghcr.io/blakeblackshear/frigate:stable-tensorrt #ghcr.io/blakeblackshear/frigate:0.13.0-tensorrt #ghcr.io/blakeblackshear/frigate:stable-tensorrt  #ghcr.io/blakeblackshear/frigate:0.12.1 #ghcr.io/blakeblackshear/frigate:stable # blakeblackshear/frigate:stable
    deploy:    # <------------- Add this section
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0'] # this is only needed when using multiple GPUs
 
              capabilities: [gpu]
 # deploy:    # <------------- Add this section
    shm_size: "1000mb" # update for your cameras based on calculation
    devices:
      - /dev/nvidia0:/dev/nvidia0
      - /dev/bus/usb:/dev/bus/usb # passes the USB Coral, needs to be modified for other versions
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /home/mike/frigate/:/config
      - /mnt/hubfiles:/media/frigate:rw
    #  - /home/mike/frigate:/db
     # - /home/mike/frigate/trt-models:/trt-models
      - type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
        target: /tmp/cache
        tmpfs:
          size: 2000000000      
      
    ports:
      - "5000:5000"
      - "1935:1935"
      - "8554:8554"
    environment:
     # - FRIGATE_RTSP_PASSWORD: "1234"  #modify to whatever if using rtsp  
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=all
     # - YOLO_MODELS=""

detectors:
#  tensorrt:
  #  type: tensorrt
  #  device: 0 # This is the default, select the first GPU
  coral1:
    type: edgetpu
    device: usb
#model:
 # path: /config/model_cache/tensorrt/yolov7-320.trt    #trt-models/yolov7-tiny-416.trt
 # input_tensor: nchw
 # input_pixel_format: rgb
 # width: 320
  #height: 320

FigJam23 commented on May 30, 2024

I had to dump my database and re-pull the v13 image, and everything works with TensorRT on the default YOLO version. If I tried to use another YOLO version again, I got the same error on start and no Frigate UI. I think I'll try dumping the database again and using a different YOLO version in the environment config on first start, and see if I get the errors again.

FigJam23 commented on May 30, 2024

No matter what, if I manually set the YOLO version under the environment, the logs say it's downloading and then say done, then it errors. But no YOLO file is populated in the models dir; the only file in the dir is the default one Frigate downloads if nothing specific is set under the environment options.

NickM-27 commented on May 30, 2024

Going to close this as the original issue is solved. Feel free to create a new issue if something else comes up
