
Multi-Modal Sensor Fusion and Object Tracking for Autonomous Racing

License: GNU General Public License v3.0


Multi-Modal Sensor Fusion and Object Tracking

The following figure outlines the high-level structure of the algorithm, which covers the tasks of multi-modal sensor fusion and object tracking. The algorithm was developed for the Indy Autonomous Challenge 2021 and the Autonomous Challenge at CES 2022 and is part of the software stack of TUM Autonomous Motorsport.

Overview of object fusion and tracking

The sensor fusion handles multiple object lists that originate from different perception pipelines. The perception pipelines work independently of each other and each outputs an individual object list. This algorithm combines the given information into a unified object list. This late fusion approach allows a variable number of perception pipelines to be incorporated without dependencies between them.

The object tracking addresses the estimation of the detected objects' dynamic states, which is realized by an Extended Kalman Filter (EKF) based on a constant turn-rate and velocity (CTRV) model.
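
As a rough illustration of the CTRV prediction step such an EKF relies on, the following Python sketch propagates a state of the form [x, y, yaw, v, yaw_rate] over one cycle. The state layout, the small-turn-rate fallback, and the threshold value are assumptions for illustration, not the exact implementation of this repository.

import numpy as np

def ctrv_predict(state, dt):
    """Propagate a CTRV state [x, y, yaw, v, yaw_rate] by dt seconds (illustrative sketch)."""
    x, y, yaw, v, yaw_rate = state
    if abs(yaw_rate) > 1e-6:
        # Constant turn rate: the object moves along a circular arc.
        x += v / yaw_rate * (np.sin(yaw + yaw_rate * dt) - np.sin(yaw))
        y += v / yaw_rate * (np.cos(yaw) - np.cos(yaw + yaw_rate * dt))
    else:
        # Near-zero turn rate: fall back to straight-line motion.
        x += v * np.cos(yaw) * dt
        y += v * np.sin(yaw) * dt
    yaw += yaw_rate * dt
    return np.array([x, y, yaw, v, yaw_rate])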

Requirements

  • OS: Ubuntu 22.04 LTS
  • Docker: 20.10.17
  • Docker Compose: v2.6.1
  • Python: 3.8
  • ROS2: galactic

Installation

Clone repository:

git clone https://github.com/TUMFTM/FusionTracking.git

Setup virtual environment and install requirements:

python3.8 -m venv venv
source venv/bin/activate
pip install -r requirements.txt

Install the TeX extensions, which are necessary to plot with the desired font:

sudo apt-get install texlive-latex-extra texlive-fonts-recommended dvipng cm-super

Data and Evaluation

The evaluation is conducted entirely with real-world data of team TUM Autonomous Motorsport from the AC@CES 2022. The recorded raw data of all tracking inputs, stored in rosbags, is available open source (Link, uncompressed size: 16.4 GB). The data processing and evaluation procedure is described in the README. Follow the described steps to reproduce the evaluation.

Docker Image

It is recommended to run the ROS2 node of the module in a Docker container. To build the related image, execute:

docker build --pull --tag <image_name>:<tag> .
# e.g. docker build --pull --tag tracking:0.0.1 .

To run the container and launch the ROS2 node, run:

docker run <image_name>:<tag> ros2 launch tracking tracking.launch.py
# e.g. docker run tracking:0.0.1 ros2 launch tracking tracking.launch.py

It is recommended to mount a volume to save the logs produced during node runs (see replay.yml for an example). Add additional parameters to the ros2 launch command if desired; see the section Parameters and Files below. For further details about Docker and ROS2, we refer to the official documentation.
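
A minimal sketch of such a run command with a bind mount; the host and container paths are placeholders, see replay.yml for the paths actually used in this repository:

docker run -v <host_log_dir>:<container_log_dir> tracking:0.0.1 ros2 launch tracking tracking.launch.py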

Parameters and Files

Directory: tracking

The directory tracking contains the source code (subfolder: tracking) and ROS2 launch configuration (subfolder: launch) of the module.

Files:

  • tracking/tracking_node.py: ROS2 main file to run the tracking node
  • launch/tracking.launch.py: ROS2 launch file with parameter definition

The launch description contains the following parameters:

  • frequency (float/int, default: 50.0 Hz): Cycle frequency of the ROS2 node
  • max_delay_ego_s (float, default: 0.15 s): Threshold for the ego state message delay
  • checks_enabled (boolean, default: False): If true, failed safety checks trigger the emergency state of the module
  • track (string, default: LVMS): Name of the race track map in use
  • use_sim_time (boolean, default: False): Flag to use simulation time instead of system time
  • ego_raceline (string, default: default): Ego raceline used by the motion planner (default, inner, outer, center)
  • send_prediction (boolean, default: True): If true, a prediction is published

Append them to the end of the docker run command. Example with modified frequency and enabled safety checks:

docker run tracking:0.0.1 ros2 launch tracking tracking.launch.py frequency:=100.0 checks_enabled:=True

Directory: tools

The directory tools contains a script to visualize data logged by the applied ROS2 node. To visualize logged data of the tracking node, run:

python tools/visualize_logfiles.py

Logs must be stored in tracking/tracking/logs to be considered. Enter the number of the desired log, or press enter to visualize the latest one. Add additional arguments if desired (an example invocation is given below the argument list). Without any arguments, the overall tracking process is shown, which is always recommended as a first step.

Additional Arguments:

  • --n_obs: Number of objects for which filter values / states are shown (default: 5)
  • --filter: Visualizes the filter values of the <n_obs> most-seen objects (default: False)
  • --states: Visualizes the dynamic states of the <n_obs> most-seen objects (default: False)
  • --mis_num_obj: Visualizes the matching performance (default: False)
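
For example, to inspect the filter values of the three most frequently seen objects, an invocation along these lines should work (assuming the boolean arguments are plain flags):

python tools/visualize_logfiles.py --filter --n_obs 3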

Module configuration

Besides the ROS2 parameters, which can be changed at runtime, there is an additional configuration file for all static parameters (i.e. parameters that cannot be changed at runtime). This config is stored in main_config_tracking.ini. A parameter description is given in the related README.
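
As a minimal sketch of how such an INI file can be inspected with Python's standard library (the file path below is a placeholder for wherever main_config_tracking.ini resides in your checkout):

import configparser

# List all static parameters defined in the tracking config (illustrative only).
config = configparser.ConfigParser()
config.read("path/to/main_config_tracking.ini")  # placeholder path
for section in config.sections():
    for key, value in config[section].items():
        print(f"{section}.{key} = {value}")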

Qualitative Example

Below is an exemplary visualization of the fusion and tracking algorithm on the Las Vegas Motor Speedway. The inputs are the object lists of the LiDAR and the RADAR pipelines. The raw perception input is filtered to remove objects outside the drivable area, and it is checked whether there are multiple detections of the same object. Afterwards, a distance-based matching associates the multi-modal object lists with the currently tracked objects. A kinematic tracking model based on an Extended Kalman Filter (EKF) is applied to optimize the state estimation. Additionally, the perception delay is compensated by backward-forward integration of the kinematic model to output an updated object list. The validation under real-world conditions in autonomous racing, with stable object fusion and tracking at a maximum speed of 270 km/h, demonstrates the robustness of the proposed approach.

Example of Fusion and Tracking module.
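
To make the distance-based matching step concrete, here is a greedy nearest-neighbor sketch in Python; the 5 m gate and the greedy strategy are illustrative assumptions, not the exact association logic of this module.

import numpy as np

def match_detections(tracked_xy, detected_xy, gate_m=5.0):
    # Greedy nearest-neighbor association between tracked objects and new detections.
    # Returns (matches, unmatched_detection_indices); matches are (track_idx, detection_idx) pairs.
    matches = []
    unmatched = list(range(len(detected_xy)))
    for t_idx, t_pos in enumerate(tracked_xy):
        if not unmatched:
            break
        dists = [np.linalg.norm(np.asarray(t_pos) - np.asarray(detected_xy[d])) for d in unmatched]
        best = int(np.argmin(dists))
        if dists[best] < gate_m:
            matches.append((t_idx, unmatched.pop(best)))
    return matches, unmatched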

Inference time

The average computation time of the module during the evaluation on the given data is 2.75 ms on a single core of an Intel i7-9850H 2.60 GHz CPU, with a 90 %-quantile of 5.26 ms.

References

P. Karle, F. Fent, S. Huch, F. Sauerbeck and M. Lienkamp, "Multi-Modal Sensor Fusion and Object Tracking for Autonomous Racing," in IEEE Transactions on Intelligent Vehicles, doi: 10.1109/TIV.2023.3271624.

BibTex:

@ARTICLE{Karle2023,
  author={Karle, Phillip and Fent, Felix and Huch, Sebastian and Sauerbeck, Florian and Lienkamp, Markus},
  journal={IEEE Transactions on Intelligent Vehicles}, 
  title={Multi-Modal Sensor Fusion and Object Tracking for Autonomous Racing}, 
  year={2023},
  pages={1-13},
  doi={10.1109/TIV.2023.3271624}
}


fusiontracking's Issues

delphi_esr_msgs package error

Hi everyone,
I tried to build the Docker image with this command: docker build --pull --tag tracking:0.0.1 .
But I get the following error:
ERROR: the following packages/stacks could not have their rosdep keys resolved to system dependencies:
tracking: Cannot locate rosdep definition for [delphi_esr_msgs]

I'm a beginner, and searching for the package online, it appears to be unavailable for galactic.
Any advice on how to solve this problem?

AssertionError: track_path does not exist: /dev_ws/install/tracking/lib/tracking/data/map_data/traj_ltpl_cl_LVMS_GPS.csv

Steps taken:

  1. Downloaded the tracking data
  2. Moved tracking_input under data
  3. Moved logs to tracking/tracking
  4. Ran docker run tracking:0.0.1

[INFO] [launch]: All log files can be found below /root/.ros/log/2022-12-17-08-05-57-735074-0413cc0aa91c-1
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [tracking_node-1]: process started with pid [47]
[tracking_node-1] Traceback (most recent call last):
[tracking_node-1] File "/dev_ws/install/tracking/lib/tracking/tracking_node", line 33, in <module>
[tracking_node-1] sys.exit(load_entry_point('tracking==0.0.0', 'console_scripts', 'tracking_node')())
[tracking_node-1] File "/dev_ws/install/tracking/lib/python3.8/site-packages/tracking/tracking_node.py", line 153, in main
[tracking_node-1] tracking_node = TrackingNode()
[tracking_node-1] File "/dev_ws/install/tracking/lib/python3.8/site-packages/tracking/tracking_node.py", line 23, in __init__
[tracking_node-1] super().__init__(path_dict=PATH_DICT)
[tracking_node-1] File "/dev_ws/install/tracking/lib/tracking/utils/ros2_interface.py", line 109, in __init__
[tracking_node-1] self.get_params_from_config(path_dict=path_dict)
[tracking_node-1] File "/dev_ws/install/tracking/lib/tracking/utils/ros2_interface.py", line 287, in get_params_from_config
[tracking_node-1] _ = get_map_data(
[tracking_node-1] File "/dev_ws/install/tracking/lib/tracking/utils/map_utils.py", line 115, in get_map_data
[tracking_node-1] check_path_exist(main_class=main_class, key_str="track_path")
[tracking_node-1] File "/dev_ws/install/tracking/lib/tracking/utils/map_utils.py", line 145, in check_path_exist
[tracking_node-1] assert os.path.exists(main_class.params[key_str]), "{} does not exist: {}".format(
[tracking_node-1] AssertionError: track_path does not exist: /dev_ws/install/tracking/lib/tracking/data/map_data/traj_ltpl_cl_LVMS_GPS.csv
[ERROR] [tracking_node-1]: process has died [pid 47, exit code 1, cmd '/dev_ws/install/tracking/lib/tracking/tracking_node --ros-args --ros-args -r __node:=TRACKING -r __ns:=/ --params-file /tmp/launch_params_m6b06053'].
