
Camera Pose Calibration

Prerequisites

Install ROS: http://wiki.ros.org/ROS/Installation

Description

This package calculates the pose (extrinsics) of a camera with respect to a fixed frame. The resulting transform between the camera frame and the fixed frame is optionally broadcast on tf, connecting the camera to the tf tree. The package expects as input an image and a registered point cloud with the OpenCV asymmetric circle pattern clearly visible in the image.

Assumptions

  • The tf between the camera and the asymmetric circle pattern is static (does not change over time)
  • The tf between the asymmetric circle pattern and the fixed frame is broadcast on tf
  • The asymmetric circle pattern can be detected in the image, and registered point cloud data is available for the corresponding points
  • This package works only for the OpenCV asymmetric circle pattern, because the pose of an asymmetric circle pattern is uniquely defined. Other standard calibration patterns, such as the chessboard and symmetric circles patterns, do not have this property and are therefore not supported by this package.
  • The image data (sensor_msgs::Image) is published on the 'image_color' topic. Note that the image may also be black and white. The topic can be changed from here
  • The depth data (sensor_msgs::PointCloud2) is published on the 'points_registered' topic. This can be changed from here
  • The image and depth data must be published with the exact same timestamp. Otherwise, the package enters an infinite loop. If you cannot ensure exact timing, there is a workaround
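The exact-timestamp requirement exists because the image and cloud are paired with an exact-time policy: a pair is only processed when both stamps are identical. The following pure-Python sketch (the timestamp streams and the tolerance value are illustrative assumptions, not part of this package) shows why slightly offset stamps never produce a pair:

```python
def pair_exact(stamps_a, stamps_b):
    """Pair messages only when their timestamps match exactly (exact-time policy)."""
    known = set(stamps_b)
    return [t for t in stamps_a if t in known]

def pair_approximate(stamps_a, stamps_b, tolerance):
    """Pair each stamp in A with the closest stamp in B within `tolerance` seconds."""
    pairs = []
    for t in stamps_a:
        closest = min(stamps_b, key=lambda s: abs(s - t))
        if abs(closest - t) <= tolerance:
            pairs.append((t, closest))
    return pairs

image_stamps = [0.00, 0.10, 0.20]  # hypothetical image timestamps (seconds)
cloud_stamps = [0.01, 0.11, 0.21]  # clouds published 10 ms later

print(pair_exact(image_stamps, cloud_stamps))              # [] -> the callback never fires
print(pair_approximate(image_stamps, cloud_stamps, 0.05))  # all three frames pair up
```

In other words, unless the driver stamps both messages identically, exact pairing yields nothing and the service waits forever.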

Pattern

The asymmetric circle pattern can be taken from https://docs.opencv.org/2.4/_downloads/acircles_pattern.png

Print this pattern and mount it at a known pose with respect to the fixed frame.

Considering this example image of the asymmetric circle pattern of width 4 and height 11:

  • The right-top corner blob is defined as the origin.
  • The x-axis points to the left.
  • The y-axis points down.
  • The z-axis points from your screen towards you.
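Given the convention above, the blob centers lie on a staggered grid in the pattern plane (z = 0). The sketch below generates OpenCV-style object points for an asymmetric grid; note that the `spacing` step and its exact relation to `pattern_distance` are illustrative assumptions, not taken from this package.

```python
def asymmetric_circle_grid(cols, rows, spacing):
    """Object points for an asymmetric circle grid on the z = 0 pattern plane.

    Odd rows are shifted by half the horizontal period; this staggering is
    what makes the pattern's pose uniquely identifiable.
    """
    points = []
    for row in range(rows):
        for col in range(cols):
            x = (2 * col + row % 2) * spacing  # staggered horizontal position
            y = row * spacing
            points.append((x, y, 0.0))
    return points

grid = asymmetric_circle_grid(4, 11, 0.02)  # width 4, height 11, 20 mm step (illustrative)
print(len(grid))  # 44 blob centers
```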

The tf from the calibration pattern to the fixed frame should be broadcast on tf. One way to do this is by adding the pattern as a link in the URDF and publishing the URDF's transforms (for example with robot_state_publisher). Alternatively, this tf can be published by a static transform publisher.

Services

Three services are provided; choose whichever fits your use case. All three perform the same calibration processing and differ only in how the image and point cloud are passed:

  • /calibrate_call: The image and point cloud data are given in the service call request.
  • /calibrate_file: Paths to an image file and a point cloud file are given and the calibration is performed with these files; useful for debugging with saved data.
  • /calibrate_topic: The image and point cloud are received from their ROS topics.

Configuration parameters are given in the service call request:

  • tag_frame: Name of the asymmetric circle pattern frame on tf
  • target_frame: The tf frame to connect the camera to
  • point_cloud_scale_x: Optional scale factor between image and point cloud (default 1)
  • point_cloud_scale_y: Optional scale factor between image and point cloud (default 1)
  • pattern: Pattern msg with the distance and size of the asymmetric circles pattern; see the Pattern parameters msg section below.
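The scale factors compensate for an image and a registered cloud of different resolutions. As an illustration only (the `cloud_index` helper and the row-major layout are assumptions, not this package's internals), scaling maps a detected pixel into the corresponding cloud point:

```python
def cloud_index(u, v, scale_x, scale_y, cloud_width):
    """Map image pixel (u, v) to a row-major index in the registered point cloud.

    With scale factors of 1.0 (the defaults) the mapping is the identity.
    """
    cx = int(round(u * scale_x))
    cy = int(round(v * scale_y))
    return cy * cloud_width + cx

# Hypothetical setup: a 1280x720 image registered to a 640x360 cloud.
print(cloud_index(100, 50, 0.5, 0.5, 640))   # row 25, column 50 -> 16050
print(cloud_index(100, 50, 1.0, 1.0, 1280))  # identity mapping -> 64100
```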

The detection of the asymmetric circle pattern will commence after calling the service. The pose of the camera to the fixed frame will then be returned and optionally published.

Pattern parameters msg

The patternParametersMsg contains the following fields:

  • pattern_width and pattern_height: The number of circles in horizontal and vertical direction, as defined by OpenCV: https://docs.opencv.org/3.0-beta/doc/tutorials/calib3d/camera_calibration/camera_calibration.html
  • pattern_distance: The circle center-to-center distance of the printed pattern.
  • neighbor_distance: Optional averaging of the point cloud's (x, y, z) values within a given pixel distance of each detected blob center
  • valid_pattern_ratio_threshold: Optional acceptance threshold for the ratio of valid (non-NaN) point cloud points to the total number of pattern points
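To illustrate the last field: points sampled from the registered cloud can be NaN where depth data is missing, and the threshold rejects a detection when too few valid points remain. A small sketch of that check (the `filter_and_check` helper is hypothetical; only the field name follows the msg):

```python
import math

def filter_and_check(points, valid_pattern_ratio_threshold):
    """Keep non-NaN cloud points; reject the detection if too few are valid.

    Returns the valid points, or None when the valid ratio falls below
    valid_pattern_ratio_threshold.
    """
    valid = [p for p in points if not any(math.isnan(c) for c in p)]
    if len(valid) / len(points) < valid_pattern_ratio_threshold:
        return None
    return valid

nan = float("nan")
detected = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (nan, nan, nan), (0.0, 0.1, 1.0)]

print(filter_and_check(detected, 0.5))  # 3/4 valid -> accepted, 3 points returned
print(filter_and_check(detected, 0.9))  # 3/4 valid -> None (rejected)
```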

Planned features

  • Correction for perspective in determining the blob center.
  • Create services for calibrating with two intensity images and the reprojection matrix.

camera_pose_calibration's People

Contributors

dbolkensteyn, de-vri-es, hgaiser, mihaimorariu, tassos, wilson-ko


camera_pose_calibration's Issues

No output + Unclear Things

I have my pattern fixed on the workstation, and my "world" frame (which is another fixed frame) has its origin at, let's say, my pattern's origin, since I don't have a robot and I don't want a different frame. So my camera and world frames basically have the same origin.

Given this, I execute the following commands:

rosrun tf static_transform_publisher 0.0 0.0 0.0 0.0 0.0 0.0 1.0 world pattern 1000

roslaunch camera camera.launch

roslaunch camera_pose_calibration nodelet.launch

rosservice call /calibrate_topic "tag_frame: 'pattern'
target_frame: 'map'
point_cloud_scale_x: 1.0
point_cloud_scale_y: 1.0
pattern: {pattern_width: 4, pattern_height: 11, pattern_distance: 4.0}"

However, the last command doesn't return anything: no errors, no results, nothing. I checked rostopic echo /detected_pattern and there is nothing there either. Here is how my stereo images look:

[two screenshots of the stereo camera images]

I don't know what's wrong, but it doesn't seem to work.

Apart from that, there are a couple of issues with this package which make the whole process frustrating.

  1. "The right-top corner blob is defined as the origin" — this blob should be marked on the pattern image. Depending on how you look at the pattern, the statement is ambiguous at best.

  2. The chain of commands is missing. What do you run first, and in what order? The static publisher and then the nodelet, or?

  3. "The pose of the camera to the fixed frame will then be returned and optionally published." — the command returns nothing, and nothing is published. There is not a single error message. One spends a lot of time without a result.

  4. "The circle center-to-center distance of the printed pattern." — which unit is the distance in? Centimeters?

Question!

I am so sorry to bother you guys; I am pretty new to the ROS community. My question is: how can I set up the configuration parameters to perform the extrinsic calibration? I already ran both launch files, but I got this message:
[ INFO] [1548793318.831677234]: waitForService: Service [/standalone_nodelet/load_nodelet] has not been advertised, waiting...
[ INFO] [1548793318.852237981]: Initializing nodelet with 4 worker threads.
[ INFO] [1548793318.874800026]: waitForService: Service [/standalone_nodelet/load_nodelet] is now available.

Best regards!!!

target_frame parameter is unused

master branch:

When using calibrateFile, part of the service request is target_frame. This parameter appears to be unused. Here's the output when srv.tag_frame is set to calibration_circles_frame, and srv.target_frame is set to base_link. Notice that base_link doesn't appear.

Have you guys actually tested the package yet? The tf's I'm getting are not even close to aligned right, but that could very well be user error on my end.
[screenshot: unused_target_frame]

Indigo onCalibrateTopic: stuck in infinite loop

I'm getting stuck in this while loop of node.cpp:

    sync.registerCallback(boost::bind(&synchronizationCallback, boost::ref(promise), _1, _2));

    // Run a background spinner for the callback queue.
    ros::AsyncSpinner spinner(1, &queue);
    spinner.start();

    ROS_INFO_STREAM("Done creating background spinner.");

    while (true) {  // <== I get stuck here
        if (!node_handle.ok()) return false;
        if (future.wait_for(boost::chrono::milliseconds(100)) == boost::future_status::ready) break;
    }
    ROS_INFO_STREAM("Done waiting for the future.");

The synchronizationCallback is never called. I think this means the node is not receiving the cloud_topic and image_topic messages, but I've double-checked that the camera_pose_calibration node is subscribed correctly and the camera topics are active. I have a fork of the repo here if you want to see exactly what my service client looks like.

dr_base can not be found

Hi,
I want to calibrate my Kinect for more accuracy in order to build a 3D map. The problem with this package is the dependency 'dr_base'; I cannot find it.

When I try to compile the package, catkin_make gives this error:

Could not find the required component 'dr_base'.

please help

Pattern msg type

I am calibrating a Kinect with this package, but I could not find enough description of how to define a Pattern msg with the distance and size of the asymmetric circles pattern. I am using the A4-sized pattern from this link: https://nerian.com/support/resources/patterns/

Can you help me fix this? Thanks in advance.
