
LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping

License: BSD 3-Clause "New" or "Revised" License


LIO-SAM

A real-time lidar-inertial odometry package. We strongly recommend that users read this document thoroughly and test the package with the provided datasets first. A video demonstration of the method can be found on YouTube.


System architecture


We design a system that maintains two factor graphs and runs up to 10x faster than real time.

  • The factor graph in "mapOptimization.cpp" optimizes the lidar odometry and GPS factors. This factor graph is maintained consistently throughout the whole test.
  • The factor graph in "imuPreintegration.cpp" optimizes the IMU and lidar odometry factors and estimates the IMU bias. This factor graph is reset periodically, which guarantees real-time odometry estimation at the IMU frequency; a minimal sketch of the reset follows.
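For illustration, here is a minimal sketch (assuming the GTSAM API; variable names are illustrative, not the project's exact code) of how such a periodic reset can be done with gtsam's ISAM2, in the spirit of "imuPreintegration.cpp":

#include <gtsam/nonlinear/ISAM2.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>

// Rebuild the short-term factor graph so its size stays bounded and
// updates keep running at IMU rate.
void resetOptimization(gtsam::ISAM2& optimizer,
                       gtsam::NonlinearFactorGraph& graphFactors,
                       gtsam::Values& graphValues)
{
    gtsam::ISAM2Params optParameters;
    optParameters.relinearizeThreshold = 0.1;
    optParameters.relinearizeSkip = 1;
    optimizer = gtsam::ISAM2(optParameters); // fresh optimizer; old factors are dropped

    graphFactors.resize(0); // clear factors queued for the next update
    graphValues.clear();    // clear initial values queued for the next update
}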

Dependency

This is the original ROS1 implementation of LIO-SAM. For a ROS2 implementation see branch ros2.

  • ROS (tested with Kinetic and Melodic; refer to #206 for Noetic)
    sudo apt-get install -y ros-kinetic-navigation
    sudo apt-get install -y ros-kinetic-robot-localization
    sudo apt-get install -y ros-kinetic-robot-state-publisher
    
  • gtsam (Georgia Tech Smoothing and Mapping library)
    sudo add-apt-repository ppa:borglab/gtsam-release-4.0
    sudo apt install libgtsam-dev libgtsam-unstable-dev
    

Install

Use the following commands to download and compile the package.

cd ~/catkin_ws/src
git clone https://github.com/TixiaoShan/LIO-SAM.git
cd ..
catkin_make

Using Docker

Build image (based on ROS1 Kinetic):

docker build -t liosam-kinetic-xenial .

Once you have the image, start a container as follows:

docker run --init -it -d \
  -v /etc/localtime:/etc/localtime:ro \
  -v /etc/timezone:/etc/timezone:ro \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -e DISPLAY=$DISPLAY \
  liosam-kinetic-xenial \
  bash

Prepare lidar data

The user needs to prepare the point cloud data in the correct format for cloud deskewing, which is mainly performed in "imageProjection.cpp". The two requirements are:

  • Provide a point time stamp. LIO-SAM uses IMU data to deskew the point cloud, so the relative time of each point within a scan must be known. The up-to-date Velodyne ROS driver outputs this information directly. Here, we assume the point time channel is called "time". The definition of the point type is located at the top of "imageProjection.cpp", and the "deskewPoint()" function uses this relative time to obtain the transformation of each point relative to the beginning of the scan. When the lidar rotates at 10Hz, the timestamp of a point should vary between 0 and 0.1 seconds. If you are using another lidar sensor, you may need to change the name of this time channel and make sure that it is the relative time within a scan.
  • Provide a point ring number. LIO-SAM uses this information to organize points correctly in a matrix. The ring number indicates which channel of the sensor a point belongs to. The definition of the point type is located at the top of "imageProjection.cpp"; a representative definition is sketched below. The up-to-date Velodyne ROS driver outputs this information directly. Again, if you are using another lidar sensor, you may need to rename this field. Note that only mechanical lidars are currently supported by the package.
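For reference, a PCL point type carrying these two extra channels looks like the following sketch (modeled on the Velodyne-style definition at the top of "imageProjection.cpp"; field names may differ for other drivers):

#include <pcl/point_types.h>

struct VelodynePointXYZIRT
{
    PCL_ADD_POINT4D                  // x, y, z plus padding
    PCL_ADD_INTENSITY;               // intensity channel
    uint16_t ring;                   // laser channel the point belongs to
    float time;                      // relative time within the scan, in seconds
    EIGEN_MAKE_ALIGNED_OPERATOR_NEW
} EIGEN_ALIGN16;

POINT_CLOUD_REGISTER_POINT_STRUCT(VelodynePointXYZIRT,
    (float, x, x)(float, y, y)(float, z, z)(float, intensity, intensity)
    (uint16_t, ring, ring)(float, time, time))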

Prepare IMU data

  • IMU requirement. Like the original LOAM implementation, LIO-SAM only works with a 9-axis IMU, which provides roll, pitch, and yaw estimates. The roll and pitch estimates are mainly used to initialize the system at the correct attitude. The yaw estimate initializes the system at the correct heading when GPS data is used. Theoretically, an initialization procedure like that of VINS-Mono would enable LIO-SAM to work with a 6-axis IMU. (New: liorf has added support for 6-axis IMUs.) The performance of the system largely depends on the quality of the IMU measurements: the higher the IMU data rate, the better the system accuracy. We use a Microstrain 3DM-GX5-25, which outputs data at 500Hz. We recommend using an IMU that provides at least a 200Hz output rate. Note that the internal IMU of the Ouster lidar is a 6-axis IMU.

  • IMU alignment. LIO-SAM transforms IMU raw data from the IMU frame to the lidar frame, which follows the ROS REP-105 convention (x - forward, y - left, z - upward). To make the system function properly, the correct extrinsic transformation needs to be provided in the "params.yaml" file. The reason there are two extrinsics is that my IMU (Microstrain 3DM-GX5-25) reports acceleration and attitude in different coordinate frames. Depending on your IMU manufacturer, the two extrinsics for your IMU may or may not be the same. Using our setup as an example (a code sketch of how the extrinsics are applied follows this list):

    • We need to negate the x and z readings of the acceleration and gyro to transform the IMU data into the lidar frame, which is indicated by "extrinsicRot" in "params.yaml".
    • The transformation of attitude readings might be slightly different. The IMU's attitude measurement q_wb usually denotes the rotation of points from the IMU coordinate system to the world coordinate system (e.g. ENU). However, the algorithm requires q_wl, the rotation from lidar to world. So we need a rotation from lidar to IMU, q_bl, where q_wl = q_wb * q_bl. For convenience, the user only needs to provide q_lb as "extrinsicRPY" in "params.yaml" (the same as "extrinsicRot" if acceleration and attitude are reported in the same frame).
  • IMU debug. It is strongly recommended that the user uncomment the debug lines in "imuHandler()" of "imageProjection.cpp" and test the output of the transformed IMU data. The user can rotate the sensor suite to check whether the readings correspond to the sensor's movement. A YouTube video showing the corrected IMU data is also available.
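For illustration, here is a simplified sketch of how the two extrinsics are applied, in the spirit of "imuConverter()" in "utility.h" (in the real code, extRot and extQRPY are class members parsed from "params.yaml"):

#include <Eigen/Dense>
#include <sensor_msgs/Imu.h>

sensor_msgs::Imu imuConverter(const sensor_msgs::Imu& imu_in,
                              const Eigen::Matrix3d& extRot,     // extrinsicRot
                              const Eigen::Quaterniond& extQRPY) // extrinsicRPY
{
    sensor_msgs::Imu imu_out = imu_in;

    // Rotate linear acceleration into the lidar frame.
    Eigen::Vector3d acc(imu_in.linear_acceleration.x,
                        imu_in.linear_acceleration.y,
                        imu_in.linear_acceleration.z);
    acc = extRot * acc;
    imu_out.linear_acceleration.x = acc.x();
    imu_out.linear_acceleration.y = acc.y();
    imu_out.linear_acceleration.z = acc.z();

    // Rotate angular velocity the same way.
    Eigen::Vector3d gyr(imu_in.angular_velocity.x,
                        imu_in.angular_velocity.y,
                        imu_in.angular_velocity.z);
    gyr = extRot * gyr;
    imu_out.angular_velocity.x = gyr.x();
    imu_out.angular_velocity.y = gyr.y();
    imu_out.angular_velocity.z = gyr.z();

    // Correct the attitude: q_wl = q_wb * q_bl.
    Eigen::Quaterniond q_from(imu_in.orientation.w, imu_in.orientation.x,
                              imu_in.orientation.y, imu_in.orientation.z);
    Eigen::Quaterniond q_final = q_from * extQRPY;
    imu_out.orientation.w = q_final.w();
    imu_out.orientation.x = q_final.x();
    imu_out.orientation.y = q_final.y();
    imu_out.orientation.z = q_final.z();

    return imu_out;
}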


Sample datasets

  • Download some sample datasets to test the functionality of the package. The datasets below are configured to run using the default settings.

  • The datasets below need the parameters to be configured. In these datasets, the point cloud topic is "points_raw" and the IMU topic is "imu_correct", which provides the IMU data in the ROS REP105 standard. Because no IMU transformation is needed for these datasets, the following configurations need to be changed to run them successfully:

    • The "imuTopic" parameter in "config/params.yaml" needs to be set to "imu_correct".
    • The "extrinsicRot" and "extrinsicRPY" in "config/params.yaml" needs to be set as identity matrices.
  • Ouster (OS1-128) dataset. No extrinsics need to be changed for this dataset if you are using the default settings. Please follow the Ouster notes below to configure the package to run with Ouster data. A video of the dataset can be found on YouTube:

  • Livox Horizon dataset. Please refer to the notes section below for parameter changes.

  • KITTI dataset. The extrinsics can be found in the Notes KITTI section below. To generate more bags using other KITTI raw data, you can use the python script provided in "config/doc/kitti2bag".

Run the package

  1. Run the launch file:
     roslaunch lio_sam run.launch
  2. Play existing bag files:
     rosbag play your-bag.bag -r 3

Other notes

  • Loop closure: The loop-closure function here is a proof of concept, directly adapted from the ICP-based implementation in LeGO-LOAM; a sketch of such an ICP check follows below. For a more advanced loop closure implementation, please refer to ScanContext. Set "loopClosureEnableFlag" in "params.yaml" to "true" to test the loop closure function. In Rviz, uncheck "Map (cloud)" and check "Map (global)". This is because the visualized map, "Map (cloud)", is simply a stack of point clouds in Rviz, and their positions will not be updated after pose correction. Because ICP runs quite slowly, it is suggested that the playback speed be set to "-r 1". You can try the Garden dataset for testing.
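For illustration, a minimal sketch of such an ICP-based loop-closure check with PCL (function and variable names are illustrative, not the project's exact code):

#include <pcl/point_types.h>
#include <pcl/registration/icp.h>

using PointType = pcl::PointXYZI;

// Align the current keyframe against a submap around a loop candidate and
// accept the loop only if ICP converges with a low fitness score.
bool icpLoopCheck(const pcl::PointCloud<PointType>::Ptr& currentKeyframe,
                  const pcl::PointCloud<PointType>::Ptr& historySubmap,
                  Eigen::Matrix4f& correction)
{
    pcl::IterativeClosestPoint<PointType, PointType> icp;
    icp.setMaxCorrespondenceDistance(15.0); // generous radius to tolerate drift
    icp.setMaximumIterations(100);
    icp.setInputSource(currentKeyframe);
    icp.setInputTarget(historySubmap);

    pcl::PointCloud<PointType> aligned;
    icp.align(aligned);

    if (icp.hasConverged() == false || icp.getFitnessScore() > 0.3)
        return false;

    correction = icp.getFinalTransformation();
    return true;
}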


  • Using GPS: The park dataset is provided for testing LIO-SAM with GPS data. This dataset was gathered by Yewei Huang. To enable the GPS function, change "gpsTopic" in "params.yaml" to "odometry/gps". In Rviz, uncheck "Map (cloud)" and check "Map (global)". Also check "Odom GPS", which visualizes the GPS odometry. "gpsCovThreshold" can be adjusted to filter out bad GPS readings. "poseCovThreshold" can be used to adjust the frequency of adding GPS factors to the graph. For example, you will notice the trajectory being constantly corrected by GPS when you set "poseCovThreshold" to 1.0. Because of the heavy iSAM optimization, it is recommended that the playback speed be "-r 1".


  • KITTI: Since LIO-SAM needs a high-frequency IMU to function properly, we need to use the KITTI raw data for testing. One unsolved problem is that the intrinsics of the IMU are unknown, which has a large impact on the accuracy of LIO-SAM. Download the provided sample data and make the following changes in "params.yaml":
    • extrinsicTrans: [-8.086759e-01, 3.195559e-01, -7.997231e-01]
    • extrinsicRot: [9.999976e-01, 7.553071e-04, -2.035826e-03, -7.854027e-04, 9.998898e-01, -1.482298e-02, 2.024406e-03, 1.482454e-02, 9.998881e-01]
    • extrinsicRPY: [9.999976e-01, 7.553071e-04, -2.035826e-03, -7.854027e-04, 9.998898e-01, -1.482298e-02, 2.024406e-03, 1.482454e-02, 9.998881e-01]
    • N_SCAN: 64
    • downsampleRate: 2 or 4
    • loopClosureEnableFlag: true or false


  • Ouster lidar: To make LIO-SAM work with an Ouster lidar, some preparation is needed at the hardware and software levels.
    • Hardware:
      • Use an external IMU. LIO-SAM does not work with the internal 6-axis IMU of the Ouster lidar. You need to attach a 9-axis IMU to the lidar and perform data gathering.
      • Configure the driver. Change "timestamp_mode" in your Ouster launch file to "TIME_FROM_PTP_1588" so that the point clouds have ROS-format timestamps.
    • Config:
      • Change "sensor" in "params.yaml" to "ouster".
      • Change "N_SCAN" and "Horizon_SCAN" in "params.yaml" according to your lidar, e.g., N_SCAN=128, Horizon_SCAN=1024.
    • Gen 1 and Gen 2 Ouster: the point coordinate definition may differ between generations. Please refer to Issue #94 for debugging.


  • Livox Horizon lidar: Please note that solid-state lidars have not been extensively tested with LIO-SAM yet. An external IMU is also used here rather than the internal one. Support for such lidars is based on minimal changes to the mechanical-lidar codebase, so other SLAM solutions may offer better implementations. A customized livox_ros_driver needs to be used to publish a point cloud format that LIO-SAM can process. More studies and suggestions are welcome. Please change the following parameters to make LIO-SAM work with a Livox Horizon lidar:
    • sensor: livox
    • N_SCAN: 6
    • Horizon_SCAN: 4000
    • edgeFeatureMinValidNum: 1
    • Use livox_ros_driver for data recording


Service

  • /lio_sam/save_map
    • Save the map as a PCD file.
      rosservice call [service] [resolution] [destination]
    • Example:
      $ rosservice call /lio_sam/save_map 0.2 "/Downloads/LOAM/"

Issues

  • Zigzag or jerking behavior: if your lidar and IMU data formats are consistent with the requirements of LIO-SAM, this problem is likely caused by unsynchronized timestamps between the lidar and IMU data.

  • Jumping up and down: if you start testing your bag file and base_link starts to jump up and down immediately, your IMU extrinsics are likely wrong. For example, the measured gravity acceleration may have a negative value.

  • mapOptimization crash: this is usually caused by GTSAM. Please install the GTSAM version specified in the README. More similar issues can be found here.

  • gps odometry unavailable: this is generally caused by an unavailable transform between the message frame_ids and the robot frame_id (for example, transforms should be available from "imu_frame_id" and "gps_frame_id" to the "base_link" frame). Please read the Robot Localization documentation.

Paper

Thank you for citing LIO-SAM (IROS-2020) if you use any of this code.

@inproceedings{liosam2020shan,
  title={LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping},
  author={Shan, Tixiao and Englot, Brendan and Meyers, Drew and Wang, Wei and Ratti, Carlo and Rus, Daniela},
  booktitle={IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={5135-5142},
  year={2020},
  organization={IEEE}
}

Part of the code is adapted from LeGO-LOAM.

@inproceedings{legoloam2018shan,
  title={LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain},
  author={Shan, Tixiao and Englot, Brendan},
  booktitle={IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={4758-4765},
  year={2018},
  organization={IEEE}
}


Acknowledgement

  • LIO-SAM is based on LOAM (J. Zhang and S. Singh. LOAM: Lidar Odometry and Mapping in Real-time).


lio-sam's Issues

ICP compared with method of point matching against line or surface

I have noticed that in your loop closure you use the ICP method to calculate the transformation, while in the scan2MapOptimization() function you use the method of matching points against lines or surfaces. So I want to ask: can the ICP in the loop closure be replaced by the method used in scan2MapOptimization()?

Some problems occurred during operation

Hello. I used your public data to run your program, and this happened: the z coordinate keeps dropping (see the attached screenshot). What is the problem? Is my configuration file not modified correctly?

[Question] function projectPointCloud in imageProjection.cpp

Hello, I'm studying LIO-SAM and have been reading the code intensively.
My question is below.

Through publishClouds in imageProjection.cpp, cloudInfo, i.e., the deskewed points, is passed to featureExtraction.cpp. At this point, the cloudExtraction function assigns the rangeMat.at(i,j) value to cloudInfo.pointRange, which is the range measured before the point's distortion is removed in the projectPointCloud function. After the range value is assigned, thisPoint is deskewed using deskewPoint.

In featureExtraction.cpp, the smoothness is calculated using the distorted range values (cloudInfo.pointRange), and I want to know why the difference is measured using the distorted range instead of the undistorted range.
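For context, the smoothness computation under discussion looks roughly like this (a sketch in the spirit of "calculateSmoothness()" in "featureExtraction.cpp", where pointRange holds the pre-deskew ranges):

#include <vector>

// Curvature of point i from the ranges of its ten ring neighbors; large
// values indicate edges, small values indicate planar surfaces.
float pointCurvature(const std::vector<float>& pointRange, int i)
{
    float diffRange = pointRange[i-5] + pointRange[i-4]
                    + pointRange[i-3] + pointRange[i-2]
                    + pointRange[i-1] - pointRange[i] * 10
                    + pointRange[i+1] + pointRange[i+2]
                    + pointRange[i+3] + pointRange[i+4]
                    + pointRange[i+5];
    return diffRange * diffRange;
}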

[Enhancement] Document software interfaces

Hi Tixiao,
By means of this issue I propose documenting the software interfaces. Currently the package uses ROS1 as middleware, and most interfaces are topics/services. Tf is also used, I think, and here it would be interesting to describe the structure of the Tf tree.

I would suggest dividing the documentation into:

  • Topics: two tables, one for inputs, one for outputs. Each table has the topic name, frequency, short description, message type.
  • Services: one table, with service name, short description and service type.
  • Tf: tf tree and who listens / broadcasts which transform.

Please let me know what you think about this proposal. If you're ok with it I'll gradually prepare a PR 👍

Best,
Yoshua

Question about Extrinsics

Hi:
Thanks for sharing your work!
I am confused by the extrinsics in params.yaml. The extrinsics represent the transform from lidar to IMU, i.e., they transform measurements in the lidar frame to the IMU frame. However, in utility.h, in the function imuConverter,
acc = extRot * acc; and gyr = extRot * gyr;
I think this is used to transform measurements in the IMU frame to the lidar frame. Is this inconsistent with the definition in the yaml file?
Looking forward to your reply!

[Question] Accuracy and method for LIDAR-IMU extrinsic calibration

Hi Tixiao,
The results from this package look very impressive.

By means of this issue I would like to ask you the following three questions:

  1. What's the expected accuracy of the LIDAR->IMU transformation?

  2. How do you estimate it? Do you use CAD or an online estimation package?

  3. Can this package improve upon an inaccurate calibration over time?

Thank you in advance.

Best,
Yoshua Nava

redundant projection calculations

I noticed that pointRange is computed by pointDistance, which is an expensive operation as it requires computing a square root for each point (and there may be over 2 million points per second with an Ouster OS1-128).

But the raw data from the lidar already has the range, so it is not necessary to recompute it again.

The PointOS1 defined at https://github.com/ouster-lidar/ouster_example/blob/master/ouster_ros/include/ouster_ros/point_os1.h contains a useful range field which contains the raw range reading.

I suggest making the PointType a template parameter for pointDistance, and adding a template specialization for PointOS1 that uses the raw range reading directly without recomputing it (see the sketch below).

Likewise, the horizon angle is redundantly computed by an expensive call to atan2, which is not necessary, because the raw data from the lidar already contains this information in the raw UDP packets, both for Velodyne and Ouster lidars. Unfortunately, using pcl::PointXYZI causes this information to be lost, requiring it to be recomputed. For Ouster lidars specifically, the horizon angle does not change between frames, and the driver always outputs all the points (even points without valid range readings), so it is possible to compute the horizon angle from the index of the point. However, the angle is not stored in PointOS1, so it may be harder to get this data from the raw lidar stream. Then again, atan2 is much slower than sqrt, so it may be worth it.

atan2 typically costs around 100 cycles, so high-resolution lidars producing 2.6 million points per second (e.g., an Ouster OS1-128 in 2048 mode) will start to see a significant performance penalty.
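A sketch of the suggested change (PointOS1 and its raw millimeter range field are taken from the linked ouster_ros header; the generic version mirrors the pointDistance helper in "utility.h"):

#include <cmath>
#include <ouster_ros/point_os1.h>

// Generic version: compute the range with a square root, as today.
template <typename PointT>
float pointDistance(const PointT& p)
{
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
}

// Specialization for Ouster points: reuse the sensor's raw range reading
// instead of recomputing it.
template <>
float pointDistance<ouster_ros::OS1::PointOS1>(const ouster_ros::OS1::PointOS1& p)
{
    return p.range * 1e-3f; // raw range is in millimeters
}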

How to align map with Google Earth

Hi:
Thank you for sharing your work!
I've noticed that in your paper you align the map with Google Earth, as in the attached figure. Could you tell me how you do that?

[Enhancement] Move class declarations to headers and add doxygen strings

Dear Tixiao,
This work is really awesome. Thank you for sharing it with the community.

By means of this issue I would like to suggest introducing headers for each class, and adding doxygen strings to the methods.

Please let me know what you think. I would be happy to submit pull requests for one module at a time, if possible with no changes to the underlying functionality.

Best,
Yoshua

[Enhancement] Decouple class definition from ROS nodes execution

Currently class definitions, ROS node instantiation and execution are embedded in the same source files.

This issue proposes decoupling class definition from node instantiation.

I propose a convention where we name classes in CamelCase (.h/.cpp), and nodes in lower case with the suffix "node" at the end (e.g. camel_case_node.cpp).

IMU rate for calculating discrete-time noise parameter

Hello, Dr. Shan. Excellent work! With respect to IMU preintegration, you use gtsam's implementation. My question is: where can we provide the IMU rate for calculating the discrete-time noise parameters? I didn't find an IMU rate setup in your code. I assume the IMU noise parameters in params.yaml are in continuous time. Am I right? Thank you very much.
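For context, here is a sketch (assuming the GTSAM API; variable names are illustrative) of how continuous-time noise densities are typically handed to gtsam. The sampling interval enters through the dt argument of integrateMeasurement with every IMU message, so no explicit IMU-rate parameter is needed:

#include <gtsam/navigation/ImuFactor.h>
#include <gtsam/navigation/ImuBias.h>
#include <cmath>

void setupPreintegration(double imuAccNoise, double imuGyrNoise)
{
    // Continuous-time noise densities, as in params.yaml.
    boost::shared_ptr<gtsam::PreintegrationParams> p =
        gtsam::PreintegrationParams::MakeSharedU(9.80511); // gravity along -z
    p->accelerometerCovariance = gtsam::Matrix33::Identity(3, 3) * std::pow(imuAccNoise, 2);
    p->gyroscopeCovariance     = gtsam::Matrix33::Identity(3, 3) * std::pow(imuGyrNoise, 2);
    p->integrationCovariance   = gtsam::Matrix33::Identity(3, 3) * std::pow(1e-4, 2);

    gtsam::PreintegratedImuMeasurements imuIntegrator(p, gtsam::imuBias::ConstantBias());

    // Each sample is integrated with its own time step, which is where the
    // discrete-time scaling happens inside gtsam:
    // imuIntegrator.integrateMeasurement(acc, gyr, dt);
}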

kitti2bag.py doesn't add timestamps to pointcloud data

I tried LIO-SAM on the 2011_09_30_drive_0016 (1.1 GB) KITTI raw data. Using the generated rosbag as input, I got the following warning: "Point cloud timestamp not available, deskew function disabled, system will drift significantly!".

Questions about the initialization

Dear Tixiao,
This work is really awesome. Thank you for sharing it !
I have noticed that LIO-SAM performs great on the rotation dataset, while LIOM is not able to initialize properly on it. So I want to ask what kind of initialization method LIO-SAM uses and how it differs from LIOM's.
Looking forward to your answer~

Question about Parameter extrinsicTrans and transformation betwen imu and lidar

Hi, Tixiao.
From the picture of the lidar and IMU coordinates in the readme, the translation between the origins of the lidar and IMU is a few centimeters. My first question is why you set [0,0,0] for the parameter extrinsicTrans.
My second question is about the formula that transforms the IMU pose to the lidar pose. In imuPreintegration.cpp, lines 74-75:
gtsam::Pose3 lidar2Imu = gtsam::Pose3(gtsam::Rot3(1, 0, 0, 0), gtsam::Point3(extTrans.x(), extTrans.y(), extTrans.z()));
In your code, you first use imuConverter() to align the axes of the two coordinate frames, and then use lidar2Imu or imu2Lidar. I use the following formula to transform the IMU pose to the lidar pose, where the rotation matrix of T_imu_lidar is the identity matrix:
T_lidar0_lidarn = T_lidar_imu * T_imu0_imun * T_imu_lidar
And this is different from the imu2Lidar in your code. Please give me some advice! Thank you.

How to save map?

Hi, I edited params.yaml and changed savePCD: true, but it doesn't work.
The code:

ros::Rate rate(0.2);
while (ros::ok()){
    rate.sleep();
    publishGlobalMap();
}

    if (savePCD == false)
        return;

When will ros::ok() shut down?
Thanks!

Transform from lidar frame to base_link is not taken into account?

I noticed that a base_link with an offset from the lidar frame leads to erroneous behavior. In that case, the pose of the lidar is not tracked correctly. I first noticed this because the deskewed point cloud and the original point cloud don't match up. While the deskewed cloud remains in place, the original cloud just moved all over the place.

From what I can tell this happens because the transform from the base_link to the lidar is never taken into account. As a result, the pose of the base_link is tracked when the pose of the lidar frame should be tracked instead. Basically: the base_link takes the path that the lidar frame should take.

I hope it became clear what I mean. The error should be easy to reproduce by adding a large offset between the base_link and the lidar frame. It should be immediately apparent when looking at the frames and point clouds in rviz.

poseCovariance

Shouldn't the covariance of the pose in the mapping odometry increase constantly? I noticed that the poseCovariance sometimes just fluctuates.

Some questions about the 9-axis IMU

Hi @TixiaoShan, thank you for your great work! I don't know why you have to use a 9-axis IMU; the magnetometer in the IMU is not used in your code. Can you tell me the reason for using a 9-axis IMU instead of a 6-axis IMU?
Thank you very much!

updateInitialGuess()

Hi @TixiaoShan,
To get the current scan's initial pose in the map, I debugged this code in the updateInitialGuess() function in mapOptmization.cpp:

 if (cloudInfo.imuAvailable == true) 
    {   
       // this line was added by me for debugging
        ROS_WARN("odom = %d, cloud = %d, self = %d ", cloudInfo.odomAvailable, cloudInfo.imuPreintegrationResetId , imuPreintegrationResetId);

        Eigen::Affine3f transBack = pcl::getTransformation(0, 0, 0, cloudInfo.imuRollInit, cloudInfo.imuPitchInit, cloudInfo.imuYawInit);
        Eigen::Affine3f transIncre = lastImuTransformation.inverse() * transBack;

        Eigen::Affine3f transTobe = trans2Affine3f(transformTobeMapped);
        Eigen::Affine3f transFinal = transTobe * transIncre;
        pcl::getTranslationAndEulerAngles(transFinal, transformTobeMapped[3], transformTobeMapped[4], transformTobeMapped[5], 
                                                      transformTobeMapped[0], transformTobeMapped[1], transformTobeMapped[2]);

        lastImuTransformation = pcl::getTransformation(0, 0, 0, cloudInfo.imuRollInit, cloudInfo.imuPitchInit, cloudInfo.imuYawInit); // save imu before return;
        return;
    }

The output is:

process[lio_sam_rviz-8]: started with pid [14288]
[ INFO] [1593869689.438855893]: ----> IMU Preintegration Started.
[ INFO] [1593869689.471981403]: ----> Image Projection Started.
[ INFO] [1593869689.498746662]: ----> Feature Extraction Started.
[ INFO] [1593869689.507955480]: ----> Map Optimization Started.
[ WARN] [1593869693.421098667]: odom = 0, cloud = 0, self = 0 
[ WARN] [1593869693.626720210]: odom = 0, cloud = 0, self = 0 
[ WARN] [1593869869.932765664]: odom = 1, cloud = 0, self = 1 
[ WARN] [1593869870.131129145]: odom = 0, cloud = 0, self = 1 
[ WARN] [1593869874.965915153]: odom = 1, cloud = 1, self = 2 
[ WARN] [1593869875.179717063]: odom = 0, cloud = 1, self = 2 
[ WARN] [1593869875.376471333]: odom = 0, cloud = 1, self = 2 
[ WARN] [1593869905.031495240]: odom = 1, cloud = 2, self = 3 
[ WARN] [1593869905.231531192]: odom = 0, cloud = 2, self = 3 
[ WARN] [1593869909.896946401]: odom = 1, cloud = 3, self = 4 
[ WARN] [1593869910.136724098]: odom = 1, cloud = 3, self = 4 
[ WARN] [1593869910.275975347]: odom = 0, cloud = 3, self = 4 
[ WARN] [1593869919.954947659]: odom = 1, cloud = 4, self = 5 
[ WARN] [1593869920.157889474]: odom = 0, cloud = 4, self = 5 
[ WARN] [1593869925.202043337]: odom = 1, cloud = 5, self = 6 
[ WARN] [1593869925.428638865]: odom = 0, cloud = 5, self = 6 
[ WARN] [1593869930.039206332]: odom = 1, cloud = 6, self = 7 
[ WARN] [1593869930.244444627]: odom = 1, cloud = 6, self = 7 
[ WARN] [1593869935.081837640]: odom = 1, cloud = 7, self = 8 
[ WARN] [1593869935.286080622]: odom = 0, cloud = 7, self = 8 
[ WARN] [1593869944.967022261]: odom = 1, cloud = 8, self = 9 
[ WARN] [1593869945.161093601]: odom = 0, cloud = 8, self = 9 
[ WARN] [1593869950.010743418]: odom = 1, cloud = 9, self = 10 
[ WARN] [1593869950.215691434]: odom = 1, cloud = 9, self = 10 
[ WARN] [1593870259.647486738]: odom = 0, cloud = 10, self = 10 
[ WARN] [1593870325.011413956]: odom = 1, cloud = 10, self = 11 
[ WARN] [1593870325.214021085]: odom = 0, cloud = 10, self = 11 
[ WARN] [1593870329.867247998]: odom = 1, cloud = 11, self = 12 
[ WARN] [1593870330.052590294]: odom = 0, cloud = 11, self = 12 
[ WARN] [1593870330.254257450]: odom = 0, cloud = 11, self = 12 
[ WARN] [1593870335.117750271]: odom = 1, cloud = 12, self = 13 
[ WARN] [1593870335.295067667]: odom = 0, cloud = 12, self = 13 
[ WARN] [1593870335.505050773]: odom = 0, cloud = 12, self = 13 
[ WARN] [1593870344.980313780]: odom = 1, cloud = 13, self = 14 
[ WARN] [1593870345.181445314]: odom = 0, cloud = 13, self = 14 
[ WARN] [1593870345.382959890]: odom = 0, cloud = 13, self = 14 
^C[lio_sam_rviz-8] killing on exit
[navsat-7] killing on exit
[ekf_gps-6] killing on exit
[lio_sam_mapOptmization-5] killing on exit
[lio_sam_featureExtraction-4] killing on exit
[lio_sam_imageProjection-3] killing on exit
[lio_sam_imuPreintegration-2] killing on exit
****************************************************
Saving map to pcd files ...
Processing feature cloud 956 of 957 ...****************************************************
Saving map to pcd files completed
[rosout-1] killing on exit
[master] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done

cloudInfo.odomAvailable is not always true in each message after initialization. Is this normal?
Thanks for your help!

[Enhancement] Add linter

Hi Tixiao,
By means of this issue I want to suggest adopting clang-format and clang-tidy to:

  1. Keep the code well-formatted, accelerate contributions, and reduce time spent discussing formatting.
  2. Discover possible issues and find ways to modernize the code thanks to clang's code analysis.

Please let me know what you think.

Best,
Yoshua

extrinsicRPY meaning in readme

Hi @TixiaoShan, thanks for your excellent work!
When reading about extrinsicRPY in the doc and the imuConverter() function, I don't understand this parameter's meaning.
The IMU's orientation is q_b0_bk, where b0 is the IMU's world frame and bk is the IMU frame at timestamp k.
extrinsicRot is q_l2b, different from extrinsicRPY. What is the meaning of q_b0_bk * extQRPY?
It isn't q_b0_lk = q_b0_bk * q_l2b.transpose(), nor q_l0_lk = q_l2b * q_b0_bk * q_l2b.transpose(), right?

How to use GPS for evaluation

Hi,
I want to use the GPS provided with your datasets as the ground truth for evaluation, but its coordinate system and frequency are different from the system output. Could you tell me how you solved this? Thank you very much!

Why use two factor graphs?

Hi!
I have read through your paper and code, and I found that there are two factor graphs in the project, so several questions have arisen:
Q1. The result of IMU preintegration seems to be used only to provide an initial estimate for map optimization, and is not even used to deskew the point cloud when the sensor moves slowly, right?
Q2. Why not combine the IMU preintegration factor with the factors in map optimization, as VINS-Mono does (i.e., use all the constraints in one joint optimization)?

I also have a question about Fig. 1 in your paper:
Q3. In the figure, the robot state node is associated with a keyframe, and the IMU factor provides a constraint between two successive nodes. However, in the code, the IMU factor is added when receiving the pose of a lidar frame (rather than a keyframe). Does the figure not match the code, or am I misunderstanding?

Looking forward to your reply!
Thank you!

[Enhancement] Remove default BUILD_TYPE flag

Currently the package has a fixed build_type flag:

set(CMAKE_BUILD_TYPE "Release")
set(CMAKE_CXX_FLAGS "-std=c++11")
set(CMAKE_CXX_FLAGS_RELEASE "-O3 -Wall -g -pthread")

This is not ideal. Release with -O3 enables aggressive optimizations, which is great for real-time operation, but it doesn't allow external users to build the package with other flags (for example, 'RelWithDebInfo', which combines debug symbols with release optimizations).

I propose removing the release flag and instead documenting in the wiki that, for optimal performance, the package should be built in release mode.

Please let me know what you think about this proposal :)

Best,
Yoshua

How about the performance of a low-cost IMU?

Thanks for your great work. It's really awesome.
I have googled that the 3DM-GX5-25 costs about $1600, which makes it a high-performance IMU. So I'm curious about the importance of inertial measurement accuracy in your algorithm: will a consumer-grade IMU do the same job?

Running the big-loop.bag dataset gives different results

Hi @TixiaoShan, I ran the big-loop.bag dataset and the loop closure results differ between runs: sometimes they are good, sometimes bad (three times out of ten the results were good). Did I miss some key steps? Thanks!

The results are shown in the attached screenshots.

What is the reason for the two extrinsic rotations?

First, I really appreciate your code! It is really helpful for understanding recent lidar-based SLAM.

My question is: what is the reason for the two extrinsic rotations, extrinsicRot and extrinsicRPY?
extrinsicRPY seems to be used for the orientation, and the transformed rotation is used in initialization.
Can we just use extrinsicRot (which is used to transform IMU data into the lidar frame) for the whole process?

Extrinsic Calibration

Hi,

Thanks again for your great work.
Can you explain how you obtained the extrinsic parameters between the lidar and IMU?
I need to obtain the extrinsics between the lidar and IMU for a ground vehicle such as a car. The lidar is placed on top of the car and the IMU is placed in the trunk. It is not possible to move the car around all three axes and along all three axes (the motion is approximately planar), so hand-eye calibration fails for the height between the lidar and IMU. Can you recommend any tool or code that provides extrinsic calibration between an IMU and lidar?

Thanks in advance.

Evaluation using KITTI Vision Benchmark

Hi,

Thank you for sharing your work.
Is it possible to evaluate LIO-SAM on the KITTI Odometry Dataset? Can you explain how to make it possible?

Thanks in advance.

Performance when sensors are relatively fast

Hello!
Thank you for your great work.

I was looking at the LIO-SAM code, and the following commented-out section caught my attention.

https://github.com/TixiaoShan/LIO-SAM/blob/master/src/imageProjection.cpp#L450-L459

        // If the sensor moves relatively slow, like walking speed, positional deskew seems to have little benefits. Thus code below is commented.

        // if (cloudInfo.odomAvailable == false || odomDeskewFlag == false)
        //     return;

        // float ratio = relTime / (timeScanNext - timeScanCur);

        // *posXCur = ratio * odomIncreX;
        // *posYCur = ratio * odomIncreY;
        // *posZCur = ratio * odomIncreZ;

What were the results when the sensors moved relatively fast and you uncommented the above?
Also, is there a dataset that I can try?

Erroneous results for the Campus dataset (large)

Hi @TixiaoShan, thanks for your excellent work!
I have some questions about the Campus dataset (large). I changed imuTopic to imu_correct and set extrinsicRot and extrinsicRPY to identity matrices in config/params.yaml.
The rosbag plays at normal speed: rosbag play big_loop.bag
But the final result is not as expected: compared with the demo result in the LIO-SAM paper, there is a big shift in the global map (the red and black boxes in my screenshots mark the differing areas).
How can I get the correct result?
Thanks very much!

Some questions about the euler angle integration

Hi @TixiaoShan, thanks for your excellent work!
In the imuDeskewInfo() function in imageProjection.cpp, the euler integration is written as:

            // get angular velocity
            double angular_x, angular_y, angular_z;
            imuAngular2rosAngular(&thisImuMsg, &angular_x, &angular_y, &angular_z);

            // integrate rotation
            double timeDiff = currentImuTime - imuTime[imuPointerCur-1];
            imuRotX[imuPointerCur] = imuRotX[imuPointerCur-1] + angular_x * timeDiff;
            imuRotY[imuPointerCur] = imuRotY[imuPointerCur-1] + angular_y * timeDiff;
            imuRotZ[imuPointerCur] = imuRotZ[imuPointerCur-1] + angular_z * timeDiff;
            imuTime[imuPointerCur] = currentImuTime;

angular_x, angular_y, and angular_z are the angular velocities from the IMU message.
We obtain the integrated angles from the last angles plus the current angular velocity times the time difference.
But euler angles can't simply be added like this.
Would it be better to write it using quaternions? Quaternions for two rotations can be multiplied directly, such as:
The quaternion for two rotation could be multiply directly, such as:

            // get angular velocity
            double angular_x, angular_y, angular_z;
            imuAngular2rosAngular(&thisImuMsg, &angular_x, &angular_y, &angular_z);

            // integrate rotation
            double timeDiff = currentImuTime - imuTime[imuPointerCur-1];
            
            // using quaternion
            tf::Quaternion last_orientation;
            last_orientation = tf::createQuaternionFromRPY(imuRotX[imuPointerCur-1], imuRotY[imuPointerCur-1], imuRotZ[imuPointerCur-1]);
            tf::Quaternion delta_orientation;
            delta_orientation = tf::createQuaternionFromRPY(angular_x * timeDiff, angular_y * timeDiff, angular_z * timeDiff);
            tf::Quaternion curr_orientation = last_orientation * delta_orientation;
            double imuRoll, imuPitch, imuYaw;
            tf::Matrix3x3(curr_orientation).getRPY(imuRoll, imuPitch, imuYaw);
            imuRotX[imuPointerCur] = imuRoll;
            imuRotY[imuPointerCur] = imuPitch;
            imuRotZ[imuPointerCur] = imuYaw;
            imuTime[imuPointerCur] = currentImuTime;

Problems with other datasets

Thank you for your great work! Since LIO-SAM and LIO-mapping both use lidar and IMU data for SLAM, I tested your code with the dataset (slow2.bag) provided by LIO-mapping, and tested LIO-mapping with your dataset (casual_walk.bag). Though I modified the extrinsic rotation and translation, both tests failed (generating meaningless maps). What's the problem? Is it because your IMU data is used in a different way?

Base_link problem?

Thanks for your great work. It's amazing.
I used the original parameters in your params.yaml and ran the park.bag file. Rviz always shows an error on base_link, as in figure 1, and the ROS system shows the error message in figure 2 (screenshots attached). Do you know in which step I have a problem?

Bless you, have a good day.

[Enhancement] Silence warnings from external dependencies

Currently, the cmake file specifies include directories as follows:

include_directories(
  include
  ${catkin_INCLUDE_DIRS}
  ${PCL_INCLUDE_DIRS}
  ${OpenCV_INCLUDE_DIRS}
  ${GTSAM_INCLUDE_DIR}
)

However, there is no distinction between internal and external include directories (e.g. PCL, OpenCV). I propose that we add CMake's SYSTEM flag to include_directories so that the compiler only outputs warnings for internal source files.

Example:

include_directories(
  include
  SYSTEM
    ${catkin_INCLUDE_DIRS}
    ${PCL_INCLUDE_DIRS}
    ${OpenCV_INCLUDE_DIRS}
    ${GTSAM_INCLUDE_DIR}
)

poseCovariance not growing

Shouldn't the covariance of the odometry be growing? I have noticed that the covariance of the odometry in the code doesn't grow; it just fluctuates.

Initial rotation between GPS odometry and Optimized path

This is somewhat related to #22, which is closed now; therefore I am asking here.

I tried the package on the Campus dataset (small).

When not using GPS, the package performs very well.
Some results are attached as screenshots.

On comparing the GPS odometry with the optimized path, both look quite similar. The initial rotation seems to be the only difference between the two.

But when using GPS, there are some serious artifacts in the output PCD.
Some results are attached as screenshots.

Now I have the following questions:

  1. What causes the initial rotation misalignment between the GPS odometry and the optimized path?
  2. Can we create a correct (least-artifact) global PCD map with LIO-SAM?
  3. Is there any method for evaluating LIO-SAM? (I tried evaluating with KITTI ground-truth sequences, but the elevation data was inconsistent; see #35. This can be misleading, though, since we don't have correct IMU parameters for KITTI.)
  4. How well does LIO-SAM support car-like vehicles?

Using GPS with KITTI dataset

I am facing problems while setting up LIO-SAM with GPS for the KITTI dataset. I am getting a lot of sudden jumps in the output (see the attached gps_issue.png).

The package works fairly well without GPS, so I think it's an issue with improper configuration. I seek some help regarding the same.

About the transform from imu to lidar for other datasets

Thank you for your work again! When I use the dataset in https://github.com/TixiaoShan/Stevens-VLP16-Dataset or my own dataset, the map ends up in a chaotic state shortly after the beginning, as the attached screenshot from the bag (2018-0518-14-49-12_0.bag) shows.

I have checked the output of the odometry and found that the data is wrong. For my own data, I know the relative position between the IMU and lidar, but after modifying extrinsicRot and extrinsicRPY following the introduction in the readme, I still cannot make LIO-SAM run properly. What does the following sentence mean? "The transformation of attitude readings is slightly different. We rotate the attitude measurements by -90 degrees around the lidar z-axis and get the corresponding roll, pitch, and yaw readings in the lidar frame."
