
rpng / mins

An efficient and robust multisensor-aided inertial navigation system with online calibration that is capable of fusing IMU, camera, LiDAR, GPS/GNSS, and wheel sensors. Use cases: VINS/VIO, GPS-INS, LINS/LIO, multi-sensor fusion for localization and mapping (SLAM). This repository also provides multi-sensor simulation and data.

License: GNU General Public License v3.0

CMake 1.07% C++ 98.21% Shell 0.72%

mins's People

Contributors: lnexenl, woosiklee2510


mins's Issues

Use MINS to localize on a pre-built point cloud map

Thanks for your great work.

I would like to use MINS for localization against a pre-built point cloud map. In the original method of MINS, the map frame is anchored to the latest IMU frame, which causes the map frame to be affected by the state estimation. Therefore, my idea is not to transform the point cloud map to the clone pose in real time, so the frame chain would look like:
map -- C -- C -- I

As a beginner, I am wondering whether the above idea is feasible, and whether you have any suggestions?

IMU Coordinate and Camera coordinate

I want to use my own dataset to test the MINS system. After checking the OpenVINS project, I am still confused about the IMU and camera coordinate frames. Could you please tell me which IMU coordinate convention this system uses?

What I understand:

camera:
[
Z: forward
X: left
Y: down
]

IMU [
X:?
Y: ?
Z: ?
]
What I need to do is compute the camera-to-IMU rotation matrix.

Right now I don't know how to define the IMU coordinate frame.

Adapting a Livox lidar

Thanks for your wonderful work. I am adapting a Livox lidar whose per-point times are relative to the first point's time, and I use the first lidar point's time as the header time, but the trajectory drifts even though the point cloud is undistorted. What should I do to adapt a Livox lidar?

Can MINS work on a drone?

Hi!

I read the MINS paper and saw that the experiments were done with ground vehicles. I would assume it is usable on a drone, but I wanted to confirm this from a practical point of view. Would there be specific assumptions for MINS to be used on, e.g., a multi-rotor UAV?

Thanks.

Lidar Undistortion

Hi, I currently run MINS on some private datasets where a car drives in a figure eight, which requires a high-precision lidar undistortion function.

However, I find that the lidar undistortion function requires v_angles and h_angles to undistort the point cloud, and it also assumes that the time differences between consecutive points are the same:
success = state->get_interpolated_pose(lidar_inL->time + dt - (total - i) * op->raw_point_dt, RGtoIi, pIiinG);

I think the lidar undistortion function could be made more general by reading each point's timestamp directly from the ROS message. Would it be possible to introduce this as a new feature in future work?

Project Consulting

Thank you for your great work. What I want to ask is: what is the relationship between this project and MIMC-VINS?

Questions about LiDAR noise and consistency

Hi, thank you for open-sourcing this comprehensive sensor fusion system. I have some questions regarding the LiDARs.

In the LiDAR odometry part, you use a direct scan-to-submap registration as the residual, similar to FAST-LIO2. I am quite curious about the map_noise parameter used in simulation. If I understand correctly, this parameter is used to whiten the point-to-plane residual of the neighbor points found on the local map. However, the neighbor points are only selected if they all pass a plane sanity check, i.e., their distance to the fitted plane must be smaller than plane_max_p2pd, and this is a hard constraint. Yet map_noise (the uncertainty we claim the residual has) is 5x plane_max_p2pd (which bounds the actual uncertainty), so it looks like a manual inflation of the point-to-plane residual uncertainty, which should(?) lead to a smaller NEES (conservative).

So, my questions are the following:

  1. Why is the map_noise parameter much larger in simulation than in real experiments, given that they use the same sanity check threshold plane_max_p2pd, and the simulation has a much smaller LiDAR point measurement noise raw_noise?
  2. It seems that the three simulation scenes mentioned in the paper are used for three different purposes, and UD small is used for consistency verification. I'm wondering how consistent the system is on longer or more complex datasets (e.g., "C" shaped corridor)? Because UD small lasts only 60 seconds, and the whole scene is visible to the LiDARs throughout the experiment (i.e., local map never decays).
  3. How consistent is the system under different LiDAR noise levels? For example, raw_noise at level 2cm, 3cm, or 10cm as used in the real world experiments. How to change other parameters accordingly when this raw_noise changes?

The table below is excerpted from the configuration files of MINS:

Parameter             KAIST / KAIST LC   KAIST L   Simulation
raw_downsample_size   2.0                1.0       0.3
raw_noise             0.1                0.1       0.01
map_downsample_size   0.5                0.3       0.3
map_noise             0.1                0.1       0.5
plane_max_p2pd        0.1                0.1       0.1
map_decay_time        30                 9999      120
map_decay_dist        30                 100       100

I hope you can clear up my questions. Thank you!

double free or corruption (out)

roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=true
When I run the above command, the error shown below appears:
double free or corruption (out)

Can you share the Trail dataset?

Thank you for your great work.
We have carefully read your paper.
We want to conduct research on your Trail dataset. Could you share it?

How to implement this with my own hardware platform?

Hi,
Thanks for the great work. I am wondering if I can use the MINS algorithm with hardware like an Ouster lidar, a stereo camera, etc.? Is there any detailed documentation to follow?
Thanks in advance!

use `run_bag` but very slow

When I run rosbag.launch, the program runs very slowly; it seems to only run at real-time speed.
Did I set some parameters wrong?
I only use a camera and IMU, with MSCKF points set to 40 and SLAM points set to 50.

[Question] The formula in UpdaterWheel::preintegration_3D

I don't understand why p0_dot = R_Gto0.transpose() * v_hat; and not p0_dot = R_Gto0 * v_hat;

  // k1 ================
  Vector4d dq_0 = {0, 0, 0, 1};
  Vector4d q0_dot = 0.5 * Omega(w_hat) * dq_0;
  Matrix3d R_Gto0 = quat_2_Rot(quat_multiply(dq_0, q_local));
  Vector3d p0_dot = R_Gto0.transpose() * v_hat;

Is it because the quaternion is in the JPL convention?

How does MINS deal with multi-threading?

In the code, each sensor callback uses a "try to update" function to update the system state, and they all call StateHelper::EKFUpdate in the end.
My question is: what if two sensor measurements arrive together and call EKFUpdate at the same time?
Could this lead to a core dump or memory corruption?

Test on Kaist dataset?

@WoosikLee2510 Dear author, may I ask why my tests on the KAIST urban dataset urban18.bag come out layered, or misaligned (such as the light pole in the figure)? I am testing with the default kaist_LC config.
[screenshot]
Also, I noticed that at the beginning the point cloud map was badly misaligned, seemingly rotated by 90 degrees.
[screenshot]

Do you have any advice?
Thanks!

[KAIST] Process died when use_imu_res is disabled.

Hi, Woosik. Really appreciate this great work!

I am currently running the KAIST urban 30 dataset. Everything works fine when using subscribe.launch with the default kaist/kaist_LC config.

However, when I try to run with polynomial-estimated residuals
(mins/config/kaist/kaist_LC/config_estimator.yaml -> use_imu_res: false),
the [mins_subscribe-2] process dies on startup.
[screenshot]

Could you take a look at this problem? Much appreciated.

Random crash of program

Hi, I find that MINS works really well, but sometimes it crashes my system, causing a reboot of my PC (no matter where I execute the program: inside or outside a docker container). Have you ever encountered such a problem? I am curious whether it is a common problem or only happens on my PC. I suspect it might be caused by an invalid memory access.

Question about the feature representation

Dear MINS developer,

First of all, thanks for sharing your awesome work. According to the report Visual-Inertial Odometry on Resource-Constrained Systems, which compares different feature representations, the AHP and IDP parameterizations perform significantly better than the XYZ parameterization.

Currently, MINS only supports GLOBAL_3D and GLOBAL_FULL_INVERSE_DEPTH, which I suppose correspond to the XYZ parameterization in the paper. So my question is: why does MINS only support global representations? Would it be possible to integrate all the feature representations from OpenVINS into MINS to improve tracking stability? Thanks in advance.

Initialization needs the platform to move, and sometimes fails to init (drift)

Hi MINS maintainers,

First of all, thank you for the great work and for sharing it with the public community. I would like to apply MINS to a mobile robot application using a stereo camera + IMU (Intel D435i), but there are some issues I want to ask about:

  1. Is there any way to initialize the system while staying in one place? I assume VIO needs the platform to move to recover scale (with a monocular camera), but with stereo cameras, could initialization work without moving?
  2. Sometimes the initialization fails and the pose drifts. How can I prevent drifting?
  3. I noticed that the CPU usage of MINS is much higher than that of OpenVINS (on my PC, 150% CPU compared to 50% for OpenVINS). What causes the higher CPU usage?

RViz configuration settings for simulation.launch, rosbag.launch, and subscribe.launch

@WoosikLee2510 @goldbattle @ghuangud @yangyulin @huaizheng @saimouli

After launching simulation.launch, rosbag.launch, or subscribe.launch, I am unable to get a proper visualization in RViz: setting the TF coordinate frames (i.e., the Fixed Frame setting in RViz) is confusing, tricky, and time-consuming. Could you please provide an RViz configuration for any of the above launch files so that I can launch it together with the appropriate settings?

process[mins simulation0-1]: started with pid [7471]

Excuse me, I am running "roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=true". What is the cause of the error below, and how can I fix it?

[screenshot]

I am also running "roslaunch mins serial.launch config:=kaist/kaist_LC path_gt:=urban30.txt path_bag:=urban30.bag" with my own KAIST rosbag. Can the bag contain only the following topics? What are the topics of your bag?

[screenshot]

Thank you very much.

Tw, Tg, Ta, R_IMUtoGYRO fail to appear

In OpenVINS, the IMU parameters include transformation matrices for the angular velocity and acceleration. In MINS these parameters are not in the yaml file; don't you need to worry about them?

Formula does not make sense

in

bool I_Initializer::initialization(Matrix<double, 17, 1> &imustate) {

...

  Vector3d z_axis = a_avg_2to1 / a_avg_2to1.norm();

  // Create an x_axis
  Vector3d e_1(1, 0, 0);

  // Make x_axis perpendicular to z
  Vector3d x_axis = e_1 - z_axis * z_axis.transpose() * e_1;
  x_axis = x_axis / x_axis.norm();

...

Why is x_axis perpendicular to z_axis? x_axis.dot(z_axis) is not equal to zero.

[Question] Question about the Jacobian `Phi_tr` in UpdaterWheel::preintegration_3D

in this line:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(R_3D.transpose() * (new_p - p_3D));

I think new_p is equal to (p_3D + R_3D.transpose()*v*dt), so the line becomes:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(R_3D.transpose() * (R_3D.transpose()*v*dt));

But it seems that the final result should be:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(v*dt);

So I think the original line should be the following code:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(R_3D * (new_p - p_3D));

i.e., just remove the .transpose().
