
Comments (13)

mintar commented on August 18, 2024

Where do you see that assumption? I've tested it in a lot of different initial orientations without problems. The only thing that's buggy is the initialization code, with all the Euler angles and stuff, but it's supposed to work for all initial orientations. Also see #36 .


paulbovbel commented on August 18, 2024

It does work for all starting orientations of an ENU IMU, where the acceleration vector points along +ve z when the filter should output (0, 0, 0) for orientation.

For an NED IMU, when the sensor is oriented at (0, 0, 0), acceleration points along -ve z, since +ve z points towards the ground. However, imu_filter_madgwick determines that the sensor is upside down, because it does not know that imu_link is not ENU-aligned.

I've confirmed this with our NED Microstrain IMU: processing the sensor's acceleration and gyro through imu_filter_madgwick when the sensor is upright yields a (pi, 0, 0) orientation, whereas the IMU's internal EKF outputs (0, 0, 0), because it is an NED sensor.

So in essence, adding an 'imu_reference_frame' that IS ENU-aligned will allow the filter to transform incoming data into ENU, specifically so that acceleration points along +ve z when the sensor is at (0, 0, 0). This will also correct the issue of unorthodox mounting positions present on some of our robots, where an ENU IMU is mounted sideways.
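
For concreteness, the axis re-mapping being discussed looks roughly like this (a minimal sketch only; the helper name is made up, and orientation/covariance handling is deliberately left out):

    #include <sensor_msgs/Imu.h>

    // Minimal sketch: re-express NED-convention vector readings in ENU axes
    // (x_enu = y_ned, y_enu = x_ned, z_enu = -z_ned). A real transformer would
    // also have to handle the orientation quaternion and the covariances.
    sensor_msgs::Imu nedToEnu(const sensor_msgs::Imu& ned)
    {
      sensor_msgs::Imu enu = ned;
      enu.linear_acceleration.x = ned.linear_acceleration.y;
      enu.linear_acceleration.y = ned.linear_acceleration.x;
      enu.linear_acceleration.z = -ned.linear_acceleration.z;
      enu.angular_velocity.x = ned.angular_velocity.y;
      enu.angular_velocity.y = ned.angular_velocity.x;
      enu.angular_velocity.z = -ned.angular_velocity.z;
      return enu;
    }

With that re-mapping, an upright NED sensor reports acceleration along +ve z, which is what imu_filter_madgwick expects.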


mintar commented on August 18, 2024

Ah, I see. So the problem you want to solve is that you want the output of imu_filter_madgwick and the internal EKF to be interchangeable, correct?

I don't think I like the proposed solution, however. In my mind, there is no such thing as an "ENU sensor" or "NED sensor"; the device just provides accelerations, velocities and magnetic field readings, and it's up to the fusion stage to merge them into an orientation, so that's where the decision between ENU and NED is made, regardless of what's printed on the IMU housing.

REP 103 mandates the use of ENU, and I think it's important to stick with this convention in ROS (or rather, stick with any convention, as long as there's exactly one).

However, REP 103 offers a way out:

For outdoor systems where it is desirable to work under the north east down (NED) convention, define an appropriately transformed secondary frame with the "_ned" suffix.

This is similar to what ROS camera drivers (e.g. openni) are doing; their main tf frame (e.g., camera_rgb_frame) follows standard ROS conventions (x forward, y left, z up), and in addition, they provide an appropriately transformed secondary frame (e.g., camera_rgb_optical_frame) that follows image processing conventions.

So what I think we should do is:

  • continue publishing Imu messages as ENU
  • optionally, publish Imu messages as NED on a secondary topic that has the _ned suffix, where all messages have been transformed into a tf frame with the _ned suffix (e.g., imu_frame_ned); ros/geometry2#78 would come in really handy here. :-)
  • optionally, also publish the (static) TF between imu_frame and imu_frame_ned (like the openni driver does when setting publish_tf to true; too bad we're already using that parameter name for something else).

Your internal EKF could then simply publish its messages in the imu_frame_ned frame, and things should work, although it would probably be even better to transform its messages into imu_frame instead.
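
To illustrate the third bullet, broadcasting the static imu_frame -> imu_frame_ned transform could look roughly like this (a sketch only; the frame names are the placeholders from above, and the ENU/NED rotation, which swaps x and y and flips z, is its own inverse):

    #include <cmath>
    #include <ros/ros.h>
    #include <geometry_msgs/TransformStamped.h>
    #include <tf2/LinearMath/Quaternion.h>
    #include <tf2_ros/static_transform_broadcaster.h>

    // Sketch: publish the static ENU <-> NED frame relation once (latched).
    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "imu_ned_frame_broadcaster");
      ros::NodeHandle nh;
      tf2_ros::StaticTransformBroadcaster broadcaster;

      geometry_msgs::TransformStamped tf_msg;
      tf_msg.header.stamp = ros::Time::now();
      tf_msg.header.frame_id = "imu_frame";      // ENU-convention frame
      tf_msg.child_frame_id = "imu_frame_ned";   // NED-convention frame

      // Roll by pi, then yaw by pi/2: a vector (x, y, z) expressed in the NED
      // frame becomes (y, x, -z) in the ENU frame.
      tf2::Quaternion q;
      q.setRPY(M_PI, 0.0, M_PI / 2.0);
      tf_msg.transform.rotation.x = q.x();
      tf_msg.transform.rotation.y = q.y();
      tf_msg.transform.rotation.z = q.z();
      tf_msg.transform.rotation.w = q.w();

      broadcaster.sendTransform(tf_msg);
      ros::spin();
      return 0;
    }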

paulbovbel wrote:

This will also correct the issue of unorthodox mounting positions that are present on some of our robots, where an ENU IMU is mounted sideways.

That should be handled by a proper transform from base_link to imu_frame instead. That's how we do it for our robot Calvin. We have a "NED IMU" mounted "upright" according to the printed axes, i.e. "upside down" from imu_filter_madgwick's ENU perspective (gravity points towards -z). In Calvin's URDF, we simply provide the appropriate transformation that describes how the IMU is mounted on our robot. If you don't use URDF/robot_state_publisher on your robot, simply use tf's static transform publisher. You would need to provide such a transform anyway for your proposed imu_reference_frame.


paulbovbel commented on August 18, 2024

Okay, so let's disregard the ENU/NED discussion for now, although I do agree it's important and would like to deal with IMUs in a REP in the near future. The reality is that IMUs may come in one flavour or the other, and as long as an NED IMU publishes with a default frame_id of imu_link_ned, putting the data into ENU is just a transform away. But this is just in terms of the IMU. If the IMU is mounted sideways, imu_link is now neither ENU nor NED in the geographic scheme of things. This is okay; not all frames can be geographically ENU/NED (a silly example: wheels), as long as there's a context (e.g. base_link) up the tree that is.

I'm aiming to use the output of imu_filter_madgwick to replace the internal EKF on the UM6. If my NED IMU publishes data in frame /imu_link_ned, simply plugging it into imu_filter_madgwick will not work. I first need to transform the IMU data into an ENU-referenced frame. This may be imu_link, or if the IMU is mounted sideways, it may be base_link instead.

See the gist here for my test setups (sorry it's not highlighted, but if you click edit you can turn on XML formatting):
https://gist.github.com/paulbovbel/84dd372ee08c92a66399

At the end of the day, the driver should not be listening to TF and transforming the data into a geographically correct ENU/NED format. The driver should dump the data from the device in a consistent form (ENU or NED), which is what happens in the wild today. The robot integrator sets up the contextual information (i.e. the URDF). The receivers of the IMU data can then make informed decisions about how to process it. In the case of imu_filter_madgwick, we can already assert that incoming data has to be ENU.

To that end, my proposal is that instead of users having to add an intermediary node to transform the data to ENU, they could specify an ENU reference frame (e.g. base_link) that imu_filter_madgwick can use to pre-process the data.
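
Roughly, the pre-processing I have in mind would look something like this (just a sketch; the helper name is made up and covariance handling is omitted):

    #include <string>
    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>
    #include <geometry_msgs/TransformStamped.h>
    #include <geometry_msgs/Vector3Stamped.h>
    #include <tf2_ros/buffer.h>
    #include <tf2_geometry_msgs/tf2_geometry_msgs.h>

    // Sketch of the proposed pre-processing: rotate the raw accelerometer and
    // gyro vectors from the message's frame into an ENU-aligned reference
    // frame (e.g. base_link) before they reach the fusion filter.
    sensor_msgs::Imu toEnuReferenceFrame(const sensor_msgs::Imu& in,
                                         const std::string& reference_frame,
                                         const tf2_ros::Buffer& tf_buffer)
    {
      geometry_msgs::TransformStamped tf = tf_buffer.lookupTransform(
          reference_frame, in.header.frame_id, in.header.stamp, ros::Duration(0.1));

      sensor_msgs::Imu out = in;
      out.header.frame_id = reference_frame;

      geometry_msgs::Vector3Stamped v_in, v_out;
      v_in.header = in.header;

      v_in.vector = in.linear_acceleration;
      tf2::doTransform(v_in, v_out, tf);   // Vector3Stamped: rotation only
      out.linear_acceleration = v_out.vector;

      v_in.vector = in.angular_velocity;
      tf2::doTransform(v_in, v_out, tf);
      out.angular_velocity = v_out.vector;

      // Orientation and covariance rotation intentionally left out of the sketch.
      return out;
    }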


paulbovbel commented on August 18, 2024

Alternatively, it may be easiest just to write an imu_transformer node/nodelet combo. Would you be willing to add this to imu_tools? This would keep the imu_filter_madgwick code from getting too complicated.

The reasoning for all this is that any nodelet running in the same manager as the filter has to be GPL licensed, so this seems like the most appropriate home.


mintar commented on August 18, 2024

I believe I'm still missing your point. On our robot, we have an IMU that's mounted "upside down" (i.e., the gravity vector goes to -z as reported by the IMU driver; it's an "NED" IMU if you go by the axes printed on it). Also, its x axis points to the back of the robot. This is reflected in the URDF, as can be seen here (it's a bit hard to see, but the IMU z axis points down and the base_footprint z axis points up).

[image: imu_calvin]

The URDF is read by robot_state_publisher, which publishes the TF transform base_footprint --> imu (and others, of course). The IMU driver output is processed by imu_filter_madgwick, and we feed the output into robot_pose_ekf.

Maybe here's the part you're missing: robot_pose_ekf transforms the orientations from the imu topic into the base_footprint frame, so it doesn't matter in which frame they are given, as long as the base_footprint --> imu tf is correct. (BTW, robot_pose_ekf only transforms the orientations, not the covariance matrices, which is probably a bug).
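
For reference, transforming an orientation consistently also means rotating its 3x3 covariance block as cov' = R * cov * R^T, roughly like this (only an illustration of that point, not robot_pose_ekf's actual code):

    #include <boost/array.hpp>
    #include <tf2/LinearMath/Matrix3x3.h>
    #include <tf2/LinearMath/Quaternion.h>

    // Rotate a row-major 3x3 covariance (as stored in sensor_msgs/Imu) by the
    // rotation that was applied to the orientation: cov' = R * cov * R^T.
    boost::array<double, 9> rotateCovariance(const boost::array<double, 9>& cov,
                                             const tf2::Quaternion& rotation)
    {
      tf2::Matrix3x3 r(rotation);
      tf2::Matrix3x3 c(cov[0], cov[1], cov[2],
                       cov[3], cov[4], cov[5],
                       cov[6], cov[7], cov[8]);
      tf2::Matrix3x3 rotated = r * c * r.transpose();

      boost::array<double, 9> out;
      for (int i = 0; i < 3; ++i)
      {
        tf2::Vector3 row = rotated.getRow(i);
        out[3 * i + 0] = row.x();
        out[3 * i + 1] = row.y();
        out[3 * i + 2] = row.z();
      }
      return out;
    }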

If we were to mount the IMU in any other way, all we would have to do is make sure that the URDF still properly reflects the base_footprint --> imu tf. Theoretically, this would even work if we mounted the IMU on the wheels, although that would be a terrible idea. :-)

If you want to have a closer look, it's all in the calvin_bringup package. The top-level launch file is calvin.launch, and the only other ones of interest for you are imu.launch and ekf.launch.


mintar commented on August 18, 2024

Alternatively, it may be easiest just to write an imu_transformer node/nodelet combo. Would you be willing to add this to imu_tools? This would keep the imu_filter_madgwick code from getting too complicated.

+1 for using a separate node/nodelet. However, @chadrockey already has an imu_transformer node, and I'd rather use that (or ask him to also provide a nodelet) than duplicate functionality between imu_tools and imu_pipeline.

The reasoning for all this is that any nodelet running in the same manager as the filter has to be GPL licensed, so this seems like the most appropriate home.

Oh crap. I'll have to read up a bit more on this, and see whether we can re-license imu_filter_madgwick under LGPL if all authors agree, and whether that would actually help at all (see #28).

Alternatively, if your tf2 PR mentioned earlier goes through, then transformation would be literally one line of code, so we could skip the transformer altogether and simply expect the client to transform the data into the desired frame (like robot_pose_ekf already does, and like many point cloud processing nodes do).
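
Assuming that PR adds doTransform support for sensor_msgs/Imu, the client side would then boil down to something like this (hypothetical until the PR is actually merged; "base_link" just stands in for whatever frame the client wants):

    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>
    #include <tf2_ros/buffer.h>

    // Hypothetical client-side usage once ros/geometry2#78 is in: tf2 does the
    // whole conversion, given a tf2_ros::Buffer fed by a TransformListener.
    sensor_msgs::Imu transformToBase(const sensor_msgs::Imu& imu_in,
                                     const tf2_ros::Buffer& tf_buffer)
    {
      return tf_buffer.transform(imu_in, "base_link", ros::Duration(0.1));
    }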


chadrockey commented on August 18, 2024

TF is probably the best place for this. Given the modest total bandwidth of IMU data, I don't really believe nodelets are worth the headaches they bring in reliability and debugging of live systems, especially given the GPL issues. Typically it is sufficient to turn Nagle's algorithm off to achieve low-latency, high-rate IMU data.
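
For example, on the subscriber side that's just a transport hint (a sketch; topic and callback names are arbitrary):

    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>

    void imuCallback(const sensor_msgs::Imu::ConstPtr& msg)
    {
      // ... process the measurement ...
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "imu_consumer");
      ros::NodeHandle nh;

      // Disable Nagle's algorithm on the TCP connection for low-latency delivery.
      ros::Subscriber sub = nh.subscribe("imu/data_raw", 10, imuCallback,
                                         ros::TransportHints().tcpNoDelay());
      ros::spin();
      return 0;
    }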

I did, in fact, try to get imu_filter_madgwick relicensed but the discussion fell a little flat and I didn't have a lot of motivation other than that it was a nice filter.

Nowadays I think that porting the Apache v2 sensor fusion from Android would be a good choice:
https://android.googlesource.com/platform/frameworks/native/+/jb-mr1.1-dev/services/sensorservice

It has good tuning for cellphone-quality sensors, supports accelerometers, gyros, and magnetometers, and is pretty well tested by gamers, possibly by Project Tango, and by others all over the world. It also runs in Android in various modes, with or without certain sensors. Maybe give it a shot?


paulbovbel commented on August 18, 2024

@chadrockey, I will look into porting the Apache-licensed code in the near future; it would definitely be a big help. My immediate goal, though, is to get imu_filter_madgwick working with our Husky robot so that we can do an Indigo release.

@mintar, LGPL would definitely help; it would allow loading the Madgwick nodelet in a nodelet manager along with BSD code.

I'll be honest, I'm not really sure what the disconnect is. My adventures with IMUs over the past few weeks have turned up a lot of inconsistencies. From what I've found for the UM6, the data needs to be transformed into an ENU frame before hitting imu_filter_madgwick. Although we have an ENU mode baked into the driver, this doesn't save me because the IMU is mounted sideways.

To that end, would you be willing to help out with an imu_tools release including an imu_transformer nodelet? I'm hesitant to pull in imu_pipeline since it's currently organized as one top-level package. It looks like there's interest in changing that (ros-perception/imu_pipeline#3 (comment)), but that would be a pretty big change for Indigo at this late point. @chadrockey, would you be interested in pulling an updated transformer into imu_pipeline for Jade?


chadrockey commented on August 18, 2024

I don't have a lot of interest in imu_pipeline these days, but I'd be glad to give up/share maintainership.


paulbovbel commented on August 18, 2024

@chadrockey I'd be happy to take over or share maintainership if you're willing.


asimay commented on August 18, 2024

Hi @paulbovbel, I'd like to know: before the data hits imu_filter_madgwick, is it enough to transform from body-NED w.r.t. NED to body-ENU w.r.t. NED, or do I need to transform all the way from body-NED w.r.t. NED to body-ENU w.r.t. ENU? Thanks!


mintar commented on August 18, 2024

@asimay Please stop spamming unrelated issues with your question. You've already opened #63; I'll answer there shortly.

