
Comments (31)

Chris-Bee commented on May 18, 2024

@antithing thank you for your question and apologies for the delay. It almost seems as if the Aruco pose measurement is not processed. Can you confirm that this section of the code is executed?

  if (frame == true)
  {
      mars::BufferDataType dataP;

      ReadLock w_lock(grabberLock);
      dataP = dataPose;
      w_lock.unlock();

      frame = false;

      core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, dataP);

      dataGot = true;
  }
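(A temporary print next to the ProcessMeasurement call is the quickest check; a minimal sketch reusing your existing timestamp variable:)

std::cout << "Pose measurement processed at t=" << timestamp << std::endl;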

Should core_logic.ProcessMeasurement be executed, can you please have a look at the residual of the pose sensor (the residual_ member) and check whether it is unreasonably high?

Also, can you send us a plot that shows the Aruco pose and the MaRS pose? The position in X, Y, Z and the orientation in Euler angles should give us a good idea about what's going on.


antithing commented on May 18, 2024

Hi @Chris-Bee @mascheiber , thank you for getting back to me. I can confirm that the code is hit, and that the pose data is populated.

I am trying the following to print the residual, but it doesn't seem to work. Is there a trick here?

const Eigen::IOFormat fmt(2, Eigen::DontAlignCols, "\t", " ", "", "", "", "");
std::cout << pose_sensor_sptr->residual_.transpose().format(fmt) << std::endl;

I also see:

Created: [IMU] Sensor
Created: [Pose] Sensor
Created: Buffer (Size=300)
Created: Buffer (Size=100)
Created: CoreLogic
Warning: Core is not initialized yet. Measurement is stored but not processed
States Initialized:
p_wi:   [ 0.0000000000 0.0000000000 5.0000000000 ]
v_wi:   [ 0.0000000000 0.0000000000 0.0000000000 ]
q_wi:   [ 1.0000000000 0.0000000000 0.0000000000 0.0000000000 ]
b_w:    [ 0.0000000000 0.0000000000 0.0000000000 ]
b_a:    [ 0.0000000000 0.0000000000 0.0000000000 ]
w_m:    [  0.0005693005  0.0005465498 -0.0042936552 ]
a_m:    [ -0.1660202444 -9.8073301315 -0.0989102647 ]

Info: Filter was initialized
47.4389300024 POSE
Info: Initialized [Pose] with [Given] Calibration at t=1692869067.0220751762
39.2860332607 POSE
Warning: The error angle vector is not small
Warning: The error angle vector is not small

29.9548995812 POSE
Warning: The error angle vector is not small
Warning: The error angle vector is not small

Are there tools in MaRS to plot? Or will I need a third-party function?


mascheiber commented on May 18, 2024

Hi @antithing,

I appreciate the confirmation; however, I have a couple of questions to help you further:

  1. I just had a look at our closed issues and saw you brought this also up in #3 (comment). Can you confirm the code you posted there is the same one you are still using?
  2. In this issue, you mentioned you are trying to use MaRS with a Stereolabs Zed2 camera. Is this still the case, i.e., are you using both the IMU and images from this (the same) device?
  3. In the snippet above, I see the tracked position x changing fast; are you moving right after initializing MaRS?
47.4389300024 POSE
39.2860332607 POSE
29.9548995812 POSE

Concerning your questions:

I am trying the following to print the residual, but it doesn't seem to work. Is there a trick here?

Doing something like

std::cout << pose_sensor_sptr->residual_.transpose() << std::endl;

should be sufficient for what we need right now, no need to format.

Are there tools in MaRS to plot? Or will I need a third-party function?

Within the MaRS lib, there are no tools provided. However, we tend to store the results in csv files and then plot the data using external tools such as Matlab or Python scripts. An example is given here:

// Open files for data export
std::ofstream ofile_core;
ofile_core.open("/tmp/mars_core_state.csv", std::ios::out);
ofile_core << std::setprecision(17);
std::ofstream ofile_pose;
ofile_pose.open("/tmp/mars_pose_state.csv", std::ios::out);
ofile_pose << std::setprecision(17);

// process data
for (auto k : measurement_data)
{
  core_logic.ProcessMeasurement(k.sensor_, k.timestamp_, k.data_);

  if (!core_logic.core_is_initialized_)
  {
    // Initialize the first time at which the propagation sensor occurs
    if (k.sensor_ == core_logic.core_states_->propagation_sensor_)
    {
      Eigen::Vector3d p_wi_init(0, 0, 5);
      Eigen::Quaterniond q_wi_init = Eigen::Quaterniond::Identity();
      core_logic.Initialize(p_wi_init, q_wi_init);
    }
    else
    {
      continue;
    }
  }

  // Store results in a csv file
  if (k.sensor_ == core_logic.core_states_->propagation_sensor_)
  {
    mars::BufferEntryType latest_result;
    core_logic.buffer_.get_latest_state(&latest_result);
    mars::CoreStateType last_state = static_cast<mars::CoreType*>(latest_result.data_.core_.get())->state_;
    ofile_core << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;
  }
  if (k.sensor_ == pose_sensor_sptr)
  {
    // Repropagation after an out-of-order update can cause the latest state to be different from the current update
    // sensor. Using get_latest_sensor_handle_state is the safest option.
    mars::BufferEntryType latest_result;
    core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_result);
    mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
    ofile_pose << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;
  }
}
ofile_core.close();
ofile_pose.close();

If you adhere to the convention of the csv writing (lines 157 and 167), the csv files would be enough at the moment to help you. We might consider including scripts for plotting in a future release!


antithing commented on May 18, 2024

Thank you! I have checked through the code and stabilised the POSE output. I now see:

Created: [IMU] Sensor
Created: [Pose] Sensor
Created: Buffer (Size=300)
Created: Buffer (Size=100)
Created: CoreLogic
Camera Model: ZED 2i, Serial Number: 38306361, Camera Firmware: 1523, Sensors Firmware: 777

Connected to hardware, streaming data.
Warning: Core is not initialized yet. Measurement is stored but not processed
States Initialized:
p_wi:   [ 0.0000000000 0.0000000000 5.0000000000 ]
v_wi:   [ 0.0000000000 0.0000000000 0.0000000000 ]
q_wi:   [ 1.0000000000 0.0000000000 0.0000000000 0.0000000000 ]
b_w:    [ 0.0000000000 0.0000000000 0.0000000000 ]
b_a:    [ 0.0000000000 0.0000000000 0.0000000000 ]
w_m:    [ -0.0115738817 -0.0008016574  0.0024347773 ]
a_m:    [  0.0130601153 -9.8602676392  0.1160364896 ]

Info: Filter was initialized
-24.5619488665 POSE
Info: Initialized [Pose] with [Given] Calibration at t=1692958693.5246474743
0.0000000000 FILTER

-24.5423968424 POSE
Warning: The error angle vector is not small
Warning: The error angle vector is not small
-0.2328027102 FILTER

-24.5089796835 POSE
0.2700878108 FILTER

-24.5346974635 POSE
0.2700067273 FILTER

-24.5632441115 POSE
0.4150911955 FILTER

-24.5436141662 POSE
0.3611866790 FILTER

-24.5360139028 POSE
0.2753187662 FILTER

-24.5710572885 POSE
0.2938940999 FILTER

The blank line SHOULD be printing the residual, but it does not.

Yes, I am using the ZED camera, and yes, the IMU data and image are both from that sensor.

I am attaching the full code here again in case you are able to take a look. I feel like I am close! But definitely missing something.

Thank you so much for your time.

main.zip


mascheiber commented on May 18, 2024

Apologies for the late response, and thank you for checking the POSE output.

So I did take a look at your code and saw a couple of things:


mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
std::cout << last_state.p_ip_.x() << " FILTER " << std::endl;

With this code you are printing the calibration state of your pose sensor, i.e., the translation between the IMU and one of your cameras. That is why the value is so small, as these sensors are expected to be close together.

To print the filter's core state you should rather do something like

core_logic_.buffer_.get_latest_state(&latest_state);
mars::CoreStateType latest_core_state = static_cast<mars::CoreType*>(latest_state.data_.core_.get())->state_;
std::cout << latest_core_state.p_wi_.x() << " FILTER " << std::endl;

In your code you are initializing the calibration states of your pose sensor (camera) and IMU to identity:

pose_init_cal.state_.p_ip_ = Eigen::Vector3d::Zero();
pose_init_cal.state_.q_ip_ = Eigen::Quaterniond::Identity();

Although I have never worked with the ZED cam, I highly doubt that there is no translational and rotational offset between these two sensors. While in theory these states should become observable once you move, your estimation performance is better if they are initialized somewhat close to their real values. This, in turn, might explain some of the differences between your aruco pose estimate and MaRS' pose.


Similarly, you are initializing the filter's core states to identity in orientation and 5 m height in translation:

// Initialize the first time at which the propagation sensor occurs
if (imu_sensor_sptr == core_logic.core_states_->propagation_sensor_)
{
    Eigen::Vector3d p_wi_init(0, 0, 5);
    Eigen::Quaterniond q_wi_init = Eigen::Quaterniond::Identity();
    core_logic.Initialize(p_wi_init, q_wi_init);
}

But clearly, your first pose information is different (-24.5619488665 POSE). Thus, for better performance, I would suggest waiting for the first pose measurement and initializing the filter core states (p_wi_init and q_wi_init) using that measurement together with the sensor-to-IMU calibration (from the previous point).

We do something similar within the MaRS ROS wrappers:
https://github.com/aau-cns/mars_ros/blob/6f7d0109569bcf8f2b1e590662035959c10fed20/src/mars_wrapper_dualpose_full.cpp#L392-L395
and
https://github.com/aau-cns/mars_ros/blob/6f7d0109569bcf8f2b1e590662035959c10fed20/src/mars_wrapper_dualpose_full.cpp#L633-L635
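For illustration, a minimal sketch of such an initialization, assuming the pose sensor directly measures the camera pose in the world frame (p_wp, q_wp are placeholders for your first measurement, not MaRS API) and using the calibration stored in pose_init_cal:

// Core-state init from the first pose measurement, using
// p_wp = p_wi + R_wi * p_ip  and  R_wp = R_wi * R_ip
Eigen::Matrix3d R_wp = q_wp.toRotationMatrix();
Eigen::Matrix3d R_ip = pose_init_cal.state_.q_ip_.toRotationMatrix();
Eigen::Matrix3d R_wi = R_wp * R_ip.transpose();
Eigen::Vector3d p_wi_init = p_wp - R_wi * pose_init_cal.state_.p_ip_;
Eigen::Quaterniond q_wi_init(R_wi);
core_logic.Initialize(p_wi_init, q_wi_init);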


Concerning the residual printing, I just saw that this is a bug within the pose sensor class, where the value is only stored locally but not written to the residual_ member variable.

Thank you for pointing this out; we will provide a fix soon.

For now, you could add the printing in pose_sensor_class.h, line 204:

Eigen::MatrixXd res(res_p.rows() + res_r.rows(), 1);
res << res_p, res_r;
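and then print the stacked residual right after it is assembled, e.g. (the label is arbitrary):

std::cout << "pose residual: " << res.transpose() << std::endl;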


I do hope this helps; please let me know if you have any further questions about these points.


Chris-Bee commented on May 18, 2024

Thank you for pointing this out; we will provide a fix soon.

As discussed, fixed by @mascheiber in 365396b


antithing commented on May 18, 2024

Hi @Chris-Bee thank you! I have made those changes. Now I see a better result when initialising:

Camera Model: ZED 2i, Serial Number: 38306361, Camera Firmware: 1523, Sensors Firmware: 777
Created: [IMU] Sensor
Created: [Pose] Sensor
Created: Buffer (Size=300)
Created: Buffer (Size=100)
Created: CoreLogic

Connected to hardware, streaming data.
Warning: Core is not initialized yet. Measurement is stored but not processed
-6.3754 POSE
States Initialized:
p_wi:   [  -6.3754000681 -22.7880870687 121.1385729168 ]
v_wi:   [ 0.0000000000 0.0000000000 0.0000000000 ]
q_wi:   [ 0.1011434760 0.0003335555 0.9948696135 0.0020828182 ]
b_w:    [ 0.0000000000 0.0000000000 0.0000000000 ]
b_a:    [ 0.0000000000 0.0000000000 0.0000000000 ]
w_m:    [ -0.0002072160  0.0005531951 -0.0000994732 ]
a_m:    [ -0.0421399549 -9.8122491837 -0.0917200148 ]

Info: Filter was initialized
Info: Initialized [Pose] with [Given] Calibration at t=1695110839.9820003510
-6.3754000681 FILTER
-6.3505564740 POSE

However when I start moving the camera, the values slip apart:

1.0283536997 POSE
-2.3626173940 FILTER
2.0762937818 POSE
0.4731203817 FILTER
4.4112029505 POSE
3.4899880149 FILTER

Does this section look correct now?

// Zed
sl::SensorsConfiguration sensorConfig;
sensorConfig = zed.getCameraInformation().sensors_configuration;

// IMU to LEFT CAMERA
double imuLeftX = sensorConfig.camera_imu_transform.getTranslation().x;
double imuLeftY = sensorConfig.camera_imu_transform.getTranslation().y;
double imuLeftZ = sensorConfig.camera_imu_transform.getTranslation().z;

double imuLeftQX = sensorConfig.camera_imu_transform.getOrientation().x;
double imuLeftQY = sensorConfig.camera_imu_transform.getOrientation().y;
double imuLeftQZ = sensorConfig.camera_imu_transform.getOrientation().z;
double imuLeftQW = sensorConfig.camera_imu_transform.getOrientation().w;

// Display accelerometer sensor configuration
sl::SensorParameters& sensor_parameters = info.sensors_configuration.accelerometer_parameters;

sl::SensorsData sensors_data;
SensorsData::IMUData imu_data;
TimestampHandler ts;

// mars lib stuff:
std::vector<double> imu_n_w{ 0.013, 0.013, 0.013 };
std::vector<double> imu_n_bw{ 0.0013, 0.0013, 0.0013 };
std::vector<double> imu_n_a{ 0.083, 0.083, 0.083 };
std::vector<double> imu_n_ba{ 0.0083, 0.0083, 0.0083 };

// setup propagation sensor
std::shared_ptr<mars::ImuSensorClass> imu_sensor_sptr = std::make_shared<mars::ImuSensorClass>("IMU");

// setup the core definition
std::shared_ptr<mars::CoreState> core_states_sptr = std::make_shared<mars::CoreState>();
core_states_sptr.get()->set_propagation_sensor(imu_sensor_sptr);
core_states_sptr.get()->set_noise_std(Eigen::Vector3d(imu_n_w.data()), Eigen::Vector3d(imu_n_bw.data()),
                                      Eigen::Vector3d(imu_n_a.data()), Eigen::Vector3d(imu_n_ba.data()));

// setup additional sensors
// Pose sensor
std::shared_ptr<mars::PoseSensorClass> pose_sensor_sptr = std::make_shared<mars::PoseSensorClass>("Pose", core_states_sptr);
pose_sensor_sptr->const_ref_to_nav_ = true;  // TODO is set here for now but will be managed by core logic in later versions

// Define measurement noise
Eigen::Matrix<double, 6, 1> pose_meas_std;
pose_meas_std << 0.02, 0.02, 0.02, 2 * (M_PI / 180), 2 * (M_PI / 180), 2 * (M_PI / 180);
pose_sensor_sptr->R_ = pose_meas_std.cwiseProduct(pose_meas_std);

// Define initial calibration and covariance
mars::PoseSensorData pose_init_cal;
pose_init_cal.state_.p_ip_ = Eigen::Vector3d(imuLeftX, imuLeftY, imuLeftZ);
pose_init_cal.state_.q_ip_ = Eigen::Quaterniond(imuLeftQW, imuLeftQX, imuLeftQY, imuLeftQZ);

// The covariance should enclose the initialization with a 3 Sigma bound
Eigen::Matrix<double, 6, 1> std;
std << 0.1, 0.1, 0.1, (10 * M_PI / 180), (10 * M_PI / 180), (10 * M_PI / 180);
pose_init_cal.sensor_cov_ = std.cwiseProduct(std).asDiagonal();

pose_sensor_sptr->set_initial_calib(std::make_shared<mars::PoseSensorData>(pose_init_cal));

The imu-camera transform is from the sensor SDK, and is listed as:

camera_imu_transform | Transform between IMU and Left Camera frames.

Do I need this line:

pose_init_cal.sensor_cov_ = std.cwiseProduct(std).asDiagonal();
(it is still from the example code)

Additionally, if the markers are lost and re-found, I see this error:

Warning: Could not perform Sensor update. No corresponding prior sensor state in buffer
What causes this?

Thank you again!


mascheiber commented on May 18, 2024

Hi @antithing,

It's great to hear that the changes improved the initialization result!

Before I go into detail on the values when moving the camera, two things:

  1. Can you ensure you are using the latest main or development branch? If you are, the output when creating the core instance should be
  Created: [IMU] Sensor
  Created: [Pose] Sensor
  Created: Buffer (Size=300)
  Created: Buffer (Size=100)
- Created: CoreLogic
+ Created: Core Logic - Using MaRS Version: <BRANCH_HASH>

See also here:

std::cout << "Created: CoreLogic - Using MaRS Version: " << mars::Utils::get_mars_version_string() << std::endl;

2. Once verified, can you redo this test? This error should not occur in the latest version anymore:

Additionally, if the markers are lost and re-found, I see this error:

Warning: Could not perform Sensor update. No corresponding prior sensor state in buffer


Further, the initial pose you set for MaRS seemed a bit far away (121 meters in z?):

p_wi:   [  -6.3754000681 -22.7880870687 121.1385729168 ]

So, I double-checked your code and saw that you do the following:

std::vector<cv::Vec3d> rvecs, tvecs;
cv::aruco::estimatePoseSingleMarkers(markerCorners, 17.2, K, D, rvecs, tvecs);

However, the output for the pose here is in the same unit as the markerLength input value of cv::aruco::estimatePoseSingleMarkers, as written in the docs:

markerLength the length of the markers' side. The returning translation vectors will be in the same unit. Normally, unit is meters.

Assuming the markerLength you currently use is in cm, the estimated pose you receive will also be in cm. However, MaRS expects any input to be metric, i.e., in meters -- m.

Thus, can you try to set the markerLength to 0.172 and see if that improves the performance?
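That is, keeping the call from your code but with the marker length given in meters:

std::vector<cv::Vec3d> rvecs, tvecs;
cv::aruco::estimatePoseSingleMarkers(markerCorners, 0.172, K, D, rvecs, tvecs);  // 17.2 cm side length, now in meters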


antithing commented on May 18, 2024

Hey @Chris-Bee, okay! Changes made, new pull from main branch. The lost-marker error is indeed resolved, thanks!

New output still seems problematic though:

Camera Model: ZED 2i, Serial Number: 38306361, Camera Firmware: 1523, Sensors Firmware: 777
Created: [IMU] Sensor
Created: [Pose] Sensor
Created: Buffer (Size=300)
Created: Buffer (Size=100)
Created: CoreLogic - Using MaRS Version: 0.1.0-8161e7aa6d0d

Connected to hardware, streaming data.
Warning: Core is not initialized yet. Measurement is stored but not processed
-0.0929553 POSE
States Initialized:
p_wi:   [ -0.0929553377 -0.2323435677  1.2029085753 ]
v_wi:   [ 0.0000000000 0.0000000000 0.0000000000 ]
q_wi:   [ 0.0700061419 0.0012553460 0.9975157926 0.0077335481 ]
b_w:    [ 0.0000000000 0.0000000000 0.0000000000 ]
b_a:    [ 0.0000000000 0.0000000000 0.0000000000 ]
w_m:    [  0.0045912107 -0.0013058792  0.0020951742 ]
a_m:    [ -0.1911114603 -9.8180751801 -0.1180527732 ]

Info: Filter was initialized
Info: Initialized [Pose] with [Given] Calibration at t=1695129008.5409367085
-0.0929553377 FILTER
-0.0929876453 POSE
-0.0701389220 FILTER
-0.0930495804 POSE
-0.0688917059 FILTER
-0.0929834637 POSE
-0.0697809185 FILTER
-0.0929194142 POSE
-0.0747732973 FILTER
-0.0930991240 POSE
-0.0705965920 FILTER
-0.0930236862 POSE
-0.0821505898 FILTER
-0.0930827459 POSE
-0.0896318680 FILTER
-0.0929486741 POSE
-0.0636392757 FILTER
-0.0928311993 POSE
-0.0436697568 FILTER
-0.0928837395 POSE
-0.0002536610 FILTER
-0.0929187783 POSE
0.0297956080 FILTER
-0.0930283658 POSE
0.0463810265 FILTER
-0.0928833491 POSE
0.0499581176 FILTER
-0.0929205550 POSE
0.0515787452 FILTER
-0.0929882825 POSE
0.0561162599 FILTER
-0.0929620811 POSE
0.0632703401 FILTER
-0.0929829785 POSE
0.0691110482 FILTER
-0.0929387500 POSE
0.0753617116 FILTER
-0.0929224833 POSE
0.0813199839 FILTER
-0.0931324002 POSE
0.0867027593 FILTER
-0.0929873475 POSE
0.0909435659 FILTER
-0.0928573511 POSE

It is better! But still some 'drift' in the filtered values. Is this normal?

Also, is it possible I have the imu-camera transform inverted? Should it be IMU-to-camera or camera-to-IMU?
What I have from the Zed SDK is: 'IMU to Left camera transform matrix, that contains rotation and translation between IMU frame and camera frame.', and I am assigning it as such:


double imuLeftX = sensorConfig.camera_imu_transform.getTranslation().x;
double imuLeftY = sensorConfig.camera_imu_transform.getTranslation().y;
double imuLeftZ = sensorConfig.camera_imu_transform.getTranslation().z;

double imuLeftQX = sensorConfig.camera_imu_transform.getOrientation().x;
double imuLeftQY = sensorConfig.camera_imu_transform.getOrientation().y;
double imuLeftQZ = sensorConfig.camera_imu_transform.getOrientation().z;
double imuLeftQW = sensorConfig.camera_imu_transform.getOrientation().w;

mars::PoseSensorData pose_init_cal;
pose_init_cal.state_.p_ip_ = Eigen::Vector3d(imuLeftX, imuLeftY, imuLeftZ);
pose_init_cal.state_.q_ip_ = Eigen::Quaterniond(imuLeftQW, imuLeftQX, imuLeftQY, imuLeftQZ);
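(For reference, should the convention turn out to be the inverse one, flipping a (p, q) pair is straightforward in Eigen; this is a generic sketch, not a statement about the ZED convention:)

// Invert T = (p, q): T^-1 = (-(q^-1 * p), q^-1)
Eigen::Quaterniond q_inv = pose_init_cal.state_.q_ip_.conjugate();
Eigen::Vector3d p_inv = -(q_inv * pose_init_cal.state_.p_ip_);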

Thanks again!


antithing commented on May 18, 2024

...more debugging notes: when I reproject the marker corners using the filter pose, it looks rotated, making me think I have an error somewhere in the way I am passing data.

Using:

Eigen::Vector3d position(latest_core_state.p_wi_);
Eigen::Quaterniond quaternion(latest_core_state.q_wi_);

cv::Vec3d tvec(position.x(), position.y(), position.z());
Eigen::Matrix3d mat = quaternion.toRotationMatrix();
cv::Mat R;
eigen2cv(mat, R);
cv::Vec3d rvec;
cv::Rodrigues(R, rvec);

to pass into the cv::projectPoints function:

void repro(cv::Mat frame, cv::Vec3d tvec, cv::Vec3d rvec, std::string name)
{
    double markerLength = 0.172;
    cv::Mat objPoints(4, 1, CV_32FC3);
    objPoints.ptr<cv::Vec3f>(0)[0] = cv::Vec3f(-markerLength / 2.f, markerLength / 2.f, 0);
    objPoints.ptr<cv::Vec3f>(0)[1] = cv::Vec3f(markerLength / 2.f, markerLength / 2.f, 0);
    objPoints.ptr<cv::Vec3f>(0)[2] = cv::Vec3f(markerLength / 2.f, -markerLength / 2.f, 0);
    objPoints.ptr<cv::Vec3f>(0)[3] = cv::Vec3f(-markerLength / 2.f, -markerLength / 2.f, 0);

    std::vector<cv::Point2f> pnts;
    cv::projectPoints(objPoints, rvec, tvec, K, D, pnts);

    for (size_t i = 0; i < pnts.size(); i++)
    {
        cv::drawMarker(frame, pnts[i], cv::Scalar(255, 0, 0), cv::MARKER_CROSS);
    }

    cv::imshow(name, frame);
    cv::waitKey(1);
}

This gives the following:

The pose from the marker thread:

[image: marker corners reprojected using the marker-thread (aruco) pose]

The pose from the IMU / filter thread (when the camera is moved, they 'float' around the image, not sticking to the marker at all):

[image: marker corners reprojected using the IMU/filter pose]

The corners look like they are rotated 90 degrees. What might I be doing wrong to cause this? Could it be a coordinate space mismatch somewhere?

Thank you yet again!


Chris-Bee commented on May 18, 2024

@antithing I'm happy the solution to this issue is well on its way. The credit for this support goes to my colleague @mascheiber.


antithing commented on May 18, 2024

Ah apologies for the incorrect tag! @mascheiber, thank you very much! Getting closer!


mascheiber commented on May 18, 2024

Hi @antithing,

no worries, and thanks @Chris-Bee, for pointing this out.

@antithing, thanks so much for the detailed debug information, especially the pictures. This really helps to figure out the problem you are having.

The corners look like they are rotated 90 degrees. What might I be doing wrong to cause this? Could it be a coordinate space mismatch somewhere?

This might be the remaining issue you are having. Let me explain this with the following figure:

[figure: frame conventions; world $W$, IMU $I$, camera $C$, and vision/aruco frame $V$]
(please bear with me on the frame naming convention; it will all make sense in a second)

MaRS generally estimates the IMU pose, denoted with frame $I$, within a gravity-aligned (ENU) world frame, denoted with $W$.
Rigidly attached to this IMU, you then have a pose sensor (i.e., your camera), denoted with $C$. Finally, your pose estimator measures some pose with respect to an arbitrary "pose (world) frame", denoted with $V$; in your example, the aruco tag.

Now, the aruco tag has a defined frame convention, which, given your pictures, is $x$ to the right, $y$ upwards, and $z$ pointing out of the picture.
Had you placed the aruco tag on the floor, the $z$ axes of the $V$ and $W$ frames (i.e., gravity) would have aligned, and everything should have worked as expected.

However, since you placed it on the wall, there is a rotational offset between these two frames (in the figure $\mathbf{R}_{WV}$). This has to be taken into account for each measurement you receive.
Currently, the pose sensor expects the pose measurement to be the position and orientation from $W$ to $C$, i.e., $\mathbf{p}_{WC}$ and $\mathbf{R}_{WC}$ respectively. In your setup, you are, however, measuring the pose from the aruco frame $V$ to the camera frame $C$, i.e., $\mathbf{p}_{VC}$ and $\mathbf{R}_{VC}$. Thus, a transform has to be applied somewhere in your whole setup that does the following:

$$\begin{align*} y_R &= \mathbf{R}_{WV} \cdot \mathbf{R}_{VC} \\ y_p &= \mathbf{R}_{WV} \mathbf{p}_{VC} + \mathbf{p}_{VW} \end{align*}$$

where $\mathbf{p}_{VW}$ could be set to zero, and $\mathbf{R}_{VC}$ and $\mathbf{p}_{VC}$ are the current rvecs and tvecs from cv::aruco::estimatePoseSingleMarkers.

Further given your setup, $\mathbf{R}_{WV}$ could be fixed to

$$\begin{align*} \mathbf{R}_{WV} &= \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{bmatrix} \end{align*}$$

which corresponds to a 90-degree rotation around the $x$-axis.

At this stage, you have two options:

  1. (not recommended) Apply the above calculations to any measurement you get before feeding it to the pose sensor (a sketch of this is given after the code block below).

  2. (recommended) Switch to the VisionSensorClass, which already incorporates this MaRS world $W$ to aruco frame (vision frame) $V$ state.
    In this scenario, you would need to set this calibration and give it a very low state covariance, such that it is not estimated and updated while running MaRS.

// setup additional sensors
// Pose sensor
std::shared_ptr<mars::VisionSensorClass> vision_sensor_sptr = std::make_shared<mars::VisionSensorClass>("Vision", core_states_sptr);
vision_sensor_sptr->const_ref_to_nav_ = true;  // TODO is set here for now but will be managed by core logic in later versions

// Define measurement noise
Eigen::Matrix<double, 6, 1> pose_meas_std;
pose_meas_std << 0.02, 0.02, 0.02, 2 * (M_PI / 180), 2 * (M_PI / 180), 2 * (M_PI / 180);
vision_sensor_sptr->R_ = pose_meas_std.cwiseProduct(pose_meas_std);

// Define initial calibration and covariance
mars::VisionSensorData vision_init_cal;
vision_init_cal.state_.p_ic_ = Eigen::Vector3d(imuLeftX, imuLeftY, imuLeftZ);
vision_init_cal.state_.q_ic_ = Eigen::Quaterniond(imuLeftQW, imuLeftQX, imuLeftQY, imuLeftQZ);

// Define initial offset between vision world and MaRS world
Eigen::Matrix3d R_wv;
R_wv << 1, 0, 0,
        0, 0, -1,
        0, 1, 0;
vision_init_cal.state_.p_vw_ = Eigen::Vector3d::Zero();
vision_init_cal.state_.q_vw_ = Eigen::Quaterniond(R_wv.transpose()); // invert R_wv to get R_vw

// set lamda (scale) to 1 - as output is metric
vision_init_cal.state_.lambda_ = 1;

// The covariance should enclose the initialization with a 3 Sigma bound
Eigen::Matrix<double, 13, 1> std;
std << 1e-3, 1e-3, 1e-3, 1e-3, 1e-3, 1e-3, 0.1, 0.1, 0.1, (10 * M_PI / 180), (10 * M_PI / 180), (10 * M_PI / 180), 1e-3;
vision_init_cal.sensor_cov_ = std.cwiseProduct(std).asDiagonal();

vision_sensor_sptr->set_initial_calib(std::make_shared<mars::VisionSensorData>(vision_init_cal));
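For completeness, option 1 would amount to roughly the following per measurement (a sketch assuming rvecs/tvecs from cv::aruco::estimatePoseSingleMarkers and cv::cv2eigen from <opencv2/core/eigen.hpp>):

// Convert the aruco measurement (V -> C) to Eigen
cv::Mat R_vc_cv;
cv::Rodrigues(rvecs[0], R_vc_cv);
Eigen::Matrix3d R_vc;
cv::cv2eigen(R_vc_cv, R_vc);
Eigen::Vector3d p_vc(tvecs[0][0], tvecs[0][1], tvecs[0][2]);

// Re-express the measurement in the gravity-aligned world frame W (with p_vw = 0)
Eigen::Matrix3d R_wv;
R_wv << 1, 0, 0,
        0, 0, -1,
        0, 1, 0;
Eigen::Quaterniond y_R(R_wv * R_vc);  // orientation measurement
Eigen::Vector3d y_p = R_wv * p_vc;    // position measurement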

Note that for the vision sensor, the frame naming convention is exactly as in the figure described above, though the state is inverted, i.e., aruco $V$ to MaRS world frame $W$.

These are some examples of which changes you need to apply, though I might have forgotten something -- please refer to an example for the VisionSensorClass here:
https://github.com/aau-cns/mars_ros/blob/6f7d0109569bcf8f2b1e590662035959c10fed20/src/mars_wrapper_gps_vision.cpp#L80-L90
and
https://github.com/aau-cns/mars_ros/blob/6f7d0109569bcf8f2b1e590662035959c10fed20/src/mars_wrapper_gps_vision.cpp#L643-L673

Please let me know if this information helps you, and please do not hesitate to ask if you have any further questions regarding this!


mascheiber commented on May 18, 2024

What I forgot to mention above:

Regardless of your choice, you also need to initialize p_wi_init and q_wi_init in the same manner, using the $\mathbf{R}_{WV}$ and $\mathbf{p}_{WV}$ calibration. I.e., in pseudo-code

// when you first get a pose with tvecs and rvecs
Eigen::Matrix3d R_vc = rvecs; // or however you convert this
Eigen::Vector3d p_vc = tvecs; // or however you convert this

// try to retrieve the information stored in the vision_init_cal above
Eigen::Matrix3d R_ic = vision_init_cal.state_.q_ic_.toRotationMatrix();
Eigen::Vector3d p_ic = vision_init_cal.state_.p_ic_;
Eigen::Matrix3d R_vw = vision_init_cal.state_.q_vw_.toRotationMatrix();
Eigen::Vector3d p_vw = vision_init_cal.state_.p_vw_;

q_wi_init_ = Eigen::Quaterniond(R_vw.transpose() * R_vc * R_ic.transpose());
p_wi_init_ = R_vw.transpose()* (p_vc + R_vc*(-R_ic.transpose()*p_ic)) + p_vw;


antithing commented on May 18, 2024

@mascheiber Thank you for the excellent explanation. This is amazing support, and it is much appreciated!

Before I test the vision sensor changes above, and in the interest of confirming there is nothing else strange happening, I have also tested with the pose sensor and the marker placed on the floor.

Although I see a better alignment initially, as soon as the camera is moved, the points drift off in the wrong direction.

Please see the video below:

Mars_poseSensor.mov

Is this likely a secondary issue to the previous one? I am adding the latest full code here again in case that helps.

Thank you again!

main.zip


mascheiber commented on May 18, 2024

Testing this on the floor first is a good idea!

So given your code and what I see, can you try two things:

  1. Can you set the covariance of the calibration states lower, such that these are barely updated?
// Line 351-354 in your main.cpp

// The covariance should enclose the initialization with a 3 Sigma bound
Eigen::Matrix<double, 6, 6> pose_cov;
pose_cov.setZero();
pose_cov.diagonal() << 1e-9, 1e-9, 1e-9, 1e-9, 1e-9, 1e-9;
pose_init_cal.sensor_cov_ = pose_cov;
  2. When you reproject the points with the MaRS pose (i.e. white marks), you are not using the same states as when you reproject them using the measurements. In more detail, in lines 459-460 you get the latest_core_state.p_wi_ and .q_wi_. These correspond to the red line of the figure above.
    However, what you measure (and need to use for reprojection) is the green line. So also in this case you need to apply the calibration transform
$$\begin{align*} y_R &= \mathbf{R}_{WI} \cdot \mathbf{R}_{IC} \\ y_p &= \mathbf{R}_{WI} \mathbf{p}_{IC} + \mathbf{p}_{WI} \end{align*}$$
// Line 453-460 in your main.cpp

// get latest core state
mars::BufferEntryType latest_result;
core_logic.buffer_.get_latest_state(&latest_result);
mars::CoreStateType latest_core_state = static_cast<mars::CoreType*>(latest_result.data_.core_.get())->state_;

// get latest sensor state
mars::BufferEntryType latest_sensor_result;
core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_sensor_result);
mars::PoseSensorStateType pose_sensor_state = pose_sensor_sptr.get()->get_state(latest_sensor_result.data_.sensor_);

Eigen::Vector3d p_wi(latest_core_state.p_wi_);
Eigen::Matrix3d R_wi(latest_core_state.q_wi_.toRotationMatrix());
Eigen::Vector3d p_ic(pose_sensor_state.p_ic_);
Eigen::Matrix3d R_ic(pose_sensor_state.q_ic_.toRotationMatrix());

Eigen::Vector3d position(R_wi*p_ic + p_wi);
Eigen::Quaterniond quaternion(R_wi*R_ic);

// continue as you did

If these two changes do not produce the desired result, could you please output one sample run to CSV files, so I can get a better understanding of what is going on using all the state information?
You can do this (anyway) by adding something similar to this:

// Store results in a csv file
if (k.sensor_ == core_logic.core_states_->propagation_sensor_)
{
  mars::BufferEntryType latest_result;
  core_logic.buffer_.get_latest_state(&latest_result);
  mars::CoreStateType last_state = static_cast<mars::CoreType*>(latest_result.data_.core_.get())->state_;
  ofile_core << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;
}
if (k.sensor_ == pose_sensor_sptr)
{
  // Repropagation after an out-of-order update can cause the latest state to be different from the current update
  // sensor. Using get_latest_sensor_handle_state is the safest option.
  mars::BufferEntryType latest_result;
  core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_result);
  mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
  ofile_pose << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;
}


Remark on using vision sensor

The 2nd change above would also be needed for visualization when using the vision sensor, but the equations do change: due to the additional aruco $V$ frame, this calibration also has to be taken into account:

$$\begin{align*} y_R &= \mathbf{R}_{VW} \cdot \mathbf{R}_{WI} \cdot \mathbf{R}_{IC} \\ y_p &= \mathbf{R}_{VW} \cdot (\mathbf{R}_{WI} \mathbf{p}_{IC} + \mathbf{p}_{WI}) + \mathbf{p}_{VW} \end{align*}$$
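In code, this would roughly extend the reprojection snippet from point 2 above (a sketch; vision_sensor_state would be obtained analogously via get_latest_sensor_handle_state on the vision sensor):

// Additional vision-world alignment on top of the pose reprojection
Eigen::Matrix3d R_vw(vision_sensor_state.q_vw_.toRotationMatrix());
Eigen::Vector3d p_vw(vision_sensor_state.p_vw_);

Eigen::Vector3d position(R_vw * (R_wi * p_ic + p_wi) + p_vw);
Eigen::Quaterniond quaternion(R_vw * R_wi * R_ic);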


antithing commented on May 18, 2024

Copying that code gives me an error; should these lines be the following?

Eigen::Vector3d p_ic(pose_sensor_state.p_ip_);
Eigen::Matrix3d R_ic(pose_sensor_state.q_ip_.toRotationMatrix());

If so, please see csv attached, and corresponding video below.

Mars_poseSensor_csv.mov

mars_core_state.csv

Thanks!


mascheiber commented on May 18, 2024

Copying that code gives me an error, should these lines be?

Ah yes, apologies for the typo there! I mixed up the variable naming between the VisionSensorClass and PoseSensorClass.

If so, please see csv attached, and corresponding video below.

Alright, so they are still moving (more than expected). Thank you for the csv.

Would it be possible to do another recording, and also include a csv for the pose sensor states, as in mars_thl_example?

mars::BufferEntryType latest_result; 
core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_result); 
mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_); 
ofile_pose << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl; 


antithing commented on May 18, 2024

mars_core_state.csv
mars_pose_state.csv

Absolutely. Both csvs attached! Let me know if you want a video as well and I will get another dataset over to you.

Thanks!


mascheiber commented on May 18, 2024

Thank you @antithing,

this data really helps to see in a bit more detail what is going on.
In general, it looks like MaRS is following the motion you are doing, but accumulating a drift along the way, i.e., the marker reprojection difference. We are currently preparing a new commit to help you debug this further.

In the meantime, I have noticed that the biases grow exceptionally large when you move:
[plot: estimated IMU bias states, growing large during motion]

You do initialize the IMU measurement noise and random-walk behavior with the standard values, as far as I saw:

//mars lib stuff:
std::vector<double> imu_n_w{ 0.013, 0.013, 0.013 };
std::vector<double> imu_n_bw{ 0.0013, 0.0013, 0.0013 };
std::vector<double> imu_n_a{ 0.083, 0.083, 0.083 };
std::vector<double> imu_n_ba{ 0.0083, 0.0083, 0.0083 };

However, since it seems like the ZED cam also provides these values through its API (randomWalk - imu_n_b<x> and noiseDensity - imu_n_<x>), could you maybe try to initialize MaRS with these?

What is more, you are currently using the modified IMU data from the ZED cam, which according to its docs should already compensate for the bias. Could you maybe also give the uncalibrated data a try, to see if that changes the behavior? MaRS is most likely trying to perform the same bias estimation as the camera's API!
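(For instance, newer ZED SDK versions expose raw readings alongside the compensated ones; the field names below are an assumption on my side, so please check your SDK documentation:)

// Hedged sketch: raw (not bias-compensated) IMU readings, if your SDK version exposes them
zed.getSensorsData(sensors_data, sl::TIME_REFERENCE::CURRENT);
sl::float3 w_raw = sensors_data.imu.angular_velocity_uncalibrated;    // deg/s
sl::float3 a_raw = sensors_data.imu.linear_acceleration_uncalibrated; // m/s^2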


antithing commented on May 18, 2024

Hi @mascheiber ! Copy that. Just so I am clear, I have this:

sl::SensorParameters& sensor_parametersAccel = info.sensors_configuration.accelerometer_parameters;
sl::SensorParameters& sensor_parametersGyro = info.sensors_configuration.gyroscope_parameters;

float gyroNoise = sensor_parametersGyro.noise_density;
float accelNoise = sensor_parametersAccel.noise_density;
float gyroWalk = sensor_parametersGyro.random_walk;
float accelWalk = sensor_parametersAccel.random_walk;

Can you please confirm how these values map to the vectors below?

    std::vector<double> imu_n_w{ 0.013, 0.013, 0.013 };
    std::vector<double> imu_n_bw{ 0.0013, 0.0013, 0.0013 };
    std::vector<double> imu_n_a{ 0.083, 0.083, 0.083 };
    std::vector<double> imu_n_ba{ 0.0083, 0.0083, 0.0083 };

Thank you!


mascheiber commented on May 18, 2024

Yes, these values map as you have written; use them for all three axes, respectively!

For the angular velocity terms (imu_n_w and imu_n_bw), a conversion from degrees to radians might be required, though, if I understood the ZED cam docs correctly.


antithing commented on May 18, 2024

Excellent. Like this?

std::vector<double> imu_n_w{ gyroNoise * (M_PI / 180), gyroNoise * (M_PI / 180), gyroNoise * (M_PI / 180) };
std::vector<double> imu_n_bw{ gyroWalk, gyroWalk, gyroWalk };
std::vector<double> imu_n_a{ accelNoise * (M_PI / 180), accelNoise * (M_PI / 180), accelNoise * (M_PI / 180) };
std::vector<double> imu_n_ba{ accelWalk, accelWalk, accelWalk };


antithing commented on May 18, 2024

...printing:

std::cout << sensor_parametersAccel.sensor_unit << std::endl;
std::cout << sensor_parametersGyro.sensor_unit << std::endl;

gives me:

m/s_2
deg/s

I assume this means I can remove the radians conversion, right?

    std::vector<double> imu_n_w{ gyroNoise, gyroNoise, gyroNoise };
    std::vector<double> imu_n_bw{ gyroWalk, gyroWalk, gyroWalk };
    std::vector<double> imu_n_a{ accelNoise, accelNoise, accelNoise };
    std::vector<double> imu_n_ba{ accelWalk, accelWalk, accelWalk };


antithing commented on May 18, 2024

These changes give the attached result; I am still seeing a lot of drift.

Could it be the imu-camera offset values as well? Would that cause this kind of drift?

Thank you yet again!

mars_core_state.csv
mars_pose_state.csv

2023-09-22_16-08-49.mov


mascheiber commented on May 18, 2024

Hi @antithing,

Apologies for the delayed response.

Actually the conversion should be like this:

std::vector<double> imu_n_w{ gyroNoise * (M_PI / 180), gyroNoise * (M_PI / 180), gyroNoise * (M_PI / 180) };
std::vector<double> imu_n_bw{ gyroWalk * (M_PI / 180), gyroWalk * (M_PI / 180), gyroWalk * (M_PI / 180) };
std::vector<double> imu_n_a{ accelNoise, accelNoise, accelNoise };
std::vector<double> imu_n_ba{ accelWalk, accelWalk, accelWalk };

Which of these versions did you end up using in the latest video and csvs above, the one with the degree-to-radian conversion or the one without? And would it be possible to print the noise/walk values once, just to know what the sensor's calibration says?

Could it be the imu-camera offset values as well? Would that cause this kind of drift?

In general, yes, it could. However, the imu-camera offset is not changing at all (see mars_pose_state.csv), and its value makes sense given the documentation from the ZED cam (identity orientation and a minor x offset). Therefore, I do not think this is the cause, especially since the difference between the tag and the projected corners keeps growing over time while the calibration stays static at the same value (as it is supposed to).


We just updated the measurement storing in one of our latest commits. Thus, could you check out the latest development version with commit 64ace9a?

With this version, you now can also store the measurements within a CSV.

// setup filestream
std::ofstream ofile_meas_pose, ofile_meas_imu;
ofile_meas_pose.open("/tmp/mars_pose_meas.csv", std::ios::out);
ofile_meas_imu.open("/tmp/mars_imu_meas.csv", std::ios::out);
ofile_meas_pose << std::setprecision(17);
ofile_meas_imu << std::setprecision(17);

// thread for IMU callbacks and mars lib updates

// whenever a imu measurement is processed
core_logic.ProcessMeasurement(imu_sensor_sptr, timestamp, data);

// Retrieve the last measurement and write it to the measurement file
mars::BufferEntryType latest_meas;
core_logic.buffer_.get_latest_sensor_handle_measurement(imu_sensor_sptr, &latest_meas);
mars::IMUMeasurementType* last_meas = static_cast<mars::IMUMeasurementType*>(latest_meas.data_.sensor_.get());
ofile_meas_imu << last_meas->to_csv_string(k.timestamp_.get_seconds()) << std::endl;

// whenever a pose measurement is processed
// Retrieve the last measurement and write it to the measurement file
if (frame == true)
{
  // other stuff
  // ...
  core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, dataP);

  mars::BufferEntryType latest_meas;
  core_logic.buffer_.get_latest_sensor_handle_measurement(pose_sensor_sptr, &latest_meas);
  mars::PoseMeasurementType* last_meas = static_cast<mars::PoseMeasurementType*>(latest_meas.data_.sensor_.get());
  ofile_meas_pose << last_meas->to_csv_string(k.timestamp_.get_seconds()) << std::endl;
}

Could you possibly do another recording and also generate these csv files? Then I would have everything to reproduce your setup and scenario on my end.

Further, would you mind sharing your latest main.cpp code, or even setting up a repository? I just want to be able to double-check and reproduce your setup on my side.


antithing commented on May 18, 2024

Hi @mascheiber all attached here! Thank you yet again for your incredible support on this.

printing:

float gyroNoise = sensor_parametersGyro.noise_density;
float accelNoise = sensor_parametersAccel.noise_density;

float gyroWalk = sensor_parametersGyro.random_walk;
float accelWalk = sensor_parametersAccel.random_walk;

std::cout << sensor_parametersAccel.sensor_unit << std::endl;
std::cout << sensor_parametersGyro.sensor_unit << std::endl;

std::cout << " gyroNoise: " << gyroNoise << " gyroWalk: " << gyroWalk << " accelNoise: " << accelNoise << " accelWalk: " << accelWalk << std::endl;

gives me:

m/s_2
deg/s
gyroNoise: 0.007 gyroWalk: 0.0019474 accelNoise: 0.0016 accelWalk: 0.0002509

mars_core_state.csv
mars_imu_meas.csv
mars_pose_meas.csv
mars_pose_state.csv
main.zip

Mars_test_25092023.mov


mascheiber commented on May 18, 2024

Thank you for these files, I think (and hope) I have now figured out what is going on!

Before I go into detail though, two important steps are still missing in your latest version for this to work:

#9 (comment)

2. When you reproject the points with the MaRS pose (i.e. white marks), you are not using the same states as when you reproject them using the measurements. In more detail, in lines 459-460 you get the `latest_core_state.p_wi_` and `.q_wi_`. These correspond to the red line of the figure above.
   However, what you measure (and need to use for reprojection) is the green line. So also in this case you need to apply the calibration transform
// Line 453-460 in your main.cpp

// get latest core state
mars::BufferEntryType latest_result;
core_logic.buffer_.get_latest_state(&latest_result);
mars::CoreStateType latest_core_state = static_cast<mars::CoreType*>(latest_result.data_.core_.get())->state_;

// get latest sensor state
mars::BufferEntryType latest_sensor_result;
core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_sensor_result);
mars::PoseSensorStateType pose_sensor_state = pose_sensor_sptr.get()->get_state(latest_sensor_result.data_.sensor_);

Eigen::Vector3d p_wi(latest_core_state.p_wi_);
Eigen::Matrix3d R_wi(latest_core_state.q_wi_.toRotationMatrix());
Eigen::Vector3d p_ic(pose_sensor_state.p_ic_);
Eigen::Matrix3d R_ic(pose_sensor_state.q_ic_.toRotationMatrix());

Eigen::Vector3d position(R_wi*p_ic + p_wi);
Eigen::Quaterniond quaternion(R_wi*R_ic);

// continue as you did

This is still missing in your latest version. Place it before your current definitions of position and quaternion in lines 564-565.

Further, the initialization steps mentioned in #9 (comment) also still need to be applied:

// when you first get a pose with tvecs and rvecs
Eigen::Matrix3d R_vc;
Eigen::Vector3d p_vc;
ReadLock w_lock(grabberLock);
dataP = dataPose;
p_vc = pos;
R_vc = quat.toRotationMatrix();
frame2 = image;
w_lock.unlock();

// try to retrieve the information stored in the vision_init_cal above
Eigen::Matrix3d R_ic = pose_init_cal.state_.q_ip_.toRotationMatrix();
Eigen::Vector3d p_ic = pose_init_cal.state_.p_ip_;
Eigen::Matrix3d R_vw = Eigen::Matrix3d::Identity(); // vision: vision_init_cal.state_.q_vw_.toRotationMatrix();
Eigen::Vector3d p_vw = Eigen::Vector3d::Zero(); // vision: vision_init_cal.state_.p_vw_;

Eigen::Quaterniond q_wi_init = Eigen::Quaterniond(R_vw.transpose() * R_vc * R_ic.transpose());
Eigen::Vector3d p_wi_init = R_vw.transpose() * (p_vc + R_vc * (-R_ic.transpose() * p_ic)) + p_vw;

if (!core_logic.core_is_initialized_) {
  // Initialize the first time at which the propagation sensor occurs
  if (imu_sensor_sptr == core_logic.core_states_->propagation_sensor_) {
    core_logic.Initialize(p_wi_init, q_wi_init);
  } else {
    continue;
  }
}

Do this within lines 498-521.


With these changes, the reprojection error should come down quite a bit. Would be happy to see a video with these changes included!

If it is then still visible, it is due to the associated measurement noises of the IMU and the aruco detection.
In general, any EKF (such as MaRS) will fuse the data given its confidence in the current state and the measurement noise set in the beginning. Thus, if your detection differs from what the IMU is saying, the result will be somewhere in between, averaged so to speak.
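(In the scalar case, this weighting is explicit in the Kalman update:)

$$x^{+} = x^{-} + \frac{P}{P + R}\,(z - x^{-})$$

where $P$ is the filter's state covariance and $R$ the measurement noise; lowering $R$ pulls the estimate toward the measurement $z$.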

To force MaRS to rely more on the output of your aruco detection, you could lower the measurement noise of the pose sensor (e.g., reduce the 0.02 m and 2-degree values below):

Eigen::Matrix<double, 6, 1> pose_meas_std;
pose_meas_std << 0.02, 0.02, 0.02, 2 * (M_PI / 180), 2 * (M_PI / 180), 2 * (M_PI / 180);
pose_sensor_sptr->R_ = pose_meas_std.cwiseProduct(pose_meas_std);

However, by doing so you are limiting yourself and overruling the advantages of having a filtered output.
In that sense, there could always be an offset along your trajectory, because your measurements are not perfectly precise (neither the IMU nor the aruco detection). As such, MaRS will provide you with the most probable pose (in mean and covariance) given the input, which might differ from the current measurement.


antithing commented on May 18, 2024

main.zip

2023-09-25_18-17-27.mov

Thanks! Changes made (new main.cpp attached, as well as video). Not much improvement unfortunately.

Lowering the measurement noise does help, but not enough to give a usable output.

Is there anything else I could be doing wrong here?


golbog commented on May 18, 2024

I'm quite interested in this as I'm facing the same issue.
Have you reached any conclusions on how to address this?


antithing commented on May 18, 2024

@golbog Unfortunately not; despite the incredible support in this thread, I was unable to get this to a usable state.

Close! But not perfect.

