marineroboticsgroup / dcsam
Factored inference for discrete-continuous smoothing and mapping.
Home Page: https://arxiv.org/abs/2204.11936
License: MIT License
The current (naive) update implementation requires ~O(M) measurement updates. Simply caching the indices of DCFactors within the discrete/continuous graphs (or within iSAM2, in the continuous case) would be a huge performance boost: closer to ~O(K) with K << M, where K is the number of DCFactors only (vs. M total factors).
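The caching idea above can be sketched as follows. This is a minimal illustration with hypothetical names, not the dcsam API: we record DC-factor indices at insertion time so an update visits only the K cached factors instead of scanning all M.

```python
# Sketch of index caching for DC factors (illustrative names only).
class FactorGraphSketch:
    def __init__(self):
        self.factors = []      # all M factors
        self.dc_indices = []   # cached indices of DC factors only

    def add(self, factor, is_dc=False):
        self.factors.append(factor)
        if is_dc:
            # Remember where the DC factor lives so we never rescan.
            self.dc_indices.append(len(self.factors) - 1)

    def update_dc(self, fn):
        # O(K) instead of O(M): visit only the cached DC factors.
        for i in self.dc_indices:
            self.factors[i] = fn(self.factors[i])

g = FactorGraphSketch()
for k in range(100):
    g.add(("odom", k))                    # plain continuous factor
g.add(("dc", 0), is_dc=True)              # one DC factor among 101 total
g.update_dc(lambda f: (f[0], f[1] + 1))   # touches 1 factor, not 101
```

The same bookkeeping would apply to the iSAM2 case: the cache maps each DC factor to its slot in the incremental solver, so measurement updates skip the non-DC factors entirely.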
After adding uniform priors on inactive discrete keys in this commit, KITTI results are consistently worse than the old ones on seq 5.
evo APE results (APE w.r.t. translation part (m), with SE(3) Umeyama alignment, aligned poses: 30), before vs. after the commit:

            before        after
max       7.931835    14.252595
mean      2.841607     5.990570
median    2.044358     5.300368
min       0.249557     0.301529
rmse      3.547685     7.317135
sse    1736.877678  7388.584812
std       2.123991     4.201612
Printing out the decision tree factors with minimum error gives something like this:
Factor x63 l17
DecisionTreeFactor:
Potentials:
Cardinalities: {c17:1, }
Leaf 1
Factor x61 l18
DecisionTreeFactor:
Potentials:
Cardinalities: {c16:1, c17:1, c18:1, c19:1, }
Leaf 1
Hey guys,
I think this is meant to be if(!normalized)
https://github.com/MarineRoboticsGroup/dcsam/blob/main/include/dcsam/DCMixtureFactor.h#L68
https://github.com/MarineRoboticsGroup/dcsam/blob/main/include/dcsam/DCMaxMixtureFactor.h#L78
Best,
Pat
This is currently a WIP on the docs branch.
Per #22, this is particularly relevant when a user does not need to do a batch solve of the discrete variables. Until we have proper incremental (hybrid) solves, users should be able to specify that only fresh continuous assignments are needed (e.g. when only odometry measurements are added to the factor graph). The current default behavior on main performs a heuristic check for odometry-like factors and skips the discrete solve when only such factors are added, but this could lead to odd behavior for certain graph inputs.
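One way such a heuristic could look, as a sketch (the factor representation and function names here are illustrative assumptions, not the actual dcsam implementation): treat a factor as "odometry-like" if it connects exactly two consecutive continuous keys and no discrete keys, and skip the discrete solve only when every newly added factor passes.

```python
# Hypothetical odometry-only heuristic; a factor is modeled as a pair
# (continuous_keys, discrete_keys) of key tuples.
def is_odometry_like(factor):
    ckeys, dkeys = factor
    return (len(dkeys) == 0                 # no discrete variables involved
            and len(ckeys) == 2             # binary factor
            and abs(ckeys[1] - ckeys[0]) == 1)  # consecutive poses

def needs_discrete_solve(new_factors):
    return not all(is_odometry_like(f) for f in new_factors)

# Only a between-pose factor x3 -> x4 was added: skip the discrete solve.
print(needs_discrete_solve([((3, 4), ())]))     # False
# A loop closure x3 -> x40 with a discrete association variable: solve.
print(needs_discrete_solve([((3, 40), (0,))]))  # True
```

The odd-behavior caveat above is visible even in this sketch: a graph whose keys are not numbered consecutively, or a sensor whose binary factors happen to link consecutive poses without being odometry, would be misclassified.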
Jenkins CI is currently down due to a power cycle at CSAIL over the past few weeks. We need to either reboot mrg-beast at CSAIL, or (probably the more sustainable long-term solution) migrate CI fully to GitHub.
When we ran sparse one-class (car) KITTI SLAM with max-mixtures for data association using the following parameters, we encountered a segfault around the first big loop closure (in the middle of the run).
<!-- Arguments -->
<arg name="kitti_path" default="/media/ziqi/LENOVO_USB_HDD/data/kitti/05/"/>
<!-- Data association algorithm {0: ML; 1: MM; 2: SM; 3: EM} -->
<arg name="DA_type" default="1"/>
<arg name="noise_gain" default="0"/>
<arg name="misclassification_rate" default="0"/>
...
<!-- VISO2 parameters -->
<param name="ref_frame_change_method" value="0"/> <!-- choose from 0,1,2, try 2 first if 0 fails-->
<param name="viso2_small_motion_threshold" value="5.0"/>
<param name="viso2_inlier_threshold" value="90"/> <!-- must be integer-->
<!-- Block Matching parameters (for point cloud computation) -->
<param name="minDisparities" value="0"/>
<param name="numDisparities" value="8"/> <!-- will be multiplied by 16-->
<param name="blockSize" value="5"/>
...
<!-- Keyframe time threshold [nsec] -->
<param name="kf_threshold" value="2e9"/>
<!-- Noise models -->
<rosparam param="odom_noise_model"> [0.02, 0.02, 0.02, 0.02, 0.02, 0.02] </rosparam>
<rosparam param="det_noise_model"> [0.2, 0.2, 1.0] </rosparam>
<!-- Null-Hypo weight -->
<param name="nh_weight" value="0.1" />
<!-- Object measurement gating params -->
<param name="search_radius" value="20.0"/>
<param name="maha_dist_thresh" value="4.0"/>
...
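For context on the gating parameters above (search_radius = 20.0, maha_dist_thresh = 4.0), a minimal sketch of what such a two-stage measurement gate typically computes — the function name and exact rule are assumptions, not the actual implementation:

```python
import numpy as np

SEARCH_RADIUS = 20.0   # coarse Euclidean gate [m]
MAHA_THRESH = 4.0      # Mahalanobis distance gate

def passes_gate(landmark_xy, predicted_xy, residual, S):
    """Coarse Euclidean gate first, then a Mahalanobis-distance gate."""
    delta = np.asarray(landmark_xy) - np.asarray(predicted_xy)
    if np.linalg.norm(delta) > SEARCH_RADIUS:
        return False
    # Squared Mahalanobis distance r^T S^{-1} r for innovation covariance S.
    d2 = residual @ np.linalg.solve(S, residual)
    return np.sqrt(d2) <= MAHA_THRESH

S = np.diag([0.2**2, 0.2**2])   # toy innovation covariance
print(passes_gate((1.0, 1.0), (1.2, 0.9), np.array([0.2, -0.1]), S))
```

Candidate associations failing either gate would simply not generate mixture components, which keeps the discrete factor cardinalities bounded.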
The gdb back-trace log is pasted here:
0x00007ffff5b0d07a in gtsam::EliminateDiscrete(gtsam::DiscreteFactorGraph const&, gtsam::Ordering const&) ()
from /usr/local/lib/libgtsam.so.4
(gdb) backtrace
#0 0x00007ffff5b0d07a in gtsam::EliminateDiscrete(gtsam::DiscreteFactorGraph const&, gtsam::Ordering const&) ()
at /usr/local/lib/libgtsam.so.4
#1 0x0000555555789d3a in gtsam::EliminationTraits<gtsam::DiscreteFactorGraph>::DefaultEliminate(gtsam::DiscreteFactorGraph const&, gtsam::Ordering const&) ()
#2 0x000055555578ed69 in boost::detail::function::function_invoker2<std::pair<boost::shared_ptr<gtsam::DiscreteConditional>, boost::shared_ptr<gtsam::DiscreteFactor> > (*)(gtsam::DiscreteFactorGraph const&, gtsam::Ordering const&), std::pair<boost::shared_ptr<gtsam::DiscreteConditional>, boost::shared_ptr<gtsam::DiscreteFactor> >, gtsam::DiscreteFactorGraph const&, gtsam::Ordering const&>::invoke(boost::detail::function::function_buffer&, gtsam::DiscreteFactorGraph const&, gtsam::Ordering const&) ()
#3 0x00007ffff5b15e46 in gtsam::EliminationData<gtsam::EliminatableClusterTree<gtsam::DiscreteBayesTree, gtsam::DiscreteFactorGraph> >::EliminationPostOrderVisitor::operator()(boost::shared_ptr<gtsam::ClusterTree<gtsam::DiscreteFactorGraph>::Cluster> const&, gtsam::EliminationData<gtsam::EliminatableClusterTree<gtsam::DiscreteBayesTree, gtsam::DiscreteFactorGraph> >&) ()
After adding some print statements, we found that this is related to the getMarginals function in DCSAM, and we don't see this error when using maximum-likelihood-based data association in max-mixtures. We temporarily worked around it in max-mixtures by querying marginals only for the continuous graph (rather than the full hybrid marginals, as we did originally).
GTSAM @ caa14bc does not seem to have the Potentials.h file that we want to include:
/home/tonio/repos/idbt/include/idbt/max_product.h:18:10: fatal error: gtsam/discrete/Potentials.h: No such file or directory
#include <gtsam/discrete/Potentials.h>
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
The Doxygen setup right now is not great (not well formatted or linked with the GTSAM docs).
If we can install GTSAM in the Docker image that builds the documentation, Doxygen should be able to resolve inherited types from GTSAM.
We might also consider tweaking the Doxygen config to make the documentation generally a little nicer.
Looks like you accidentally cast your index to double:
https://github.com/MarineRoboticsGroup/dcsam/blob/main/include/dcsam/DCMaxMixtureFactor.h#L120
https://github.com/MarineRoboticsGroup/dcsam/blob/main/include/dcsam/DCMaxMixtureFactor.h#L127
https://github.com/MarineRoboticsGroup/dcsam/blob/main/include/dcsam/DCMaxMixtureFactor.h#L135
Assuming that's not on purpose?
Cheers,
Pat
The DCFactor::evalProbs(...) function implements "exp-normalization," where we attempt to normalize a set of (negative) log probabilities as exp(log p_i) / (sum_j exp(log p_j)). For exceptionally small values of log p_i, e.g. -10^6, this expression is susceptible to underflow. This can occur, for example, if the "continuous part" of a DCFactor for a particular discrete assignment has large error, even independent of the "discrete part." Rather than computing this expression naively as written, we should "shift" the exponents prior to normalizing to avoid numerical issues.
cc: @kurransingh
It appears that there is a minor bug in the computation of a DCFactor normalization constant.
dcsam/include/dcsam/DCFactor.h
Lines 228 to 230 in 92af419
Specifically, the sign of the (d/2) log(2 * pi) term. It would be worth double-checking my math, but I derive that that term is positive.
While this does not affect the maximal component, it would cause issues if one assumes that errors are properly normalized.
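As a quick numerical sanity check of the sign claim (a sketch independent of the dcsam code): for a d-dimensional Gaussian N(r; 0, Sigma), the negative log-density is 0.5 r^T Sigma^{-1} r + 0.5 log|Sigma| + (d/2) log(2 pi), so the (d/2) log(2 pi) term enters the error with a positive sign.

```python
import numpy as np

d = 3
Sigma = np.diag([0.5, 1.0, 2.0])
r = np.array([0.3, -0.1, 0.7])

# Negative log-density with the (d/2) log(2*pi) term taken POSITIVE.
neg_log_density = (0.5 * r @ np.linalg.solve(Sigma, r)
                   + 0.5 * np.log(np.linalg.det(Sigma))
                   + 0.5 * d * np.log(2.0 * np.pi))

# Evaluate the Gaussian density directly and compare.
density = (np.exp(-0.5 * r @ np.linalg.solve(Sigma, r))
           / np.sqrt((2.0 * np.pi) ** d * np.linalg.det(Sigma)))
```

The two agree only with the positive sign; flipping it introduces an error of d log(2 pi), which is exactly the kind of offset that is invisible when comparing components (it cancels in the argmax) but wrong when the normalized errors themselves are consumed.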