Testing Robust Graph Optimization With Poor A-Priori Measurement Error Covariance
Overly Confident Optimizer
As an initial test, we will utilize the unmodified Manhattan 3500 dataset (i.e., no false constraints and no additional noise added to the graph). To test the sensitivity of the optimization routine to the initial measurement error covariance, we provide the optimizer with a measurement error covariance that is much smaller ($R_t \cdot 10^{-6}$) than the true distribution from which the errors are sampled. This will provide insight into the robustness of the algorithms to the a priori measurement error covariance.
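To make the setup concrete, the sketch below shows one way to deflate every edge covariance before optimization, assuming the dataset is stored in the plain-text g2o `EDGE_SE2` format; the file names and the helper name `deflate_covariance` are illustrative, not part of the evaluated pipeline. Because the information matrix is the inverse covariance, scaling the covariance by $10^{-6}$ amounts to scaling the stored information entries by $10^{6}$.

```python
# Minimal sketch (assumes the g2o EDGE_SE2 text format): deflate every edge's
# measurement error covariance by 1e-6, i.e., inflate the stored
# upper-triangular information entries by 1e6.
COV_SCALE = 1e-6  # covariance scale used in this over-confident test

def deflate_covariance(in_path, out_path, cov_scale=COV_SCALE):
    info_scale = 1.0 / cov_scale  # information = inverse covariance
    with open(in_path) as fin, open(out_path, "w") as fout:
        for line in fin:
            tok = line.split()
            if tok and tok[0] == "EDGE_SE2":
                # tokens: EDGE_SE2 i j dx dy dtheta  I11 I12 I13 I22 I23 I33
                info = [float(v) * info_scale for v in tok[6:12]]
                tok[6:12] = [f"{v:.6f}" for v in info]
                line = " ".join(tok) + "\n"
            fout.write(line)  # vertices and other lines pass through unchanged

# deflate_covariance("manhattan3500.g2o", "manhattan3500_overconfident.g2o")
```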
First, we will test the switchable constraint methodology. This is depicted in Fig. 1, where it can be seen that the optimization did not provide an accurate pose-graph estimate.
Figure 1 :: Test Switchable Constraints When Poor Measurement Error Covariance is Provided
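For reference, a simplified numpy sketch of the switchable-constraints cost for a single loop closure is given below; the scalar switch prior and the function name are assumptions made for illustration, not the implementation tested here. It highlights why an over-confident information matrix is troublesome for this formulation: the constraint term $s^2\, r^\top \Lambda\, r$ is inflated along with the supplied information, while the switch prior that pulls $s$ toward one stays fixed, so disabling even correct loop closures can become the cheaper option.

```python
import numpy as np

def switchable_cost(residual, info, s, switch_prior_info=1.0):
    """Cost contribution of one loop-closure factor under switchable constraints.

    residual          : length-3 SE(2) residual, z - h(x)
    info              : 3x3 information matrix supplied to the optimizer
    s                 : switch variable in [0, 1]; s -> 0 deactivates the edge
    switch_prior_info : fixed information of the prior pulling s toward 1
    """
    s = float(np.clip(s, 0.0, 1.0))
    constraint_cost = (s ** 2) * (residual @ info @ residual)  # scaled by the switch
    switch_cost = switch_prior_info * (1.0 - s) ** 2           # penalty for disabling
    return constraint_cost + switch_cost
```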
Next, we can look at the ability of the max-mixtures approach to handle the specified scenario. Again, it can be seen that this optimization routine does not handle this scenario well, as depicted in the left-hand side of Fig. 2. However, this method does perform slightly better than the switchable constraints method (i.e., the structure of the graph is still present; however, the erroneous pose estimates corrupt the initial plot), which can be seen in the right-hand side of Fig. 2.
Figure 2 :: Test (unmodified) Max-Mixtures When Poor Measurement Error Covariance is Provided
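For context, max-mixtures replaces the single Gaussian on each loop-closure factor with a small Gaussian mixture and, at each linearization, keeps only the most likely component. The numpy sketch below illustrates that per-factor selection; the component list and function name are illustrative and this is not the code evaluated here. Since the candidate components are typically constructed from the supplied measurement error covariance, an over-confident a priori value mis-scales every component, which is consistent with the degraded result in Fig. 2.

```python
import numpy as np

def max_mixture_component(residual, components):
    """Pick the dominant Gaussian component for one loop-closure residual.

    residual   : length-3 SE(2) residual, z - h(x)
    components : list of (weight, information_matrix) pairs; the max-mixture
                 factor keeps only the most likely component each iteration.
    """
    best_nll, best_idx = np.inf, None
    for idx, (w, info) in enumerate(components):
        # negative log-likelihood up to a constant:
        #   0.5 r^T Lambda r - log(w) - 0.5 log|Lambda|
        nll = 0.5 * (residual @ info @ residual) - np.log(w) \
              - 0.5 * np.log(np.linalg.det(info))
        if nll < best_nll:
            best_nll, best_idx = nll, idx
    return best_idx, best_nll
```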
Finally, we can evaluate how the non-parametric clustering extension to max-mixtures performs. From Fig. 3, we can see that this optimization routine is robust to the poor initial measurement error covariance, which allows the methodology to estimate the graph accurately.
Figure 3 :: Test D.P. Max-Mixtures When Poor Measurement Error Covariance is Provided
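As a rough illustration of the non-parametric clustering idea, the sketch below fits a truncated Dirichlet-process Gaussian mixture to a batch of measurement residuals and keeps the non-empty components; their weights and covariances could then parameterize the mixture factors in place of the fixed a priori covariance. It uses scikit-learn's `BayesianGaussianMixture` purely as a stand-in for the clustering step and is not the implementation evaluated in Fig. 3.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

def cluster_residuals(residuals, max_components=10, seed=0):
    """Fit a truncated Dirichlet-process mixture to measurement residuals.

    residuals : (N, 3) array of SE(2) loop-closure residuals. The recovered
    weights/means/covariances can serve as data-driven mixture components
    instead of a single, possibly over-confident, a priori covariance.
    """
    dpgmm = BayesianGaussianMixture(
        n_components=max_components,                        # truncation level
        weight_concentration_prior_type="dirichlet_process",
        covariance_type="full",
        random_state=seed,
    ).fit(np.asarray(residuals))
    keep = dpgmm.weights_ > 1e-2                             # drop empty clusters
    return dpgmm.weights_[keep], dpgmm.means_[keep], dpgmm.covariances_[keep]
```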