trajectoryErrorMetrics

Store accuracy metrics for trajectories

Since R2024b

Description

The trajectoryErrorMetrics object stores accuracy metrics for trajectories, including both absolute trajectory error and relative trajectory error, across a sequence of poses.

Creation

Create a trajectoryErrorMetrics object by using the compareTrajectories function.
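
For example, this minimal sketch builds a synthetic straight-line ground truth trajectory and a noisy estimate of it (all pose values here are invented for illustration), then compares them:

% Sketch (synthetic data): compare a noisy estimated trajectory
% against a straight-line ground truth trajectory.
numPoses = 10;
gTruthPoses = repmat(rigidtform3d,numPoses,1);   % preallocate identity poses
estPoses    = repmat(rigidtform3d,numPoses,1);
for i = 1:numPoses
    t = [i-1 0 0];                                             % move along the X-axis
    gTruthPoses(i) = rigidtform3d(eye(3),t);
    estPoses(i)    = rigidtform3d(eye(3),t + 0.05*randn(1,3)); % noisy estimate
end
metrics = compareTrajectories(estPoses,gTruthPoses);   % trajectoryErrorMetrics object
disp(metrics.AbsoluteRMSE)   % [absRotationRMS absTranslationRMS]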

Properties

AbsoluteError

This property is read-only.

Absolute trajectory error, specified as an M-by-2 matrix of the form [absRotationError,absTranslationError], where M is the number of poses. absRotationError specifies the absolute rotation error in degrees, and absTranslationError specifies the absolute translation error. absRotationError is the magnitude of the rotation angle, in degrees, obtained by converting the rotation matrix in AbsoluteError_i to the angle-axis representation. absTranslationError is the Euclidean norm of the translation vector derived from AbsoluteError_i.

The absolute pose error measures the difference between where the robot estimates it is (estimated pose) and its actual location (ground truth) at specific times. It quantifies how accurately the estimated path aligns with the actual (ground truth) path. Mathematically, if $P_i^{est}$ is the i-th estimated pose and $P_i^{gt}$ is the corresponding ground truth pose, then the absolute pose error of the i-th estimated pose can be defined as:

$$\mathrm{AbsoluteError}_i = P_i^{est} \ominus P_i^{gt}$$

or as,

$$\mathrm{AbsoluteError}_i = P_i^{gt} \cdot \left(P_i^{est}\right)^{-1}$$
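
As a concrete illustration of this definition, the following sketch evaluates the error for one synthetic pose pair, using 4-by-4 homogeneous matrices and basic trigonometry for the angle-axis conversion (the pose values are invented for illustration):

% Sketch (synthetic poses): absolute pose error for one pose pair
theta = deg2rad(2);                                    % 2-degree heading error
Rz    = [cos(theta) -sin(theta) 0; sin(theta) cos(theta) 0; 0 0 1];
Pgt   = [eye(3) [1;0;0]; 0 0 0 1];                     % ground truth pose
Pest  = [Rz [1.1;0;0]; 0 0 0 1];                       % estimated pose
E     = Pgt/Pest;                                      % Pgt * inv(Pest)
absRotationError    = acosd((trace(E(1:3,1:3)) - 1)/2) % rotation angle, degrees
absTranslationError = norm(E(1:3,4))                   % Euclidean norm of translation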

RelativeError

This property is read-only.

Relative trajectory pose error, specified as an M-by-2 matrix of the form [relRotationError,relTranslationError], where M is the number of poses. relRotationError specifies the relative rotation error in degrees, and relTranslationError specifies the relative translation error.

relRotationError is the magnitude of the rotation angle, in degrees, obtained by converting the rotation matrix in RelativeError_{i+Δ} to the angle-axis representation.

relTranslationError is the Euclidean norm of the translation component extracted from RelativeError_{i+Δ}.

The relative pose error measures the difference in the robot's movement over a set distance or interval, rather than its exact position at a specific point in time. It measures how much the robot's estimated change in position (estimated relative pose) deviates from the actual change in position (ground truth relative pose) over this interval, which helps assess the accuracy of the robot's perception of its own motion over time.

To compute the relative pose error, compare pairs of poses $(P_i^{est}, P_{i+\Delta}^{est})$ from the estimated trajectory with the corresponding pairs $(P_i^{gt}, P_{i+\Delta}^{gt})$ from the actual (ground truth) trajectory, spaced by a set interval.

The error is the difference in movement between these pairs:

$$\mathrm{RelativeError}_{i+\Delta} = \left(P_i^{est}\right)^{-1} P_{i+\Delta}^{est} \ominus \left(P_i^{gt}\right)^{-1} P_{i+\Delta}^{gt}$$

or,

$$\mathrm{RelativeError}_{i+\Delta} = \left(P_i^{gt}\right)^{-1} \cdot P_{i+\Delta}^{gt} \cdot \left(\left(P_i^{est}\right)^{-1} \cdot P_{i+\Delta}^{est}\right)^{-1}$$
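
Similarly, this sketch evaluates the relative pose error for one synthetic pair of poses spaced by an interval Δ (pose values invented for illustration):

% Sketch (synthetic poses): relative pose error over an interval Delta
th     = deg2rad(1);
Rz     = [cos(th) -sin(th) 0; sin(th) cos(th) 0; 0 0 1];
PgtI   = [eye(3) [0;0;0]; 0 0 0 1];   % ground truth pose at i
PgtId  = [eye(3) [1;0;0]; 0 0 0 1];   % ground truth pose at i+Delta
PestI  = PgtI;                        % estimate at i matches exactly
PestId = [Rz [1.05;0;0]; 0 0 0 1];    % estimate at i+Delta drifts slightly
relGt  = PgtI\PgtId;                  % (P_i^gt)^-1  * P_{i+Delta}^gt
relEst = PestI\PestId;                % (P_i^est)^-1 * P_{i+Delta}^est
E      = relGt/relEst;                % relGt * inv(relEst)
relRotationError    = acosd((trace(E(1:3,1:3)) - 1)/2) % rotation angle, degrees
relTranslationError = norm(E(1:3,4))                   % Euclidean norm of translation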

AbsoluteRMSE

This property is read-only.

Root mean square (RMS) of the absolute trajectory error, specified as a two-element vector of the form [absRotationRMS,absTranslationRMS]. The RMS values are calculated as:

  • absRotationRMS = rms(absRotationError)

  • absTranslationRMS = rms(absTranslationError)

The absolute pose error can be summarized over the entire trajectory by taking the root mean square of the individual errors.

$$\mathrm{AbsoluteRMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left\|\mathrm{AbsoluteError}_i\right\|^{2}}$$

RelativeRMSE

This property is read-only.

Root mean square (RMS) of the relative trajectory error, specified as a two-element vector of the form [relRotationRMS,relTranslationRMS]. The RMS values are calculated as:

  • relRotationRMS = rms(relRotationError)

  • relTranslationRMS = rms(relTranslationError)

The relative pose error can be summarized over the entire trajectory by taking the root mean square of the individual errors.

$$\mathrm{RelativeRMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left\|\mathrm{RelativeError}_i\right\|^{2}}$$
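
Assuming metrics is a trajectoryErrorMetrics object, the two RMS summaries can be reproduced by applying the column-wise rms function to the per-pose error matrices stored in the AbsoluteError and RelativeError properties:

% Sketch: reproduce the RMS summaries from the per-pose errors
absRMS = rms(metrics.AbsoluteError);   % matches metrics.AbsoluteRMSE
relRMS = rms(metrics.RelativeError);   % matches metrics.RelativeRMSE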

Object Functions

plot - Plot aligned estimated trajectory and error metrics

Examples

Perform monocular visual simultaneous localization and mapping (vSLAM) using the data from the TUM RGB-D Benchmark [1]. You can download the data to a temporary directory by using a web browser or by running this code:

baseDownloadURL = "https://cvg.cit.tum.de/rgbd/dataset/freiburg3/rgbd_dataset_freiburg3_long_office_household.tgz"; 
dataFolder = fullfile(tempdir,"tum_rgbd_dataset",filesep); 
options = weboptions(Timeout=Inf);
tgzFileName = dataFolder+"fr3_office.tgz";
folderExists = exist(dataFolder,"dir");

% Create a folder in a temporary directory to save the downloaded file
if ~folderExists  
    mkdir(dataFolder) 
    disp("Downloading fr3_office.tgz (1.38 GB). This download can take a few minutes.") 
    websave(tgzFileName,baseDownloadURL,options); 
    
    % Extract contents of the downloaded file
    disp("Extracting fr3_office.tgz (1.38 GB) ...") 
    untar(tgzFileName,dataFolder); 
end

Create an imageDatastore object to store all the RGB images.

imageFolder = dataFolder+"rgbd_dataset_freiburg3_long_office_household/rgb/";
imds = imageDatastore(imageFolder);

Specify your camera intrinsic parameters, and use them to create a monocular visual SLAM object.

intrinsics = cameraIntrinsics([535.4 539.2],[320.1 247.6],[480 640]);
vslam = monovslam(intrinsics,TrackFeatureRange=[30,120]);

Process each image frame, and visualize the camera poses and 3-D map points. Note that the monovslam object runs several parts of the algorithm on separate threads, which can introduce latency in the processing of image frames added by using the addFrame function.

for i = 1:numel(imds.Files)
    addFrame(vslam,readimage(imds,i))

    if hasNewKeyFrame(vslam)
        % Display 3-D map points and camera trajectory
        plot(vslam);
    end

    % Get current status of system
    status = checkStatus(vslam);
end 

[Figure: 3-D map points and estimated camera trajectory]

Plot intermediate results and wait until all images are processed.

while ~isDone(vslam)
    if hasNewKeyFrame(vslam)
        plot(vslam);
    end
end

[Figure: 3-D map points and estimated camera trajectory]

After all the images are processed, you can collect the final 3-D map points and camera poses for further analysis.

xyzPoints = mapPoints(vslam);
[camPoses,addedFramesIdx] = poses(vslam);

% Reset the system
reset(vslam)

Compare the estimated camera trajectory with the ground truth to evaluate the accuracy.

% Load ground truth
gTruthData = load("orbslamGroundTruth.mat");
gTruth     = gTruthData.gTruth;

% Evaluate tracking accuracy
metrics = compareTrajectories(camPoses, gTruth(addedFramesIdx), AlignmentType="similarity");
disp(['Absolute RMSE for key frame location (m): ', num2str(metrics.AbsoluteRMSE(2))]);
Absolute RMSE for key frame location (m): 0.093645
% Plot the absolute translation error at each key frame
figure
ax = plot(metrics, "absolute-translation");
view(ax, [2.70 -49.20]); 

[Figure: Absolute translation error at each key frame, showing the estimated and ground truth trajectories]

References

[1] Sturm, Jürgen, Nikolas Engelhard, Felix Endres, Wolfram Burgard, and Daniel Cremers. 2012. "A Benchmark for the Evaluation of RGB-D SLAM Systems." In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 573-580.

[2] Zhang, Zichao, and Davide Scaramuzza. 2018. "A Tutorial on Quantitative Trajectory Evaluation for Visual (-Inertial) Odometry." In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 7244-7251.

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

Version History

Introduced in R2024b