Sensor Fusion Using Synthetic Radar and Vision Data
This example shows how to generate a scenario, simulate sensor detections, and use sensor fusion to track simulated vehicles. The main benefit of using scenario generation and sensor simulation over sensor recording is the ability to create rare and potentially dangerous events and test the vehicle algorithms with them.
This example covers the entire programmatic workflow for generating synthetic data. To generate synthetic data interactively instead, use the Driving Scenario Designer app. For an example, see Create Driving Scenario Interactively and Generate Synthetic Sensor Data.
Generate the Scenario
Scenario generation comprises generating a road network, defining vehicles that move on the roads, and moving the vehicles.
In this example, you test the ability of the sensor fusion to track a vehicle that is passing on the left of the ego vehicle. The scenario simulates a highway setting, and additional vehicles are in front of and behind the ego vehicle.
% Define an empty scenario.
scenario = drivingScenario;
scenario.SampleTime = 0.01;
Add a stretch of 500 meters of typical highway road with two lanes. The road is defined using a set of points, where each point defines the center of the road in 3-D space. Add a Jersey barrier to the right edge of the road.
roadCenters = [0 0; 50 0; 100 0; 250 20; 500 40];
mainRoad = road(scenario, roadCenters, 'lanes',lanespec(2));
barrier(scenario,mainRoad);
Create the ego vehicle and three cars around it: one that overtakes the ego vehicle and passes it on the left, one that drives right in front of the ego vehicle, and one that drives right behind the ego vehicle. All the cars follow the trajectory defined by the road waypoints by using the trajectory driving policy. The passing car starts on the right lane, moves to the left lane to pass, and returns to the right lane.
% Create the ego vehicle that travels at 25 m/s along the road. Place the
% vehicle on the right lane by subtracting off half a lane width (1.8 m)
% from the centerline of the road.
egoCar = vehicle(scenario, 'ClassID', 1);
trajectory(egoCar, roadCenters(2:end,:) - [0 1.8], 25); % On right lane

% Add a car in front of the ego vehicle.
leadCar = vehicle(scenario, 'ClassID', 1);
trajectory(leadCar, [70 0; roadCenters(3:end,:)] - [0 1.8], 25); % On right lane

% Add a car that travels at 35 m/s along the road and passes the ego vehicle.
passingCar = vehicle(scenario, 'ClassID', 1);
waypoints = [0 -1.8; 50 1.8; 100 1.8; 250 21.8; 400 32.2; 500 38.2];
trajectory(passingCar, waypoints, 35);

% Add a car behind the ego vehicle.
chaseCar = vehicle(scenario, 'ClassID', 1);
trajectory(chaseCar, [25 0; roadCenters(2:end,:)] - [0 1.8], 25); % On right lane
Define Radar and Vision Sensors
In this example, you simulate an ego vehicle that has six radar sensors and two vision sensors covering a 360-degree field of view. The sensor coverage areas overlap in places and leave gaps in others. The ego vehicle is equipped with a long-range radar sensor and a vision sensor on both the front and the back of the vehicle. Each side of the vehicle has two short-range radar sensors, each covering 90 degrees. One sensor on each side covers from the middle of the vehicle to the back; the other covers from the middle of the vehicle forward. The figure in the next section shows the coverage.
sensors = cell(8,1);

% Front-facing long-range radar sensor at the center of the front bumper of the car.
sensors{1} = drivingRadarDataGenerator('SensorIndex', 1, 'RangeLimits', [0 174], ...
    'MountingLocation', [egoCar.Wheelbase + egoCar.FrontOverhang, 0, 0.2], 'FieldOfView', [20, 5]);

% Rear-facing long-range radar sensor at the center of the rear bumper of the car.
sensors{2} = drivingRadarDataGenerator('SensorIndex', 2, 'MountingAngles', [180 0 0], ...
    'MountingLocation', [-egoCar.RearOverhang, 0, 0.2], 'RangeLimits', [0 30], 'FieldOfView', [20, 5]);

% Rear-left-facing short-range radar sensor at the left rear wheel well of the car.
sensors{3} = drivingRadarDataGenerator('SensorIndex', 3, 'MountingAngles', [120 0 0], ...
    'MountingLocation', [0, egoCar.Width/2, 0.2], 'RangeLimits', [0 30], 'ReferenceRange', 50, ...
    'FieldOfView', [90, 5], 'AzimuthResolution', 10, 'RangeResolution', 1.25);

% Rear-right-facing short-range radar sensor at the right rear wheel well of the car.
sensors{4} = drivingRadarDataGenerator('SensorIndex', 4, 'MountingAngles', [-120 0 0], ...
    'MountingLocation', [0, -egoCar.Width/2, 0.2], 'RangeLimits', [0 30], 'ReferenceRange', 50, ...
    'FieldOfView', [90, 5], 'AzimuthResolution', 10, 'RangeResolution', 1.25);

% Front-left-facing short-range radar sensor at the left front wheel well of the car.
sensors{5} = drivingRadarDataGenerator('SensorIndex', 5, 'MountingAngles', [60 0 0], ...
    'MountingLocation', [egoCar.Wheelbase, egoCar.Width/2, 0.2], 'RangeLimits', [0 30], ...
    'ReferenceRange', 50, 'FieldOfView', [90, 5], 'AzimuthResolution', 10, ...
    'RangeResolution', 1.25);

% Front-right-facing short-range radar sensor at the right front wheel well of the car.
sensors{6} = drivingRadarDataGenerator('SensorIndex', 6, 'MountingAngles', [-60 0 0], ...
    'MountingLocation', [egoCar.Wheelbase, -egoCar.Width/2, 0.2], 'RangeLimits', [0 30], ...
    'ReferenceRange', 50, 'FieldOfView', [90, 5], 'AzimuthResolution', 10, ...
    'RangeResolution', 1.25);

% Front-facing camera located at the front windshield.
sensors{7} = visionDetectionGenerator('SensorIndex', 7, 'FalsePositivesPerImage', 0.1, ...
    'SensorLocation', [0.75*egoCar.Wheelbase 0], 'Height', 1.1);

% Rear-facing camera located at the rear windshield.
sensors{8} = visionDetectionGenerator('SensorIndex', 8, 'FalsePositivesPerImage', 0.1, ...
    'SensorLocation', [0.2*egoCar.Wheelbase 0], 'Height', 1.1, 'Yaw', 180);

% Register actor profiles with the sensors.
profiles = actorProfiles(scenario);
for m = 1:numel(sensors)
    if isa(sensors{m},'drivingRadarDataGenerator')
        sensors{m}.Profiles = profiles;
    else
        sensors{m}.ActorProfiles = profiles;
    end
end
Create a Tracker
Create a multiObjectTracker to track the vehicles that are close to the ego vehicle. The tracker uses the initSimDemoFilter supporting function to initialize a constant velocity linear Kalman filter that works with position and velocity.
Tracking is done in 2-D. Although the sensors return measurements in 3-D, the motion itself is confined to the horizontal plane, so there is no need to track the height.
tracker = multiObjectTracker('FilterInitializationFcn', @initSimDemoFilter, ...
    'AssignmentThreshold', 30, 'ConfirmationThreshold', [4 5]);
positionSelector = [1 0 0 0; 0 0 1 0]; % Position selector
velocitySelector = [0 1 0 0; 0 0 0 1]; % Velocity selector

% Create the display and return a handle to the bird's-eye plot
BEP = createDemoDisplay(egoCar, sensors);
Simulate the Scenario
The following loop moves the vehicles, calls the sensor simulation, and performs the tracking.
Note that the scenario generation and sensor simulation can have different time steps. Specifying different time steps for the scenario and the sensors enables you to decouple the scenario simulation from the sensor simulation. This is useful for modeling actor motion with high accuracy independently from the sensor's measurement rate.
Decoupling also helps when the sensors have different update rates. Suppose one sensor provides updates every 20 milliseconds and another sensor provides updates every 50 milliseconds. You can specify the scenario with an update rate of 10 milliseconds, and each sensor then provides its updates at the correct times.
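As a minimal sketch of this decoupling (not part of this example; the sensor indices are illustrative, and the UpdateRate and drivingScenario SampleTime properties are left at their defaults elsewhere in this example):

% A 10 ms scenario step driving a radar that updates every 20 ms and a
% camera that updates every 50 ms.
s = drivingScenario('SampleTime', 0.01);                 % 10 ms scenario step
r = drivingRadarDataGenerator('SensorIndex', 1, ...
    'UpdateRate', 50);                                   % 50 Hz -> 20 ms updates
c = visionDetectionGenerator('SensorIndex', 2, ...
    'UpdateInterval', 0.05);                             % 50 ms updates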
In this example, the scenario generation has a time step of 0.01 second, while the sensors detect every 0.1 second. The sensors return a logical flag, isValidTime, that is true if the sensors generated detections. This flag is used to call the tracker only when there are detections.
Another important note is that the sensors can simulate multiple detections per target, in particular when the targets are very close to the radar sensors. Because the tracker assumes a single detection per target from each sensor, you must cluster the detections before the tracker processes them. This is done by setting TargetReportFormat to 'Clustered detections', which is the default. The sensor model may also output raw detection data, or track updates using an internal tracker.
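For reference, a hedged sketch of the alternative configurations (this radar is not used in the example and its index is illustrative):

% A radar that reports raw, unclustered detections instead of clusters.
rawRadar = drivingRadarDataGenerator('SensorIndex', 1, ...
    'TargetReportFormat', 'Detections');
% Setting 'TargetReportFormat' to 'Tracks' would instead return track
% updates from the sensor's internal tracker.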
toSnap = true;
while advance(scenario) && ishghandle(BEP.Parent)
    % Get the scenario time
    time = scenario.SimulationTime;

    % Get the position of the other vehicles in ego vehicle coordinates
    ta = targetPoses(egoCar);

    % Simulate the sensors
    detectionClusters = {};
    isValidTime = false(1,8);
    for i = 1:8
        [sensorDets,numValidDets,isValidTime(i)] = sensors{i}(ta, time);
        if numValidDets
            for j = 1:numValidDets
                % Vision detections do not report SNR. The tracker requires
                % that they have the same object attributes as the radar
                % detections. This adds the SNR object attribute to vision
                % detections and sets it to a NaN.
                if ~isfield(sensorDets{j}.ObjectAttributes{1}, 'SNR')
                    sensorDets{j}.ObjectAttributes{1}.SNR = NaN;
                end

                % Remove the Z-component of measured position and velocity
                % from the Measurement and MeasurementNoise fields
                sensorDets{j}.Measurement = sensorDets{j}.Measurement([1 2 4 5]);
                sensorDets{j}.MeasurementNoise = sensorDets{j}.MeasurementNoise([1 2 4 5],[1 2 4 5]);
            end
            detectionClusters = [detectionClusters; sensorDets]; %#ok<AGROW>
        end
    end

    % Update the tracker if there are new detections
    if any(isValidTime)
        if isa(sensors{1},'drivingRadarDataGenerator')
            vehicleLength = sensors{1}.Profiles.Length;
        else
            vehicleLength = sensors{1}.ActorProfiles.Length;
        end
        confirmedTracks = updateTracks(tracker, detectionClusters, time);

        % Update bird's-eye plot
        updateBEP(BEP, egoCar, detectionClusters, confirmedTracks, positionSelector, velocitySelector);
    end

    % Snap a figure for the document when the car passes the ego vehicle
    if ta(1).Position(1) > 0 && toSnap
        toSnap = false;
        snapnow
    end
end
Summary
This example shows how to generate a scenario, simulate sensor detections, and use these detections to track moving vehicles around the ego vehicle.
You can try to modify the scenario road, or add or remove vehicles. You can also try to add, remove, or modify the sensors on the ego vehicle, or modify the tracker parameters.
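For instance, a hypothetical variation of the tracker parameters (the values here are illustrative, not recommendations) that confirms tracks after 3 detections in 4 updates and uses a tighter assignment gate:

tracker = multiObjectTracker('FilterInitializationFcn', @initSimDemoFilter, ...
    'AssignmentThreshold', 20, ...    % tighter gating than the example's 30
    'ConfirmationThreshold', [3 4]);  % confirm after 3 hits in 4 updates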
Supporting Functions
initSimDemoFilter
This function initializes a constant velocity filter based on a detection.
function filter = initSimDemoFilter(detection)
% Use a 2-D constant velocity model to initialize a trackingKF filter.
% The state vector is [x;vx;y;vy]
% The detection measurement vector is [x;y;vx;vy]
% As a result, the measurement model is H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1]
H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1];
filter = trackingKF('MotionModel', '2D Constant Velocity', ...
    'State', H' * detection.Measurement, ...
    'MeasurementModel', H, ...
    'StateCovariance', H' * detection.MeasurementNoise * H, ...
    'MeasurementNoise', detection.MeasurementNoise);
end
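As a quick sanity check, a hypothetical standalone call, not part of the simulation; the measurement values and noise below are made up:

% Build a detection by hand: time 0, measurement [x;y;vx;vy], unit noise.
det = objectDetection(0, [10; 3; 25; 0], 'MeasurementNoise', eye(4));
kf = initSimDemoFilter(det);
[xPred, PPred] = predict(kf, 0.1);  % predict 0.1 s ahead; state is [x;vx;y;vy]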
createDemoDisplay
This function creates a three-panel display:
Top-left corner of display: A top view that follows the ego vehicle.
Bottom-left corner of display: A chase-camera view that follows the ego vehicle.
Right half of display: A birdsEyePlot display.
function BEP = createDemoDisplay(egoCar, sensors)
    % Make a figure
    hFigure = figure('Position', [0, 0, 1200, 640], 'Name', 'Sensor Fusion with Synthetic Data Example');
    movegui(hFigure, [0 -1]); % Moves the figure to the left and a little down from the top

    % Add a car plot that follows the ego vehicle from behind
    hCarViewPanel = uipanel(hFigure, 'Position', [0 0 0.5 0.5], 'Title', 'Chase Camera View');
    hCarPlot = axes(hCarViewPanel);
    chasePlot(egoCar, 'Parent', hCarPlot);

    % Add a car plot that follows the ego vehicle from a top view
    hTopViewPanel = uipanel(hFigure, 'Position', [0 0.5 0.5 0.5], 'Title', 'Top View');
    hCarPlot = axes(hTopViewPanel);
    chasePlot(egoCar, 'Parent', hCarPlot, 'ViewHeight', 130, 'ViewLocation', [0 0], 'ViewPitch', 90);

    % Add a panel for a bird's-eye plot
    hBEVPanel = uipanel(hFigure, 'Position', [0.5 0 0.5 1], 'Title', 'Bird''s-Eye Plot');

    % Create a bird's-eye plot for the ego vehicle and sensor coverage
    hBEVPlot = axes(hBEVPanel);
    frontBackLim = 60;
    BEP = birdsEyePlot('Parent', hBEVPlot, 'Xlimits', [-frontBackLim frontBackLim], 'Ylimits', [-35 35]);

    % Plot the coverage areas for radars
    for i = 1:6
        cap = coverageAreaPlotter(BEP,'FaceColor','red','EdgeColor','red');
        if isa(sensors{i},'drivingRadarDataGenerator')
            plotCoverageArea(cap, sensors{i}.MountingLocation(1:2),...
                sensors{i}.RangeLimits(2), sensors{i}.MountingAngles(1), sensors{i}.FieldOfView(1));
        else
            plotCoverageArea(cap, sensors{i}.SensorLocation,...
                sensors{i}.MaxRange, sensors{i}.Yaw, sensors{i}.FieldOfView(1));
        end
    end

    % Plot the coverage areas for vision sensors
    for i = 7:8
        cap = coverageAreaPlotter(BEP,'FaceColor','blue','EdgeColor','blue');
        if isa(sensors{i},'drivingRadarDataGenerator')
            plotCoverageArea(cap, sensors{i}.MountingLocation(1:2),...
                sensors{i}.RangeLimits(2), sensors{i}.MountingAngles(1), 45);
        else
            plotCoverageArea(cap, sensors{i}.SensorLocation,...
                sensors{i}.MaxRange, sensors{i}.Yaw, 45);
        end
    end

    % Create a vision detection plotter and store it for later updates
    detectionPlotter(BEP, 'DisplayName','vision', 'MarkerEdgeColor','blue', 'Marker','^');

    % Combine all radar detections into one entry and store it for later updates
    detectionPlotter(BEP, 'DisplayName','radar', 'MarkerEdgeColor','red');

    % Add road borders to the plot
    laneMarkingPlotter(BEP, 'DisplayName','lane markings');

    % Add the tracks to the bird's-eye plot. Show the last 10 track updates.
    trackPlotter(BEP, 'DisplayName','track', 'HistoryDepth',10);

    axis(BEP.Parent, 'equal');
    xlim(BEP.Parent, [-frontBackLim frontBackLim]);
    ylim(BEP.Parent, [-40 40]);

    % Add an outline plotter for ground truth
    outlinePlotter(BEP, 'Tag', 'Ground truth');
end
updateBEP
This function updates the bird's-eye plot with road boundaries, detections, and tracks.
function updateBEP(BEP, egoCar, detections, confirmedTracks, psel, vsel)
    % Update road boundaries and their display
    [lmv, lmf] = laneMarkingVertices(egoCar);
    plotLaneMarking(findPlotter(BEP,'DisplayName','lane markings'),lmv,lmf);

    % Update ground truth data
    [position, yaw, length, width, originOffset, color] = targetOutlines(egoCar);
    plotOutline(findPlotter(BEP,'Tag','Ground truth'), position, yaw, length, width, 'OriginOffset', originOffset, 'Color', color);

    % Update barrier data
    [bPosition,bYaw,bLength,bWidth,bOriginOffset,bColor,numBarrierSegments] = targetOutlines(egoCar, 'Barriers');
    plotBarrierOutline(findPlotter(BEP,'Tag','Ground truth'),numBarrierSegments,bPosition,bYaw,bLength,bWidth,...
        'OriginOffset',bOriginOffset,'Color',bColor);

    % Prepare and update detections display
    N = numel(detections);
    detPos = zeros(N,2);
    isRadar = true(N,1);
    for i = 1:N
        detPos(i,:) = detections{i}.Measurement(1:2)';
        if detections{i}.SensorIndex > 6 % Vision detections
            isRadar(i) = false;
        end
    end
    plotDetection(findPlotter(BEP,'DisplayName','vision'), detPos(~isRadar,:));
    plotDetection(findPlotter(BEP,'DisplayName','radar'), detPos(isRadar,:));

    % Remove all object tracks that are unidentified by the vision detection
    % generators before updating the tracks display. These have the ObjectClassID
    % parameter value as 0 and include objects such as barriers.
    isNotBarrier = arrayfun(@(t)t.ObjectClassID,confirmedTracks)>0;
    confirmedTracks = confirmedTracks(isNotBarrier);

    % Prepare and update tracks display
    trackIDs = {confirmedTracks.TrackID};
    labels = cellfun(@num2str, trackIDs, 'UniformOutput', false);
    [tracksPos, tracksCov] = getTrackPositions(confirmedTracks, psel);
    tracksVel = getTrackVelocities(confirmedTracks, vsel);
    plotTrack(findPlotter(BEP,'DisplayName','track'), tracksPos, tracksVel, tracksCov, labels);
end
See Also
Apps
Driving Scenario Designer
Objects
birdsEyePlot | drivingRadarDataGenerator | visionDetectionGenerator | multiObjectTracker | drivingScenario