Automated Driving Toolbox

Design, simulate, and test ADAS and autonomous driving systems

Automated Driving Toolbox™ provides algorithms and tools for designing, simulating, and testing ADAS and autonomous driving systems. You can design and test vision and lidar perception systems, as well as sensor fusion, path planning, and vehicle controllers. Visualization tools include a bird’s-eye-view plot and scope for sensor coverage, detections and tracks, and displays for video, lidar, and maps. The toolbox lets you import and work with HERE HD Live Map data and OpenDRIVE® road networks.
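
As a minimal sketch of the bird's-eye-view plot, the following assumes a forward-facing sensor mounted 1.5 m ahead of the vehicle origin with a 40 m range and 45-degree field of view, plus a single detection; all numeric values are illustrative.

    % Plot a sensor coverage area and one detection in vehicle coordinates (meters).
    bep = birdsEyePlot('XLim',[0 50],'YLim',[-20 20]);
    caPlotter = coverageAreaPlotter(bep,'DisplayName','Camera coverage');
    plotCoverageArea(caPlotter,[1.5 0],40,0,45);   % mounting position, range, orientation, field of view
    detPlotter = detectionPlotter(bep,'DisplayName','Detections');
    plotDetection(detPlotter,[20 2]);              % assumed detection 20 m ahead, 2 m to the left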

Using the Ground Truth Labeler app, you can automate the labeling of ground truth to train and evaluate perception algorithms. For hardware-in-the-loop (HIL) and desktop simulation of sensor fusion, path planning, and control logic, you can author driving scenarios and generate synthetic radar and camera sensor outputs.

Automated Driving Toolbox provides reference application examples for common ADAS and automated driving features, including forward collision warning (FCW), autonomous emergency braking (AEB), adaptive cruise control (ACC), lane keeping assist (LKA), and parking valet. The toolbox supports C/C++ code generation of sensor fusion, tracking, path planning, and vehicle controller algorithms for rapid prototyping and HIL testing.

Getting Started

Learn the basics of Automated Driving Toolbox

Sensor Configuration and Coordinate System Transforms

Camera sensor configuration, image-to-vehicle coordinate system transform, bird’s-eye-view image transform
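
As a hedged sketch of camera configuration and coordinate transforms, the following assumes a front-facing camera mounted 1.5 m above the road; the intrinsics and the image point are illustrative values, not a real calibration.

    % Assumed camera intrinsics (pixels).
    focalLength    = [800 800];
    principalPoint = [320 240];
    imageSize      = [480 640];
    intrinsics = cameraIntrinsics(focalLength,principalPoint,imageSize);

    % Monocular camera configuration with a 1.5 m mounting height.
    sensor = monoCamera(intrinsics,1.5);

    % Transform an image point (pixels) to a point on the road in vehicle coordinates (meters).
    vehiclePoint = imageToVehicle(sensor,[360 300]);

    % Bird's-eye-view image transform covering 3-30 m ahead and +/-6 m to each side.
    birdsEyeConfig = birdsEyeView(sensor,[3 30 -6 6],[300 NaN]);
    % birdsEyeImage = transformImage(birdsEyeConfig,I);   % I: a frame from this camera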

Ground Truth Labeling

Interactive ground truth labeling for object detection, semantic segmentation, and image classification
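
The labeling workflow is interactive, but the app can be opened from the command line preloaded with a signal to label; the video file name below is only a placeholder.

    % Open the Ground Truth Labeler app with a video to annotate.
    groundTruthLabeler('myDrivingVideo.mp4');   % placeholder file name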

Perception with Computer Vision and Lidar

Object and lane boundary detections using machine learning and deep learning, lidar processing
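
As a small sketch of vehicle detection, assuming the pretrained ACF vehicle detector is adequate for the scene, the following detects vehicles in a single frame; the image file name is a placeholder.

    % Detect vehicles in one frame with a pretrained ACF detector.
    detector = vehicleDetectorACF();
    I = imread('myHighwayFrame.png');          % placeholder image file
    [bboxes,scores] = detect(detector,I);
    imshow(insertObjectAnnotation(I,'rectangle',bboxes,scores))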

Tracking and Sensor Fusion

Object tracking and multisensor fusion, bird’s-eye plot of detections and object tracks
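
As a minimal tracking sketch, assuming position-only detections reported in vehicle coordinates, the following creates a multi-object tracker that initializes constant-velocity Kalman filters and updates it with a single detection; the threshold and measurement values are illustrative.

    % Tracker with constant-velocity Kalman filter initialization.
    tracker = multiObjectTracker('FilterInitializationFcn',@initcvkf, ...
        'AssignmentThreshold',30);

    % One position measurement ([x; y; z] in meters) at time 0.
    detections = {objectDetection(0,[25; -1; 0])};

    % Update the tracker; tracks are confirmed as consistent detections accumulate.
    confirmedTracks = updateTracks(tracker,detections,0);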

Driving Scenario Generation and Sensor Models

Test automated driving algorithms using authored scenarios and synthetic detections from radar and camera sensor models
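
As a sketch of an authored scenario with a synthetic sensor, the following builds a straight two-lane road with an ego vehicle and a lead car, then generates radar detections of the lead car; the road geometry, speeds, and sensor parameters are illustrative.

    % Author a simple scenario: straight road, ego vehicle, lead car.
    scenario = drivingScenario;
    road(scenario,[0 0; 100 0],'Lanes',lanespec(2));

    egoVehicle = vehicle(scenario,'ClassID',1);
    trajectory(egoVehicle,[1 -1.8; 99 -1.8],25);    % waypoints (m), speed (m/s)

    leadCar = vehicle(scenario,'ClassID',1);
    trajectory(leadCar,[20 -1.8; 99 -1.8],25);

    % Forward-looking radar sensor model mounted on the ego vehicle.
    radar = radarDetectionGenerator('SensorIndex',1,'MaxRange',100);

    % Step the scenario and collect synthetic detections at each time step.
    while advance(scenario)
        targets = targetPoses(egoVehicle);
        [dets,numDets,isValid] = radar(targets,scenario.SimulationTime);
    end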

Planning, Mapping, and Control

Path planning, costmaps, geographic map display, vehicle controllers
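
As a sketch of path planning on a costmap, assuming a 50-by-50 m map with no obstacles, the following plans a path between two poses with the RRT* planner; the map size and poses are illustrative.

    % Empty 50 m x 50 m costmap (1 m cells) and an RRT* path planner.
    costmap = vehicleCostmap(zeros(50,50),'CellSize',1);
    planner = pathPlannerRRT(costmap);

    % Poses are [x y theta] in meters and degrees, world coordinates.
    startPose = [5 5 0];
    goalPose  = [40 40 90];
    refPath   = plan(planner,startPose,goalPose);

    % Plot the costmap and the planned path.
    plot(planner)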