A Safer Scenario for Autonomous Driving and Active Safety Testing

The Ford Palo Alto Research and Innovation Center team developed a virtual test environment based on gaming software, called aDRIVE (for Autonomous Driving Refined in Virtual Environments), that will test algorithms such as traffic sign recognition in dynamic driving situations.

As the automotive industry progresses toward autonomy, the need for simulation-based development and validation increases, as does the need for greater detail and volume in simulations. Full autonomy requires an unprecedented amount of trust placed in the vehicle’s systems to safely handle a broad range of scenarios, and such trust requires extensive testing. Estimates for validating autonomous systems through road tests alone are on the order of 100 million km of driving and several hundred million euros. These estimates, along with the dangers associated with testing specific scenarios, further motivate the use of simulation.

The systems to be simulated also go beyond vehicle dynamics alone: testing all aspects of an autonomous vehicle or driver-assist system requires sensor models in the loop with perception and control algorithms. This includes generating synthetic camera data at the RGB level, synthetic LiDAR point clouds, and synthetic radar data.

To facilitate the development of perception and control algorithms for level 4 autonomy, engineers from MathWorks and Ford Motor Co. developed a shared memory interface between MATLAB, Simulink, and Unreal Engine 4 (a freely available video game engine) to send information such as vehicle control signals back to the virtual environment.
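The toolbox described in the paper is a beta, internal component, so the snippet below is only a minimal MATLAB sketch of the underlying idea: exchanging vehicle control signals with the game engine through a memory-mapped region. The file name, data layout, and the use of memmapfile as the sharing mechanism are illustrative assumptions, not the actual plugin or toolbox API.

```matlab
% Illustrative sketch only: the MathWorks/Ford shared memory toolbox is not
% public, so MATLAB's memmapfile stands in here for the idea of passing
% control signals to the Unreal Engine plugin through a shared, memory-mapped
% region. The file name and layout below are hypothetical.

controlFile = 'vehicle_control.shm';          % hypothetical backing file

% Create the backing file once so it can be mapped (1 kB of zeros).
if ~isfile(controlFile)
    fid = fopen(controlFile, 'w');
    fwrite(fid, zeros(1, 1024, 'uint8'));
    fclose(fid);
end

% Map three doubles: steering angle [rad], throttle [0..1], brake [0..1].
m = memmapfile(controlFile, ...
    'Format', {'double', [1 3], 'control'}, ...
    'Writable', true);

% Write one control sample; a plugin on the Unreal Engine side would read
% the same region and apply the command to the simulated vehicle.
m.Data(1).control = [0.05, 0.30, 0.0];
```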

The shared memory interface conveys arbitrary numerical data, RGB image data, and point cloud data for simulating LiDAR sensors. It consists of a plugin for Unreal Engine, which contains the necessary read/write functions, and a beta toolbox for MATLAB that reads from and writes to the same shared memory locations, so data can be exchanged among Unreal Engine, MATLAB, and Simulink.
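Along the same lines, the receiving side might look like the following MATLAB sketch, assuming the Unreal Engine plugin has already created fixed-size regions for a camera frame and a LiDAR point cloud. The region names, image dimensions, and NaN padding for unused points are assumptions for illustration, not the toolbox's actual layout.

```matlab
% Sketch, not the beta toolbox API: assumes the Unreal Engine plugin writes a
% 720x1280 RGB frame and an Nx3 point cloud into two hypothetical
% memory-mapped regions of known, fixed size.

imgH = 720; imgW = 1280; maxPts = 65536;      % assumed layout

imgMap = memmapfile('camera_rgb.shm', ...
    'Format', {'uint8', [imgH imgW 3], 'frame'});
lidarMap = memmapfile('lidar_xyz.shm', ...
    'Format', {'single', [maxPts 3], 'xyz'});

% Read one synthetic camera frame and display it.
frame = imgMap.Data(1).frame;
imshow(frame);

% Read the synthetic LiDAR returns; unused slots are assumed to be NaN.
xyz = double(lidarMap.Data(1).xyz);
xyz = xyz(all(isfinite(xyz), 2), :);
scatter3(xyz(:,1), xyz(:,2), xyz(:,3), 1, xyz(:,3), '.');
```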

The LiDAR sensor model was tested by generating point clouds with beam patterns that mimic the Velodyne HDL-32E (32-beam) sensor, and was demonstrated to run at frame rates sufficient for real-time computation by leveraging the graphics processing unit (GPU).
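The paper's sensor model itself is not published, but the beam geometry it mimics can be illustrated. The sketch below builds an HDL-32E-like pattern of 32 elevation angles swept in azimuth and scales each ray by a range value; a flat ground plane stands in for the GPU-rendered depth that the virtual environment would actually provide, and the azimuth step is an assumption.

```matlab
% Hedged sketch of a point cloud with an HDL-32E-like beam pattern. In the
% actual workflow the range along each ray comes from the GPU-rendered scene
% in Unreal Engine; here a flat ground plane is used as a placeholder.

elev = deg2rad(linspace(10.67, -30.67, 32));   % 32 beams spanning the HDL-32E vertical field of view
azim = deg2rad(0:0.2:359.8);                   % assumed azimuth step

[A, E] = meshgrid(azim, elev);
dirs = cat(3, cos(E).*cos(A), cos(E).*sin(A), sin(E));   % unit ray directions

% Placeholder range image: a ground plane 1.9 m below the sensor, clipped to
% 70 m, standing in for the depth the engine would return per ray.
r = -1.9 ./ sin(E);
r(sin(E) >= 0 | r > 70) = NaN;

xyz = reshape(dirs .* r, [], 3);               % scale each ray by its range
xyz = xyz(all(isfinite(xyz), 2), :);
scatter3(xyz(:,1), xyz(:,2), xyz(:,3), 1, '.');
axis equal
```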

The engineers successfully established and tested a workflow that interfaces a 3D virtual driving environment with vehicle perception systems for autonomy or active safety. The virtual environment was shown to generate synthetic camera and LiDAR data that resemble data from real sensors, and to communicate bidirectionally, via shared memory, with algorithms in development.
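Putting the pieces together, one closed-loop iteration might look like the sketch below, reusing the hypothetical mapped regions from the earlier snippets. The brightest-column "detector" and the steering gain are placeholders for a real perception and control stack, not anything from the paper.

```matlab
% Minimal closed-loop sketch, assuming the mapped regions defined above:
% read a synthetic frame, run a placeholder perception step, and write a
% control command back for the Unreal Engine plugin to consume.
for k = 1:1000
    frame = imgMap.Data(1).frame;                  % synthetic camera frame

    % Placeholder perception: steer toward the brightest image column,
    % standing in for a real lane-keeping or object detection algorithm.
    gray = mean(double(frame), 3);
    [~, col] = max(sum(gray, 1));
    steer = 0.002 * (col - size(frame, 2) / 2);    % hypothetical gain

    m.Data(1).control = [steer, 0.2, 0.0];         % steering, throttle, brake
    pause(0.05);                                   % assumed 20 Hz update rate
end
```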

Ford Model-Based Design Engineer Ashley Micks (SAE Member, 2016) is co-author of an SAE International technical paper, presented at WCX17: World Congress Experience, that presents an in-depth overview of the workflow for simulating vehicle perception systems in a 3D driving environment. Micks, who earned a bachelor’s degree in aeronautical and astronautical engineering from the Massachusetts Institute of Technology and master’s and doctorate degrees in aeronautical and astronautical engineering from Stanford University, also describes her research in a Ford corporate video.


This article is based on SAE International technical paper 2017-01-0107 by Arvind Jayaraman of MathWorks and Ashley Micks and Ethan Gross of Ford.