Deep neural networks in autonomous driving
March 22nd, 2018 • Gil Golov
As Advanced Driver Assistance Systems (ADAS) cascade down to lower-end car models and consumer take rates remain strong, car makers continue to innovate and integrate more capable autonomous systems into cars, including:
- Safety features that reduce human error behind the wheel
- Security systems that strengthen system-level protection to thwart malicious hacking
- Deep neural networks which use artificial intelligence to detect and recognize objects as well as—if not better than—humans
- Sensor fusion which combines multiple sensor types to yield better "vision" than humans
- In-cabin cameras to monitor driver status, including drowsy driver detection, to ensure seamless hand-off between car and driver in autonomous situations
- V2X technology which further extends ADAS safety by providing vehicle-to-vehicle and vehicle-to-infrastructure communication to warn of traffic hazards ahead
Deep neural networks (DNNs) play a critical role in realizing the future of autonomous driving. Implementing object detection and classification via DNNs consists of two phases: training and inference. During training, enormous amounts of labeled data are used to train the system and develop an algorithm that can consistently and accurately detect an object. Once the neural network has been trained, it is then deployed in the field (inference), where the algorithm is used to accurately detect objects in real time, including cars, pedestrians, street signs, bicyclists and more.
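To make the two phases concrete, below is a minimal, illustrative sketch in PyTorch; the tiny classifier, the class list and the helper functions are hypothetical stand-ins for a production object-detection network, not any vendor's actual pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a full object-detection network:
# a tiny classifier over a few road-scene classes.
CLASSES = ["car", "pedestrian", "street_sign", "bicyclist"]

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 128),   # assumes 64x64 RGB input crops
    nn.ReLU(),
    nn.Linear(128, len(CLASSES)),
)

# --- Training phase: fit the weights on large amounts of labeled data ---
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One gradient update on a batch of labeled images."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# --- Inference phase: deploy the trained weights in the vehicle ---
@torch.no_grad()
def detect(image):
    """Return the most likely class for a single 3x64x64 image tensor."""
    logits = model(image.unsqueeze(0))
    return CLASSES[logits.argmax(dim=1).item()]
```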
For an autonomous-driving DNN to work, a huge number of parameters must be processed to accurately detect the vehicle's surroundings. To understand the magnitude (the short calculation after this list checks both figures):
- SuperLotto: Correctly guessing five numbers (1-47) plus one mega number (1-27) means beating roughly 41 million possible combinations
- Complex DNNs: A network with approximately 100 million parameters, each a 32-bit value with about 4 billion possible settings, spans on the order of (4 billion)^100,000,000 possible configurations
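Both figures are easy to check with a few lines of Python; the 100-million-parameter count is the article's illustrative figure rather than any specific network:

```python
from math import comb, log10

# SuperLotto: choose 5 of 47 numbers, plus 1 mega number out of 27.
lotto_combinations = comb(47, 5) * 27
print(f"SuperLotto combinations: {lotto_combinations:,}")  # 41,416,353

# Complex DNN: each 32-bit parameter can take 2**32 (~4 billion) values,
# so 100 million such parameters span (2**32)**100_000_000 configurations.
num_params = 100_000_000
decimal_digits = num_params * 32 * log10(2)
print(f"DNN configuration space: ~10^{decimal_digits:,.0f}")  # ~10^963,000,000
```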
As we move closer to autonomous driving, the memory bandwidth required in the vehicle increases in direct correlation with the complexity of the DNN. The automotive industry has already showcased platforms that require more than 1 terabyte per second (TB/s) of memory bandwidth for the computation associated with the highest level of autonomy, Level 5, a fully autonomous system. Practical implementations with this level of memory bandwidth can today only be realized using GDDR memories, which were traditionally intended for the graphics market.
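As a rough illustration of why the bandwidth requirement grows so quickly, the sketch below multiplies a few assumed quantities together; every number is an assumption chosen to show the order of magnitude, not a measurement from any particular platform.

```python
# Back-of-envelope DRAM-traffic estimate (all figures are assumptions).
params            = 100_000_000   # 32-bit weights, as in the example above
bytes_per_param   = 4
networks          = 5             # e.g. detection, lanes, free space, ...
camera_streams    = 8
frames_per_second = 30

# Assume each network streams its weights once per camera stream per frame.
weight_traffic = params * bytes_per_param * networks * camera_streams * frames_per_second
print(f"weight traffic alone: ~{weight_traffic / 1e12:.1f} TB/s")  # ~0.5 TB/s
# Activation maps, sensor data and multiple passes push the total past 1 TB/s.
```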
Micron's introduction of automotive-grade GDDR6 memory is the essential puzzle piece that enables next-generation autonomous driving today. Backed by more than 25 years of commitment to the automotive market and its recognized leadership in GDDR memory, Micron offers GDDR6 products with high density, high bandwidth and a discrete design that simplifies system integration and meets the demands of high-performance automotive applications.