AI Case Study
Uber's self-driving car failed to recognise a pedestrian and the associated risk in real time, resulting in a fatal accident
A self-driving vehicle made by Uber struck and killed a pedestrian. The accident is a failure case for autonomous-vehicle technology, whose core task is to identify and navigate roads and obstructions and to detect obstacles such as pedestrians in real time.
Industry
Consumer Goods And Services
Automobiles And Parts
Project Overview
Uber’s vehicles are equipped with several different imaging systems which perform both ordinary duty (monitoring nearby cars, signs and lane markings) and extraordinary duty, such as detecting an unexpected obstacle in the vehicle's path. No fewer than four different systems should have picked up the victim in this case.
Top-mounted lidar: Using infrared laser pulses that bounce off objects and return to the sensor, lidar can detect static and moving objects in considerable detail, day or night.
The lidar unit, if operating correctly, should have been able to make out the person in question, if they were not totally obscured, while they were still more than a hundred feet away, and to pass their presence on to the “brain” that collates the imagery.
Front-mounted radar: Radar, like lidar, sends out a signal and waits for it to bounce back, but it uses radio waves instead of light. This makes it more resistant to interference, since radio can pass through snow and fog, but also lowers its resolution and changes its range profile.
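Both lidar and radar estimate range from the round-trip time of a pulse. The sketch below illustrates only that principle; the distances are assumptions for the example, not values from Uber's sensor suite.

```python
# Illustrative only: how a time-of-flight sensor (lidar or radar) turns a
# round-trip echo time into a range estimate.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def range_from_round_trip(echo_time_s: float) -> float:
    """Distance to the reflecting object: the pulse travels out and back."""
    return SPEED_OF_LIGHT_M_S * echo_time_s / 2.0

# A pedestrian roughly 30 m (about 100 feet) away returns an echo in ~0.2 microseconds.
echo_time_s = 2 * 30.0 / SPEED_OF_LIGHT_M_S
print(f"estimated range: {range_from_round_trip(echo_time_s):.1f} m")
```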
Short and long-range optical cameras: Lidar and radar are great for locating shapes, but they’re no good for reading signs, figuring out what color something is and so on. That’s a job for visible-light cameras with sophisticated computer vision algorithms running in real time on their imagery.
The cameras on the Uber vehicle watch for telltale patterns that indicate braking vehicles (sudden red lights), traffic lights, crossing pedestrians and so on. Especially on the front end of the car, multiple angles and types of camera would be used, so as to get a complete picture of the scene into which the car is driving.
Detecting people is one of the most commonly attempted computer vision problems, and the algorithms that do it have gotten quite good. “Segmenting” an image, as it’s often called, generally also involves identifying things like signs, trees, sidewalks and more. (techcrunch)
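As a rough illustration of the kind of camera-based pedestrian detection described above, the sketch below runs an off-the-shelf, COCO-pretrained detector and keeps only "person" detections. The model choice, score threshold and image path are assumptions for the example; this is not Uber's actual perception stack.

```python
# Minimal sketch of camera-based pedestrian detection using a
# COCO-pretrained detector (illustrative, not Uber's proprietary system).
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

PERSON_CLASS_ID = 1     # COCO label index for "person"
SCORE_THRESHOLD = 0.8   # illustrative confidence cut-off

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

# Hypothetical dashcam frame; the detector expects float tensors in [0, 1].
image = convert_image_dtype(read_image("dashcam_frame.jpg"), torch.float)

with torch.no_grad():
    prediction = model([image])[0]  # dict with 'boxes', 'labels', 'scores'

pedestrians = [
    box.tolist()
    for box, label, score in zip(prediction["boxes"], prediction["labels"], prediction["scores"])
    if label == PERSON_CLASS_ID and score >= SCORE_THRESHOLD
]
print(f"{len(pedestrians)} pedestrian(s) detected:", pedestrians)
```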
NTSB's "report said the vehicle's radar systems observed the pedestrian six seconds before impact but 'the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle'.
At 1.3 seconds before impact, the self-driving system determined emergency braking was needed, but Uber said emergency braking maneuvers were not enabled while the vehicle was under computer control in order to reduce the potential for erratic vehicle behavior." (ndtv)
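The NTSB timeline can be read as a time-to-collision problem: once an object is confirmed on a collision course, the remaining time determines whether emergency braking can still prevent or mitigate impact. The sketch below is a hypothetical illustration of that decision logic; the speed, deceleration limit and disable flag are assumed values chosen to mirror the reported situation, not figures from the investigation.

```python
# Hypothetical sketch of an emergency-braking decision based on time-to-collision.
# The disable flag mirrors the NTSB finding that emergency braking manoeuvres
# were not enabled while the vehicle was under computer control.

def time_to_collision(distance_m: float, closing_speed_m_s: float) -> float:
    """Seconds until impact at the current closing speed."""
    return float("inf") if closing_speed_m_s <= 0 else distance_m / closing_speed_m_s

def decide_braking(distance_m: float,
                   closing_speed_m_s: float,
                   max_deceleration_m_s2: float = 8.0,       # assumed hard-braking limit
                   emergency_braking_enabled: bool = False):  # per the NTSB finding
    ttc = time_to_collision(distance_m, closing_speed_m_s)
    # Minimum time needed to stop from the current speed at full braking.
    time_needed = closing_speed_m_s / max_deceleration_m_s2
    if ttc <= time_needed:
        if emergency_braking_enabled:
            return "EMERGENCY BRAKE"
        return "ALERT ONLY (emergency braking disabled)"
    return "CONTINUE MONITORING"

# Roughly the reported situation: ~1.3 s from impact at an assumed ~17 m/s.
print(decide_braking(distance_m=22.0, closing_speed_m_s=17.0))
```

With emergency braking disabled under computer control, the decision function above can only raise an alert rather than actuate the brakes, which matches the chain of events described in the report.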
Reported Results
The failure of the self-driving system to correctly classify the pedestrian and to apply the brakes resulted in the fatal striking of the pedestrian.
Technology
Function
Risk
Audit
Background
"It’s hard to understand how, short of a total system failure, this could happen, when the entire car has essentially been designed around preventing exactly this situation from occurring.
Something unexpectedly entering the vehicle’s path is pretty much the first emergency event that autonomous car engineers look at."
(techcrunch)
Benefits
Data