In a preliminary report on the death earlier this year of a pedestrian struck by an autonomous Uber vehicle on a dark street, the NTSB's analysis indicates that responsibility is shared among three parties: the pedestrian herself, the backup driver who was not paying attention, and the programmers who misjudged the consequences of the intersection of machine limitations and human failure.
The accident occurred as follows: the pedestrian entered the road at an irregular, unilluminated point, dressed in dark clothing and not looking for oncoming vehicles. The vehicle's lidar detected an obstruction in the road about six seconds before impact but could not definitively identify the object. At 1.3 seconds before the collision, the system classified it as a bicycle and recognized the need to apply the brakes to mitigate the collision. Travelling at 43 mph, the vehicle would have needed a stopping distance of roughly 20 meters; with 25 meters still separating the vehicle and the pedestrian at that point, the collision would have been avoided had the brakes been applied. Unfortunately, because the stop would have been sudden, it qualified as “emergency braking,” a function the programmers had disabled, leaving responsibility to the human driver, who was allegedly not paying full attention at the time. The system was also not designed to notify the human driver of the threat of a collision.
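The arithmetic behind those figures can be sanity-checked with a short sketch. The deceleration value below is an assumption (roughly 0.94 g, a plausible hard-braking rate for a passenger car); the NTSB report does not state one, and the gap calculation assumes the vehicle held a constant speed.

```python
# Sanity check of the stopping-distance figures discussed above.
# The 9.2 m/s^2 deceleration is an ASSUMED hard-braking value,
# not a number from the NTSB report.

MPH_TO_MS = 0.44704            # miles per hour -> meters per second

speed = 43 * MPH_TO_MS         # vehicle speed: about 19.2 m/s
decel = 9.2                    # assumed emergency deceleration, m/s^2

# Distance needed to brake from `speed` to a full stop: v^2 / (2a)
braking_distance = speed**2 / (2 * decel)

# Distance between vehicle and pedestrian 1.3 s before impact,
# assuming constant speed over that interval
gap_at_classification = speed * 1.3

print(f"braking distance:      {braking_distance:.1f} m")      # ~20 m
print(f"gap at 1.3 s to go:    {gap_at_classification:.1f} m") # ~25 m
```

Under these assumptions the braking distance (about 20 m) is indeed shorter than the remaining gap (about 25 m), which is consistent with the article's claim that braking at 1.3 seconds out would have avoided the impact.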
Responsibility for the collision, therefore, was shared, and fault should be placed on multiple parties.
The pedestrian, according to the NTSB report:
“was dressed in dark clothing, did not look in the direction of the vehicle until just before impact, and crossed the road in a section not directly illuminated by lighting. The pedestrian was pushing a bicycle that did not have side reflectors and the front and rear reflectors, along with the forward headlamp, were perpendicular to the path of the oncoming vehicle. The pedestrian entered the roadway from a brick median, where signs facing toward the roadway warn pedestrians to use a crosswalk, which is located 360 feet north of the Mill Avenue crash site. The report also notes the pedestrian’s post-accident toxicology test results were positive for methamphetamine and marijuana.”
Regarding the human driver, the report states that, “the inward-facing video shows the vehicle operator glancing down toward the center of the vehicle several times before the crash. In a postcrash interview with NTSB investigators, the vehicle operator stated that she had been monitoring the self-driving system interface.” Notwithstanding this, the human driver also failed to notice or react to the threat of the collision.
Further, Uber's designers and programmers failed severely to account for human fallibility in expecting human drivers to supervise an autonomous vehicle with full attentiveness for hours at a time. The system detected the collision threat within a reasonable time frame yet failed to act because, according to the report and Uber, “emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.”
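The handoff the report describes can be sketched as a simple decision flow. This is an illustration only: the function name, threshold value, and return strings are hypothetical, not Uber's actual software.

```python
# Illustrative sketch (hypothetical names and values, not Uber's code)
# of the handoff described in the NTSB report: when required braking
# crosses the "emergency" threshold, the system neither brakes nor alerts.

EMERGENCY_DECEL_THRESHOLD = 6.5  # m/s^2 -- assumed cutoff for "emergency"

def plan_braking(required_decel: float, computer_control: bool) -> str:
    """Return the action taken for a detected collision threat."""
    if not computer_control:
        # Under manual control the human driver is responsible for braking.
        return "human driver responsible"
    if required_decel <= EMERGENCY_DECEL_THRESHOLD:
        # Ordinary braking is permitted under computer control.
        return "apply brakes"
    # Emergency braking is disabled under computer control, and the
    # system does not alert the operator -- the gap the report flags.
    return "no action, no alert"

print(plan_braking(required_decel=9.2, computer_control=True))
# -> no action, no alert
```

The notable design choice, as the report states, is that the fallback for the hardest braking cases is a silent handoff to an operator who is never told a threat exists.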
According to other reports, the emergency braking system was disabled to provide a smoother riding experience and, perhaps, to prevent rear-end collisions from tailgating human drivers; that decision directly contributed to this pedestrian's death.
This suggests that in a fully autonomous system not constrained by human frailty, this death would never have occurred.
The primary takeaway is that even this death, initially thought to have been caused by machine failure, was in reality attributable solely to human failure.
Staff writer: Ari B