Uber’s Self-Driving Car Had 6 Seconds to Respond Before Fatal Crash, But Got Confused, Did Nothing

The car’s automatic emergency braking system was disabled, and the human operator should have intervened, the company says.

The Uber self-driving vehicle picked up an object 6 seconds before impact, then spent about 4 seconds trying to figure out what it was and where it was going. By the time it identified the object as a bicycle (not a person walking a bicycle), it had only 1.3 seconds before impact (about 25 meters) to apply the brakes. But since Uber determined emergency braking would cause “erratic vehicle behavior,” the vehicle did not apply the brakes. This is all contained in the NTSB’s Preliminary Report Highway HWY18MH010.

UPDATE July 2, 2018 — The operator in Uber’s vehicle, Rafaela Vasquez, was watching her cell phone before the fatal accident, according to a 318-page Tempe police report issued June 21. Police had requested her viewing records from YouTube, Hulu and Netflix. Hulu confirmed that it was streaming The Voice to Vasquez’s account that night. Vasquez had previously stated that, while she had two cell phones, she used a cell phone only after the accident to dial 911 and that she only looked away from the road to watch the Uber system monitor screen. However, police studying 9 videos shot inside the vehicle “found that the driver looked down 204 times with nearly all of them having the same eye placement at the lower center console near her right knee. One hundred sixty-six of these instances of looking down occurred while the vehicle was in motion.” Vasquez appeared to laugh or smirk during moments when she was looking toward her knee, the report states. During the time the vehicle was in motion (21 minutes, 48 seconds), the operator was attentive to the road for 6 minutes and 47 seconds (32%). “This crash would not have occurred if Vasquez would have been monitoring the vehicle and roadway conditions and was not distracted,” according to the report, which sums up the accident as “entirely avoidable.”

On March 18, Elaine Herzberg was walking her bike across Mill Avenue in Tempe, Ariz., when she was struck and killed by a self-driving car. After more than two months, the National Transportation Safety Board (NTSB) released its preliminary report on the accident. Much of what is contained in the four-page report was already known and has been discussed here. The vehicle involved in the accident was a Volvo XC90 outfitted with an Uber autonomous vehicle system. Herzberg was crossing the road at night, was not in a crosswalk, and was not looking out for traffic. In the vehicle, “operator” Rafaela Vasquez, whose job it was to monitor the self-driving vehicle, was not watching the road, as can be seen in a video watched over a million times.

Many in the engineering community wondered if the Volvo was even under the control of a self-driving system since there appeared to be no evasive action or braking before the collision occurred.

The NTSB report reveals that the vehicle was indeed operating under the control of the Uber self-driving system and that its radar and LiDAR had picked up an object about 6 seconds before impact. That’s plenty of time for the vehicle to stop, according to engineers on Eng-Tips, who calculate a panic stop (0.9 g) would have taken only 2.2 seconds. However, as precious seconds were ticking away, the Uber system was thrashing about in confusion. It first identified Herzberg as an “unknown” object, then as a vehicle, and then finally as a bicycle with “varying expectations of future travel path.” The system never came to properly identify the object as a person walking a bike, but at 1.3 seconds before impact, with the vehicle still barreling down the road at 43 mph toward an unsuspecting pedestrian, it must have given up trying. Whatever the object was, the vehicle must not hit it, and the system “determined that an emergency braking maneuver was needed to mitigate a collision.”
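For a rough sense of those numbers, here is a minimal back-of-envelope check in Python. It reproduces the Eng-Tips panic-stop figure from basic kinematics; the constants are standard physics, not values taken from the NTSB report:

```python
# Back-of-envelope check of the panic-stop figures cited above:
# time and distance to stop from 43 mph at a constant 0.9 g.

G = 9.81             # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

v0 = 43 * MPH_TO_MS  # initial speed: 43 mph, about 19.2 m/s
a = 0.9 * G          # panic-stop deceleration, about 8.8 m/s^2

stop_time = v0 / a               # t = v0 / a
stop_distance = v0**2 / (2 * a)  # d = v0^2 / (2a)

print(f"Time to stop:     {stop_time:.1f} s")      # ~2.2 s
print(f"Distance to stop: {stop_distance:.1f} m")  # ~20.9 m
```

The same arithmetic matches the report’s other figure: at 43 mph (19.2 m/s), the 1.3 seconds remaining when the system made its determination corresponds to roughly 25 meters of travel.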

That realization was useless. In what turns out to be the short report’s biggest bombshell, we read that Uber had deliberately chosen not to enable emergency braking maneuvers!

The report shows the X-shaped brick-paved walkway Elaine Herzberg used before her fateful collision. The NTSB report refers to the walkway as “landscaping.” (Image courtesy of NTSB report.)

Why No Emergency Brakes? A Smooth Ride

The report says that according to Uber, “emergency braking maneuvers are not enabled while the vehicle is under computer control to reduce the potential for erratic vehicle behavior.”

Uber may have been afraid of rattling passengers in its future self-driving fleet. False positives can cause a vehicle to hesitate, swerve or jam on the brakes—all of which can register mild to severe alarm in occupants. Uber appears to have been “tuning” its software to provide the best possible, smoothest passenger experience.

NTSB officials inspect Uber’s Volvo XC90, damaged where it struck a pedestrian in a fatal accident. (Image courtesy of the AP.)

Self-driving vehicle systems can be tuned anywhere between an optimum experience for those inside the vehicle and maximum safety for those outside it. Maximum comfort means minimum accelerations, so the least amount of swerving, stopping and starting, and definitely no jamming on the brakes. An ideal ride is one with slow accelerations and gentle turns and otherwise constant speed and direction, with attention paid only to objects determined to affect the vehicle. Enabling this kind of ride are onboard near-supercomputers stuffed with GPUs, able to make calculations and decisions that let self-driving vehicles sail through situations that would give human drivers pause.
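To make the tuning idea concrete, here is a deliberately simplified sketch of how such a trade-off might appear in a planner’s scoring function. Everything here is a hypothetical illustration, not Uber’s actual software; the function, names and numbers are invented:

```python
# Hypothetical sketch of the comfort-vs-safety trade-off described above.
# Illustrative only: this is not any vendor's real planning software.

def cost(decel_g: float, clearance_m: float,
         w_comfort: float, w_safety: float) -> float:
    """Score a candidate maneuver; lower is better."""
    comfort_penalty = decel_g ** 2                # hard braking feels bad
    safety_penalty = 1.0 / max(clearance_m, 0.1)  # close passes are risky
    return w_comfort * comfort_penalty + w_safety * safety_penalty

# Two candidate responses to an ambiguous object ahead:
coast = {"decel_g": 0.1, "clearance_m": 1.0}  # keep rolling, pass close
brake = {"decel_g": 0.9, "clearance_m": 8.0}  # hard stop, large margin

for w_comfort, w_safety, label in [(10, 1, "comfort-tuned"),
                                   (1, 10, "safety-tuned")]:
    best = min((coast, brake),
               key=lambda m: cost(m["decel_g"], m["clearance_m"],
                                  w_comfort, w_safety))
    print(f"{label} planner picks: {'coast' if best is coast else 'brake'}")
```

With the weights favoring comfort, the sketch’s planner coasts past the ambiguous object; flip the weights toward safety and it brakes hard for the same object. Real planners are vastly more complex, but the direction of the trade-off is the same.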

This was the case with a self-driving Uber Volvo in March 2017 that apparently was so certain it was going to make it through the yellow traffic light that it did not slow down, only to end up turned over after hitting another car that suddenly made a turn in front of it. Surprisingly, no one was injured in the incident. The Uber operator was quoted as having “no time to react.” The Uber Volvo was going just under the speed limit (38 mph) through the intersection. An Uber representative said the company’s cars are programmed to always pass through a yellow light at their current speed if there is enough time to make it through the intersection.

On the other end of the spectrum, an ultimately safe self-driving vehicle could have you feeling as if you are going to go through the windshield when it fails to distinguish a plastic bag from a flying child. The General Motors self-driving program Cruise is testing Chevrolet Bolts that may be tuned for maximum safety. But after a ride around San Francisco, Cruise was roundly criticized by Wired for its slow, “herky-jerky” ride that stops at “the whisper of an accident.” During testing, the cars have two human operators in the front seats monitoring every move. “We will not launch until we have safety perfect,” GM President Dan Ammann said during a press event.

The Vehicle Operator Is Relied Upon to Intervene and Take Action, Says Uber

In an effort to explain why the Uber system employed no automatic emergency braking during the accident described in the NTSB report, Uber chose to throw its operator under the vehicle. Saying the automated system relies on the operator to intervene would seem to put the blame for the fatal accident squarely on the operator, Rafaela Vasquez, whose checkered past makes her an easy target. Uber had previously defended hiring Vasquez, an ex-felon with a history of traffic violations. Vasquez was not tested for drugs because officers saw no need to do so.

Uber mounts its operator’s console in the center of the vehicle, with the top of the console slightly above the dash of the Volvo XC90, as shown in this picture from Wired. Note that the driver is covering the steering wheel with both hands. However, below the console is what appears to be a smartphone with the screen on. It is against Uber’s rules to have a phone in the “left seat.” (Image courtesy of Natalie Behring, Reuters.)

The NTSB report says Vasquez was looking toward the “center of the vehicle” and records her explanation that she “had been monitoring the self-driving system interface.” The video clearly shows her looking down and slightly to her right twice for several seconds each time. She has admitted to having had a personal and a business phone in the car with her, but said neither was in use before the collision occurred; only after the collision did she call 911.

A former operator tells CityLab Uber was quite strict about phones in the “left seat.” He was fired for having had one. Uber says it has fired roughly a dozen operators for using cell phones while in a self-driving vehicle. It is not known if the investigation will attempt to check Vasquez’s phone to corroborate her story.

The concept of having an operator spring immediately into action may be flawed. From Shakespeare’s “The Tempest”: “For now they are oppressed with travel. They will not, nor cannot, use such vigilance when they are fresh.” In a WWII study, “Breakdown of Vigilance During Prolonged Visual Search,” Royal Air Force cadets were found to be unreliable after only one hour of closely watching radar screens. Vasquez had driven the car from its garage at 9:14 pm, and then had put the car into autonomous mode at 9:39 pm. So, at the time of the accident at 9:58 pm, she had been letting the car drive itself for only 19 minutes.

Is it possible that despite Uber’s insistence that its operators remain vigilant, the level of autonomy attained by the Uber cars had lulled the operators into a false sense of security? The Uber car was not at Level 4 autonomy (Level 5 being autonomous to the point that the car may not even have a steering wheel), according to Uber’s Jeff Miller in an Automotive News Europe article from November 2017. NHTSA defines Level 3 as limited self-driving, where the vehicle is in full control in some situations, monitors the road and traffic, and will inform the driver when he or she must take control. When the vehicle is in control, the driver need not monitor the vehicle, road or traffic but must be ready to take control when needed.

Uber has not publicly stated at what level of autonomy the vehicle in question was operating.

While the report says Vasquez took the steering wheel less than 1 second before the time of impact and applied the brakes less than 1 second after impact, it does not explain why there would be a difference of as much as 2 seconds (rounding up) between the operator’s hand and foot actions. The video only shows an absolutely startled Vasquez before the recording stops, but it is unclear whether her reaction is to the impending collision or to the impact itself. Side-by-side videos have emerged of the incident, but there’s no guarantee the two feeds (from inside the vehicle and outside the vehicle) have been synchronized.

Uber: The System Is Not Designed to Alert the Operator

In what may be the next bombshell in the NTSB report, we read that Uber told investigators its autonomous driving system is not designed to alert the operator to act.

Uber’s decisions to deactivate the vehicle’s automatic braking system and to leave braking in the hands of a safety operator who in turn wasn’t given an alert were mistakes, said Todd Humphreys, an associate professor specializing in robotic perception at the University of Texas at Austin, in a Wall Street Journal article.

“Uber’s software designers punted the final action to the operator, except no alert was generated,” said Raj Rajkumar, a professor at Carnegie Mellon University who founded and sold an autonomous car software startup, in the same Wall Street Journal article.

Too Little, Too Late

Had the Uber system been able to apply the vehicle’s brakes 1.3 seconds before impact, when it realized a collision was imminent, the vehicle’s speed would have been reduced from 43 mph to 24 mph, calculates Humphreys. According to the NTSB report, the vehicle was traveling at 39 mph at the time of impact, down 4 mph from its 43 mph speed earlier. This may have been the result of the operator grabbing the wheel (less than 1 second before impact), which could have disengaged the Uber system. There would have been a time lag before Volvo’s City Safety system kicked in, which might account for the vehicle slowing down some—although obviously not enough, given the outcome.
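As a sanity check on Humphreys’ estimate (again, simple kinematics rather than anything stated in the report), the implied average deceleration works out to roughly two-thirds of a g, which is hard but less-than-maximal braking, plausible once actuation lag is considered:

```python
# Sanity check of Humphreys' estimate: what average deceleration does
# a 43 -> 24 mph reduction over 1.3 s imply? (Simple kinematics only.)

MPH_TO_MS = 0.44704  # miles per hour -> meters per second
G = 9.81             # gravitational acceleration, m/s^2

delta_v = (43 - 24) * MPH_TO_MS  # speed shed: ~8.5 m/s
implied_decel = delta_v / 1.3    # ~6.5 m/s^2

print(f"Implied braking: {implied_decel / G:.2f} g")  # ~0.67 g
```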


Blaming the Victim?

From the dashcam video, Elaine Herzberg looks up too late as she tries to walk her bike across Mill Ave. in Tempe, Ariz., before she is struck and killed by an Uber self-driving car. (Image courtesy of Arizona police.)

The report sets the stage for the blame to fall on the victim and the human operator—the two easiest targets. Uber is allowed to state that stopping the vehicle was up to the operator, not the system it designed and installed. The NTSB report does not implicate the City of Tempe, which may yet be called on to explain how it failed to protect a pedestrian crossing the street using a walkway it had built.

The Antiplanner, a site dedicated to exposing bad urban planning, points to government agencies that blame the victims rather than attribute fault to their own bad designs.

Indeed, Herzberg wasn’t looking for traffic, crossing the road where there was no crosswalk. The toxicology report showed her to have marijuana and methamphetamine in her system, which could have impaired her. The NTSB report states Herzberg was wearing dark clothing at night (she was wearing a dark jacket or coat, as well as blue denim jeans and white sneakers; she also had white plastic bags hanging off her bike handlebars). Of course, none of that matters to radar and LiDAR, which can detect objects in total darkness.

Herzberg crossed the median on an X-shaped brick-paved walkway, which the NTSB report refers to as a “landscaping” feature. Drivers report pedestrian activity on the walkway. A homeless camp is nearby, a place where Herzberg was popular. There are unsubstantiated claims that the walkways were part of a plan to connect developments on either side of Mill Avenue. When the plans fell through, the city put up signs warning pedestrians not to use the walkways.

Uber stopped testing its self-driving cars in Arizona the day after the fatal accident, but said it was going to restart testing in San Francisco, Toronto and Pittsburgh. The next week, Governor Doug Ducey made it official: Uber’s self-driving cars were banned in Arizona. New CEO Dara Khosrowshahi, who has been on an apology tour for the accident as well as for other notable failings under the company’s previous CEO, has vowed to return to Arizona once the NTSB investigation is over.