The investigation comes at a time of broader regulatory scrutiny of autonomous driving technologies.
U.S. federal safety regulators are investigating Tesla after a series of accidents in which its vehicles crashed into emergency vehicles while the carmaker’s driver-assistance system, known as Autopilot, was engaged.
The National Highway Traffic Safety Administration (NHTSA) says the Tesla vehicles collided with parked emergency vehicles that were responding to incidents. The agency has identified 11 such crashes between January 2018 and July 2021, across nine states. Seven of the incidents resulted in 17 injuries and one fatality. The crashes happened mostly at night, and the first responders had all deployed safety measures such as emergency vehicle lights, flares, road cones, and illuminated arrow boards.
The investigation will cover some 765,000 Teslas: every model the manufacturer has sold in the U.S. since 2014 (the Model Y, S, X, and 3).
“The investigation will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” stated the NHTSA. It will also look at “contributing circumstances” to the incidents as well as similar crashes (presumably to establish a baseline against which to compare the Autopilot’s performance). The NHTSA’s investigation will also focus on the system’s ability to detect and identify objects and events, as well as where traffic regulations allow the Autopilot system to be used.
Depending on the investigation’s findings, Tesla could get away with nothing but a slap on the wrist, or it could face a recall order or other enforcement actions.
The vehicles involved in the incidents had either Autopilot or its Traffic-Aware Cruise Control function engaged. Tesla describes Autopilot as an advanced driver-assistance system that, when used properly, reduces the driver’s overall workload.
Autopilot pulls data from the vehicle’s eight external cameras, 12 ultrasonic sensors, and radar, and processes it on a powerful onboard computer running a deep neural network. The cameras allow the car to see up to 820 feet away, while the ultrasonic sensors handle short-range object detection, up to 26 feet away.
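To make those sensor roles concrete, here is a minimal Python sketch that models the published suite and picks out which sensors can reach an object at a given distance. The structure, names, and the radar range are illustrative assumptions, not Tesla’s software:

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    count: int
    max_range_ft: float

# Illustrative model of the published Autopilot sensor suite.
# The radar range (~525 ft) is an assumption for this sketch;
# Tesla has since dropped radar from the suite entirely.
SENSOR_SUITE = [
    Sensor("camera", 8, 820.0),      # long-range vision
    Sensor("ultrasonic", 12, 26.0),  # short-range object detection
    Sensor("radar", 1, 525.0),       # forward-facing radar (assumed range)
]

def sensors_covering(distance_ft: float) -> list[str]:
    """Return the sensors whose stated range reaches an object."""
    return [s.name for s in SENSOR_SUITE if s.max_range_ft >= distance_ft]

print(sensors_covering(20.0))   # ['camera', 'ultrasonic', 'radar']
print(sensors_covering(300.0))  # ['camera', 'radar']
```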
The system includes Traffic-Aware Cruise Control, which automatically matches the vehicle’s speed to that of surrounding traffic, and Autosteer, which assists with steering within a clearly marked driving lane.
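That speed-matching behavior amounts to a feedback loop. Here is a rough sketch, and only that: the gains, limits, and names below are invented assumptions, not Tesla’s implementation:

```python
# Minimal sketch of traffic-aware cruise control as a feedback loop.
# Gains, limits, and names are illustrative assumptions, not Tesla's code.

def cruise_control_step(
    own_speed: float,        # m/s
    lead_speed: float,       # m/s, speed of the vehicle ahead
    gap: float,              # m, current following distance
    desired_gap: float,      # m, target following distance
    kp_speed: float = 0.5,   # proportional gain on the speed error
    kp_gap: float = 0.2,     # proportional gain on the gap error
    max_accel: float = 2.0,  # m/s^2, comfort limit
) -> float:
    """Command an acceleration that matches the lead vehicle's speed
    while closing toward the desired following distance."""
    speed_error = lead_speed - own_speed
    gap_error = gap - desired_gap
    accel = kp_speed * speed_error + kp_gap * gap_error
    return max(-max_accel, min(max_accel, accel))

# Example: we're doing 30 m/s, traffic ahead is doing 25 m/s, and we're
# 10 m inside the 40 m target gap, so the controller brakes (clamped).
print(cruise_control_step(30.0, 25.0, gap=30.0, desired_gap=40.0))  # -2.0
```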
Tesla uses the sensors to spot and identify obstacles and to determine how the car should react to them. But according to Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon, the carmaker’s radar has had recurring issues with “false positive” signals; for example, the software would stop the car after identifying overpasses as obstacles.
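One plausible, purely illustrative way such false positives interact with the parked-vehicle problem: a filter that suppresses alarms for stationary overhead structures can also suppress alarms for stationary vehicles. The sketch below is hypothetical, not Tesla’s actual logic:

```python
from dataclasses import dataclass

# Hypothetical illustration of the radar false-positive trade-off,
# not Tesla's actual logic: a filter that suppresses overpass alarms
# by ignoring stationary returns also ignores a stopped fire truck.

@dataclass
class RadarReturn:
    range_m: float
    relative_speed_mps: float  # ~ -own_speed for world-stationary objects
    label: str                 # ground truth, for the demo only

def is_threat(ret: RadarReturn, own_speed_mps: float) -> bool:
    # Treat returns that are stationary in the world frame as clutter,
    # so the car doesn't brake for every overpass or road sign.
    stationary = abs(ret.relative_speed_mps + own_speed_mps) < 1.0
    return not stationary

own_speed = 30.0  # m/s
returns = [
    RadarReturn(120.0, -30.0, "overpass"),          # stationary -> ignored
    RadarReturn(80.0, -30.0, "parked fire truck"),  # stationary -> also ignored!
    RadarReturn(60.0, -5.0, "slowing car ahead"),   # moving -> flagged
]

for r in returns:
    print(f"{r.label}: {'threat' if is_threat(r, own_speed) else 'ignored'}")
```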
Tesla has since eliminated the radar, relying instead on the deep learning network and its library of thousands of training images to identify potential obstacles. But the problems have persisted. Rajkumar said that while the improved system can effectively identify most objects it might encounter while driving, it still has plenty of trouble with parked emergency vehicles in its path, and with trucks stopped perpendicular to it.
“It can only find patterns that it has been quote-unquote trained on,” said Rajkumar. “Clearly the inputs that the neural network was trained on just do not contain enough images. They’re only as good as the inputs and training. Almost by definition, the training will never be good enough.”
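Rajkumar’s point can be made concrete with a toy classifier. A softmax head always spreads its probability over the classes it was trained on, so an object missing from the training set is forced into whichever known class it superficially resembles. The classes and scores below are invented for illustration:

```python
import math

# Toy illustration of the closed-world problem Rajkumar describes:
# a softmax classifier must assign every input to one of the classes
# it was trained on. Classes and scores here are invented.

TRAINED_CLASSES = ["car", "pedestrian", "cyclist", "traffic_cone"]

def softmax(scores: list[float]) -> list[float]:
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for an unfamiliar scene, say a fire truck
# parked across the lane at night: none of the trained classes fit,
# yet softmax still crowns a "winner" with nontrivial confidence.
raw_scores = [1.2, 0.3, 0.1, 0.9]
probs = softmax(raw_scores)
best = max(range(len(probs)), key=lambda i: probs[i])
print(TRAINED_CLASSES[best], round(probs[best], 2))  # car 0.4

# A crude mitigation: refuse to act on low-confidence predictions
# and fall back to the human driver instead.
CONFIDENCE_THRESHOLD = 0.8
if probs[best] < CONFIDENCE_THRESHOLD:
    print("uncertain -> treat as unknown obstacle, alert the driver")
```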
Tesla cautions that Autopilot is intended for use by a fully attentive driver who is ready to take control of the car at a moment’s notice, and that, “while these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.” The name Autopilot thus ends up seeming a little misleading; it’s really a robust driver-assistance system rather than a true autopilot.
“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”
The NHTSA, following the SAE’s widely used classification, recognizes six levels of driving automation, from level 0 (no automation) through levels 1 and 2 (driver assistance and partial automation) and level 3 (conditional automation) up to levels 4 and 5 (high and full automation).
Tesla CEO Elon Musk has made ambitious claims about the company’s autonomous driving capabilities. “I’m extremely confident that level 5 or essentially complete autonomy will happen and I think will happen very quickly,” he said at the opening of Shanghai’s annual World Artificial Intelligence Conference in 2020. “I remain confident that we will have the basic functionality for level 5 autonomy complete this year.”
In reality, though, the system achieves only level 2, partial automation, which requires the driver to stay engaged, hands on the wheel, even while the vehicle steers and changes speed on its own. The carmaker’s grandiose claims about its driver-assist technology have drawn the scrutiny of U.S. lawmakers, who argue that those claims put drivers in danger. “Tesla’s marketing has repeatedly overstated the capabilities of its vehicles, and these statements increasingly pose a threat to motorists and other users of the road,” wrote U.S. Senators Edward J. Markey and Richard Blumenthal in a letter to the Federal Trade Commission. “We urge you to open an investigation into potentially deceptive and unfair practices in Tesla’s advertising and marketing of its driving automation systems and take appropriate enforcement action to ensure the safety of all drivers on the road.”
This isn’t the first time Tesla has come under federal scrutiny for its autonomous vehicle technologies. In fact, the company was recently investigated by the National Transportation Safety Board (NTSB), an independent federal safety agency, which found that the Autopilot feature was partly responsible for a fatal accident in Florida in 2018.
The NTSB recommended that Autopilot be limited to areas where it can safely operate, and that Tesla install a better system to ensure that drivers pay attention when it’s activated. The NTSB can’t enforce the recommendations—but agencies such as the NHTSA can. In response to its sister organization’s investigation, NTSB Chair Jennifer L. Homendy said, “As we navigate the emerging world of advanced driving assistance systems, it’s important that NHTSA has insight into what these vehicles can, and cannot, do.”
Musk and Tesla, however, aren’t backing down from the technological challenges of automated driving. At Tesla’s AI Day event on August 19, the carmaker showcased its progress in artificial intelligence, including developments in its self-driving technologies.
It’s clear that, as impressive and sophisticated as autonomous vehicle technology has become, there’s still a lot of work to be done before we reach the point where our cars drive us to work rather than the other way around. And the NHTSA isn’t looking only into Tesla: it has sent investigators to 31 crashes involving partially automated driving technologies since 2016, though Teslas account for the majority of them.
But it’s apparent that some people are misusing the technology. Along with the incidents under investigation, there have been reports of drivers sitting in the back seat, playing on their phones, or driving drunk while their Teslas drive themselves. “Tesla drivers listen to these claims and believe their vehicles are equipped to drive themselves—with potentially deadly consequences,” the senators wrote. The technology isn’t the only thing that needs to improve before autonomous transportation becomes a reality.
In the meantime, we’ll have to watch as Tesla, and the autonomous driving field it helps lead, continues to work out the flaws in its technology under the watchful eye of regulators and lawmakers. That scrutiny is itself a sign that the field is maturing: with autonomous vehicles no longer a Wild West of technologies, regulators are looking to bring consistency and accountability to the market. And that would be a good thing for consumers, who yearn for the day when their car really does do all the driving for them.
Want to find out more about the evolution of autonomous vehicles? Check out The Road to Driverless Cars: 1925 – 2025.