How Sensors Empower Autonomous Driving
John Koon posted on January 16, 2019 |
Autonomous vehicle accidents underscore the need to develop better sensors.

Autonomous vehicles (AVs) have created a great deal of excitement in recent years. They have been touted as a way to reduce both traffic accidents and insurance costs, not to mention the convenience they offer on long road trips. At one point, almost every automaker was testing autonomous driving. Intel, together with BMW, even proclaimed on stage at one of Intel’s developer conferences that robot cars would be commercially available by 2021, though no details were given. Ford made similar claims.

Then, accidents happened.

The first fatal AV accident occurred in Williston, Fla., in May 2016. A Tesla equipped with multiple sensors and a semi-autonomous Autopilot feature failed to detect a truck ahead and drove into it at full speed, killing the Tesla driver. Tesla acknowledged that neither the driver nor Autopilot noticed the “white side of the tractor-trailer against a brightly lit sky.” In March 2018, an Uber SUV in autonomous mode hit and killed a pedestrian crossing a street in Tempe, Ariz. Uber subsequently halted its self-driving car tests but resumed the program on a limited basis in December 2018.

AVs rely on multiple sensors, including LiDAR, radar, cameras and ultrasound, combined with a deep learning computer, to do the job. In both incidents, the sensors were supposed to detect the objects and apply the brakes faster and further in advance than any human could. But they failed: either the sensors did not detect the objects, or the digital data was interpreted incorrectly, resulting in fatal crashes. Sensors are being developed and improved every day. How good do they have to be for safe, fully autonomous driving?

None of the AV accidents has stopped Waymo, a Google subsidiary, from launching bold experiments. In December 2018, Waymo introduced a driverless taxi service in Arizona called Waymo One. The first company to offer a commercial driverless taxi, Waymo is the unquestioned leader in the space, having logged more than 8 million miles of autonomous driving. In this article, we will examine the motivation behind autonomous driving, how the sensors work and how good they have to be to ensure safety.

Autonomous Driving Standards

The primary concern of the U.S. Department of Transportation (DOT) is to increase the overall safety of transportation systems. Working with DOT, the National Highway Traffic Safety Administration (NHTSA) has adopted the levels of driving automation defined by SAE International, a professional association that develops engineering standards for the automotive and aerospace industries. Beyond Level 0 (no automation), the SAE-defined levels are (1) driver assistance, (2) partial automation, (3) conditional automation, (4) high automation and (5) full automation. All automakers use the same definitions for autonomous driving.

The levels of driving automation as defined by SAE. (Image courtesy of SAE.)

In 2016, NHTSA released a document containing voluntary guidelines for Automated Driving Systems (ADS), covering automation levels 3-5 as defined by SAE. The document addresses many aspects of automated driving, including vehicle cybersecurity, human-machine interface (HMI), crash behavior, post-crash ADS behavior and consumer education. NHTSA’s vision is that ADS will assist drivers and increase overall safety, which will in turn reduce overall insurance costs.

Types of Sensors Used in Today’s Autonomous Driving

In order for an AV to see things, it must rely on multiple electronic sensors. The most popular sensors used include LiDAR, radar, camera and ultrasound to sense the vehicle’s surroundings. Most people are familiar with cameras and, to a lesser degree, radar.

The most popular sensors used to sense surroundings include LiDAR, radar, camera and ultrasound.

How do LiDAR sensors work? LiDAR stands for light detection and ranging (radar, by comparison, stands for radio detection and ranging). First introduced in the 1960s, a LiDAR unit is usually mounted on top of the AV, spinning to sweep laser beams across the scene as the car moves. It emits up to a million laser pulses a second and interprets the signals that bounce back. By measuring the distance between surrounding objects and the unit, LiDAR builds a detailed digital 3D map of its surroundings, even at distances of 100 meters. This 3D map is far more informative than the 2D images obtained by cameras.
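As a rough sketch of the ranging principle described above (an illustration, not any vendor's actual implementation): a LiDAR unit times each laser pulse's round trip and converts that time to distance using the speed of light.

```python
# Illustrative LiDAR time-of-flight ranging (simplified sketch).
C = 299_792_458.0  # speed of light in a vacuum, m/s


def lidar_distance(round_trip_s: float) -> float:
    """Distance to a target given the laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path: d = c * t / 2.
    """
    return C * round_trip_s / 2.0


# A pulse returning after ~667 nanoseconds indicates a target
# roughly 100 meters away, the range figure cited above.
d = lidar_distance(667e-9)  # ~99.98 m
```

At 100 m the round trip takes well under a microsecond, which is why a single unit can fire and time up to a million pulses per second.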

Many startups have joined the ranks of LiDAR innovation. Luminar, for example, has created a LiDAR system that outperforms most of its peers: it can scan objects at 250 meters in extraordinary detail. A long scanning distance translates into more time for an AV to react, in some cases up to seven precious seconds sooner. Beyond its scanning range, the Luminar LiDAR can detect a black tire on the freeway at night. That capability is sorely needed in light of the fatal accident noted previously, in which the sensors could not distinguish a light-colored truck from the sky. In the same situation, a human driver having difficulty seeing because of blinding sunlight would simply slow down.

LiDAR’s primary drawback, however, is its high cost compared with radar. Two years ago, LiDAR units cost about $75,000 each. The technology is still in its infancy; even though costs have come down substantially, a LiDAR unit still runs between $5,000 and $10,000. To be acceptable to car manufacturers and the mass market, the price has to fall below $1,000 per unit.

Long used by law enforcement to detect speeding vehicles, radar started appearing in some cars in the late 1990s to detect other objects, particularly metallic ones such as other vehicles. Because it uses radio waves, it has advantages over other technologies, including reliable operation in fog, rain and snow. It is also relatively low in cost. Cameras, meanwhile, are a great tool for spotting traffic lights and road and speed signs. It is expected that, paired with good machine vision, cameras will continue to be used effectively in AVs.

Ultrasonic sensors use sound waves, at frequencies just above the audible range, to sense their surroundings. Though their detection range is much shorter, about 6 meters, they serve as a redundant system for detecting nearby vehicles. This is critical when a robot car attempts a lane change on the freeway. Some cars use multiple ultrasonic sensors to create a 360-degree view as a driving aid in semi-automated vehicles.
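Ultrasonic ranging works on the same echo-timing idea as LiDAR, only with sound instead of light. A minimal sketch, assuming sound travels at roughly 343 m/s in air at 20 °C (the exact value varies with temperature):

```python
# Illustrative ultrasonic echo ranging (simplified sketch).
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed)


def ultrasound_distance(echo_time_s: float) -> float:
    """Distance to an obstacle from the echo's round-trip time.

    As with LiDAR, the pulse travels out and back: d = v * t / 2.
    """
    return SPEED_OF_SOUND * echo_time_s / 2.0


# An echo arriving after ~35 ms puts an obstacle near the ~6 m
# edge of a typical ultrasonic sensor's range.
d = ultrasound_distance(0.035)  # ~6 m
```

Sound's relatively slow speed limits how quickly and how far these sensors can measure, which is why they are confined to short-range duties such as parking and lane-change checks.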

How Good Do Sensors Have to Be?

Human drivers make decisions based on vision, augmented by hearing and the common sense built up over hours behind the wheel. AVs make decisions based on digital images derived from sensors, which must accurately detect objects such as:

  • Humans walking around
  • Animals or children darting out from the sidewalk
  • Nearby vehicles on streets and freeways
  • Objects such as roadblocks, signs, a black tire on the freeway at night, a blue truck against a blue sky or a chair that falls off a pickup truck in front
  • Hazards that humans cannot easily detect, such as a pothole or a large crack in the road far ahead

Most importantly, sensors could potentially outperform humans in extreme situations, such as a bridge collapsing above the car during an earthquake in San Francisco, and even make a better decision. In that case, speeding up may be a better option than stopping, which is what a human might do.


In this article, the levels of autonomous driving have been discussed, and the four most common types of sensors, LiDAR, radar, camera and ultrasound, have been explained. LiDAR uses light waves, radar uses radio waves, cameras capture images and ultrasound uses sound waves to detect surrounding objects. NHTSA’s vision is that machine vision and deep learning will eventually support fully autonomous driving, but the industry is still in its infancy. Sensors have to be better than human drivers at seeing things, and it is hoped that continued sensor innovation will bring new breakthroughs. Beyond the technology itself, there are other considerations to take into account, such as government regulations, insurance protocols (e.g., when two robot cars collide, determining which robot is at fault) and cybersecurity. In short, there are still many hurdles to overcome, but the rewards are great for both the industry and AV passengers.
