Radar—Technology from the Last Century Guides Vehicles into the Future

Radar is cheap and effective in driver assistance systems. Now if only it could just be a little clearer.

As a nation, we are understandably worried about the deaths caused by COVID-19. As I write this, the U.S. is hurtling toward 700,000 COVID-related deaths. We can hope for medicine, science and technology to save us, as they have with other pandemics in modernity. But there is a risk of death that we willingly subject ourselves to on a daily basis—a threat that we love, even though it may kill us: the automobile. Some of us may be 10 times more likely to die in an automobile accident than from COVID-19.[i]

Almost 39,000 Americans died in motor vehicle accidents in 2020, the year the pandemic started spreading, according to the National Highway Traffic Safety Administration (NHTSA). Over 4 million people injured annually in vehicle accidents require medical attention, too many of them suffering life-changing injuries.

But the problems one technology (automobiles) creates, other technologies will try to fix (safety engineering). Safety technology is now readily available in nearly every new vehicle sold. And what is emerging as the hot new thing in automotive safety technology? It is radar, a technology as old as a World War II bomber, according to the Wall Street Journal.

Radar. Oh, Really?

The earliest technology to make cars safer may have been the low-tech seat belt, which has been required in U.S. automobiles since 1968. The ’60s also featured Ralph Nader, the thin, black-suited consumer protection crusader who vilified the automobile industry, and especially the Chevy Corvair, in his book Unsafe at Any Speed. It was not enough to make Americans stop loving their cars like puppy dogs, but at least they learned that the puppy dog had teeth.

In the modern era of software and computers, AI has brought us a wealth of technology, and quite a bit of it is being applied to make autonomous vehicles a reality. However, the fully self-driving car, like the flying car, remains just over the horizon. It doesn’t help that cars trying to drive themselves occasionally kill people, an act for which the public has zero tolerance, or repeatedly run into emergency vehicles. While safety advocates have been able to keep the fabled driverless vehicle (full Level 5 autonomy) off the roads, automakers have adopted much of the technology of driverless cars into their new vehicles. Having learned that hyped claims of full autonomy often fall short and generate a public backlash, automakers are carefully avoiding expectations of full autonomy, and liability, by referring to the technology they apply as driving assistance.

Volvo, the automaker most often associated with safety technology, suggests a coffee when its systems detect a drowsy driver. (Picture courtesy of Road Show.)

The safety technology available in new cars includes rearview cameras, the ability to detect other vehicles in blind spots, sensors that even check to see if the driver is nodding off and, probably the most important technology for keeping us out of the morgue or hospital, automatic braking. (Automakers have promised to make this a standard feature by 2022.)

Sensor Technology

Sensors employed to “see” around a vehicle primarily use these technologies: radar, LiDAR and cameras.

Camera sensors can be mounted all around the car. The rear-facing camera is now common. The most sophisticated camera systems provide a full 360-degree view of the vehicle by rectifying and stitching together views from the cameras on all sides of the vehicle.

On the left of this Mercedes screen is a 360-degree view created from blending multiple images from separate cameras into one. (Picture courtesy of Mercedes.)

Getting a computer to extract what it needs to control the car from a camera image is considerably harder, and computer vision, as it is known, is more fallible when it tries to do so. Computer vision must synthesize depth, a third dimension, from a 2D image. It must then extract 3D objects for a system that will attempt to associate different behaviors with each object. For example, a shape identified by a vehicle sensor system as the side view of a human on a sidewalk may be assessed as a risk to step off the curb and into the path of your vehicle. You, as a driver, have been reacting to this threat for years, even updating your assessment to flag the heightened threat of someone with their head down, absorbed in a smartphone, but the machines are still learning.

Your stereoscopic vision and brain combine to ascertain how far away a pedestrian is. Multiple camera sensors and onboard computers work the same way.
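That depth-from-two-views principle can be sketched in a few lines. This is illustrative only: the focal length, camera baseline, and disparity values below are invented for the example, not taken from any real vehicle system.

```python
# Depth from stereo disparity: the same triangulation your eyes perform.
# depth (m) = focal length (px) * baseline (m) / disparity (px)
# All numeric values here are hypothetical, chosen for a round result.
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate distance to a point seen by two horizontally offset cameras."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point is unmatched or at infinity")
    return focal_px * baseline_m / disparity_px

# A pedestrian whose image shifts 40 pixels between two cameras 0.3 m apart,
# viewed through lenses with an 800-pixel focal length:
distance = stereo_depth(focal_px=800, baseline_m=0.3, disparity_px=40)
print(f"{distance:.1f} m")  # 6.0 m
```

The nearer the pedestrian, the larger the pixel shift between the two views, which is why stereo systems lose accuracy at long range: disparity shrinks toward the sensor's measurement noise.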

LiDAR

Object detection. (Picture courtesy of HackerEarth.)

A LiDAR-equipped system generates millions of data points as it moves. The boxes represent shapes identified by onboard computers powered by machine learning. (Picture courtesy of NVIDIA.)

A more direct route to the 3D information of a vehicle’s surroundings is LiDAR (light detection and ranging).

LiDAR uses rapidly scanning lasers whose pulses bounce back as millions of points, each with X, Y and Z coordinates: a veritable point cloud. Distance is calculated immediately; no image processing is necessary. And with millions of dots, the image constructed from connecting them has plenty of resolution and can be more easily matched to objects.
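The directness of that calculation is easy to show. Since every LiDAR return already carries 3D coordinates, range is just the length of the point's vector from the sensor. The point values below are made up for illustration.

```python
import math

# Each LiDAR return is already a 3D point (x, y, z) in meters relative to
# the sensor, so range is plain geometry; no image processing involved.
def point_range(x: float, y: float, z: float) -> float:
    """Straight-line distance from the sensor to one return."""
    return math.sqrt(x * x + y * y + z * z)

# A toy two-point "cloud": hypothetical returns from a nearby and a farther object.
cloud = [(3.0, 4.0, 0.0), (6.0, 8.0, 0.0)]
ranges = [point_range(*p) for p in cloud]
print(ranges)  # [5.0, 10.0]
```

A real point cloud holds millions of such returns per second; downstream software clusters them into the bounding boxes shown in the NVIDIA image above.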

But, as you might imagine, lasers and the mechanisms to rotate them are not cheap. LiDAR systems, as big as coffee cans, may be fine for the experimental robotaxis tentatively prowling our cities, but since LiDAR would add 25 percent to the price of a mass-produced vehicle (LiDAR units run $10,000; the average vehicle retails for $40,000), automakers balk at making it standard equipment and expect few consumers to order it as an option.

LiDAR, being light based, has the same limitations as vision-based (camera) systems, and as your own eyes. It can’t “see” through smoke and fog and is bothered by reflection and glare.

Radar

Radar, short for Radio Detection And Ranging, was a new technology used by practically all the militaries in World War II. In The Splendid and the Vile, author Erik Larson chronicles the use of radar by the German bombers and by the British who fought against them. Radar is now commonplace on big jets and ships. A relatively cheap and easily accessible technology, radar has found its way into police equipment, used to detect speeding vehicles. Consumers can buy a radar-based speed detector for less than $100 to clock their serve or their kid’s pitch.

Radar’s main problem in its use as a vehicle sensor is its limited resolution. While it can detect an object in relative motion, the returned image is a fuzzy blip on the screen. The size of the blip gives an idea of the size of the object, but good luck telling a pedestrian from a mailbox. Radar imaging is a technology that employs “tricks” to extract 3D images from radar beam reflections, such as using multiple beams and exploiting frequency shifts from parts in motion, such as propellers. Synthetic aperture radar (SAR) can produce a photographic-like image from satellites that’s good enough to see surface features on Earth, like mountains and lakes, to a resolution of meters. But with radar wavelengths thousands of times longer than those of visible light, the image has far worse resolution than cameras or human eyes.
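The resolution gap follows from basic physics: the finest angle a sensor can resolve scales roughly with wavelength divided by aperture size. The back-of-the-envelope sketch below compares a 77 GHz automotive radar behind a 10 cm antenna with a small camera lens seeing green light; both apertures are illustrative values, not specs of any product.

```python
import math

C = 3.0e8  # speed of light, m/s

# Rough diffraction limit: angular resolution (radians) ~ wavelength / aperture.
# This ignores array-processing tricks, which is exactly what imaging radar
# vendors exploit to do better.
def angular_res_deg(freq_hz: float, aperture_m: float) -> float:
    wavelength_m = C / freq_hz
    return math.degrees(wavelength_m / aperture_m)

radar = angular_res_deg(77e9, 0.10)       # 77 GHz radar, 10 cm antenna (assumed)
camera = angular_res_deg(545e12, 0.005)   # green light, 5 mm lens (assumed)
print(f"radar: {radar:.2f} deg per 'pixel', camera: {camera:.4f} deg")
```

With these numbers the radar's smallest resolvable angle comes out to roughly 2 degrees, hundreds of times coarser than the camera's, which is why a lone radar blip can't distinguish a pedestrian from a mailbox without extra processing.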

Estimates (in billions) from Grandview Research show an upward march of sensors, camera and radar, and limited use of LiDAR in vehicles. (Picture courtesy of Mouser Electronics.)

Toyota was the first carmaker to use radar commercially on its Celsior in 1997, and then radar started appearing in luxury brands like BMW, Jaguar, Lexus and Mercedes, according to IEEE. Radar’s cost relative to LiDAR makes radar a favorite technology for automotive sensor systems. The Mercedes S-Class uses twin radar beams from the top of the vehicle’s windshield to detect a pothole and adjusts the air suspension to dampen the shock to the vehicle and passengers. More accessible cars have side-mounted radar systems to detect vehicles in blind spots, forward beams to detect vehicles stopped ahead so that brakes are applied automatically, and backward beams that trigger an audible alarm if they “see” objects, something that is particularly useful in detecting low objects that are otherwise unseen—for example, trash cans and baby strollers.

Semiconductor giant Intel purchased Mobileye, and although Mobileye is founded on vision-based technology (as its name would suggest), it is working on a “next generation of active sensors” for automated driving that will use LiDAR and radar. Mobileye’s SoC (system on a chip) will have almost a hundred antennas to process radar signals, according to the Wall Street Journal. By using AI to process the noisy signals it receives, Mobileye can reportedly pick out pedestrians.

Traditional commercial radar vs. Oculii’s virtual aperture imaging system. The virtual aperture imaging waveform is adaptive and phase modulated. Each receiver generates a different phase response at a different time, then interpolates and extrapolates data to create a “virtual aperture.” (Picture and caption from Oculii’s website.)

General Motors recently invested in Oculii, a startup focused on implementing radar in vehicles using mathematics and software algorithms that process and refine data from radar images.

By analyzing the waveform, Oculii’s $50 sensors can “see” clear-enough 3D images, says CEO Steven Hong.





[i] For those ages 1 through 24, COVID-19 has proved to be only about one-tenth as dangerous as cars and trucks normally are, according to data by the Centers for Disease Control and Prevention cited in “How Covid’s Toll Compares With Other Things That Kill Us,” Justin Fox, Bloomberg, March 1, 2021.