New radar technology for autonomous vehicles will combine Oculii's AI software with Ansys HFSS simulation.
Ansys recently announced that its high-frequency structure simulator (HFSS) will be used by AI software company Oculii to develop radar for autonomous vehicles (AVs). AV manufacturers have several choices for how their cars will view the world, and many use a combination of radar, lidar, ultrasonic sensors and cameras.
According to an Allied Market Research report, the automotive radar market was worth $4.08 billion in 2020 and could reach $10.06 billion by 2028. The report cites radar’s longer 300-to-500-meter sensing range as one reason it might win out over other sensing methods in the long term, and identifies adaptive cruise control, autonomous emergency braking and blind spot detection as the three largest opportunities for radar use.
Safety features, the report says, are needed to enhance driver comfort. Not needing to worry about the speed and position of other vehicles makes for a less stressful driving experience, and radar’s augmented sensing can help vehicles deliver it. The report also discusses the pandemic and its effects on global automotive demand. Now that customers are hungry for vehicles again, some automakers are adding more autonomy to their offerings and loading the cars with radar technology.
What Does Ansys HFSS Do?
Ansys HFSS is “a 3D electromagnetic (EM) simulation software for designing and simulating high-frequency electronic products such as antennas, antenna arrays, RF or microwave components, high-speed interconnects, filters, connectors, IC packages and printed circuit boards.”
The HFSS solvers work on components ranging from passive alarm sensors to larger radar systems. Engineers can visualize the waves generated by an antenna, which helps them understand the magnitude and intensity of the electromagnetic fields in a component.
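To give a flavor of the field-versus-angle insight such a solver provides, here is a minimal Python sketch that evaluates the textbook far-field pattern of an ideal half-wave dipole. It is a closed-form approximation for illustration only, not an HFSS workflow, and the helper name is purely hypothetical; a full-wave solver computes this kind of result numerically for arbitrary geometry.

```python
import numpy as np

def half_wave_dipole_pattern(theta_rad):
    """Normalized far-field magnitude of an ideal half-wave dipole.

    Textbook closed-form approximation, used here only to illustrate the
    kind of field-magnitude information an EM solver reports.
    """
    # Avoid division by zero along the dipole axis (theta = 0 or 180 deg).
    sin_t = np.clip(np.sin(theta_rad), 1e-9, None)
    return np.abs(np.cos(0.5 * np.pi * np.cos(theta_rad)) / sin_t)

# Sample the pattern from broadside (90 deg) toward the dipole axis.
for deg in (90, 60, 30, 10):
    e = half_wave_dipole_pattern(np.radians(deg))
    print(f"theta = {deg:3d} deg -> relative |E| = {e:.3f}")
```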
In 2019, HFSS began to work with Ansys Cloud and was able to solve a full radio frequency integrated circuit (RFIC). Then, Ansys 2021 R1 saw the release of Mesh Fusion. Matt Commens, principal product manager at Ansys, said in a webinar that initial mesh creation is one of the most critical steps in the finite element method. Mesh Fusion looks at the local geometry of a component or system and targets where to apply the finest meshes and, ultimately, where to budget the most computing power.
Choosing a single mesh technology for an entire system based on the requirements of its different components usually means that some parts get optimum mesh parameters while others are left with a suboptimal arrangement. With Mesh Fusion, however, the different meshing technologies available in HFSS can each be applied to the appropriate components, and the tool then does just what its name says and combines them.
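As an illustration of the per-component idea, here is a toy Python sketch that assumes nothing about the actual HFSS or Mesh Fusion API: each component gets a target element size driven by its own smallest geometric feature, and the per-component settings are collected into one description of the combined mesh. The class and function names are hypothetical.

```python
# Illustrative sketch only: NOT the HFSS/Mesh Fusion API, just the idea that
# each component is meshed to suit its own geometry before being combined.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    smallest_feature_mm: float  # finest geometric detail that must be resolved

def choose_element_size(comp: Component, elements_per_feature: int = 3) -> float:
    """Target element size: resolve the smallest feature with a few elements."""
    return comp.smallest_feature_mm / elements_per_feature

def fuse_meshes(components: list[Component]) -> dict[str, float]:
    """Give each component its own element size instead of one global setting."""
    return {c.name: choose_element_size(c) for c in components}

system = [
    Component("radar_antenna_array", smallest_feature_mm=0.2),
    Component("bumper_fascia", smallest_feature_mm=5.0),
    Component("mounting_bracket", smallest_feature_mm=1.0),
]

for name, size in fuse_meshes(system).items():
    print(f"{name}: target element size ~{size:.2f} mm")
```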
Just as it does for many other fields, Ansys has a robust autonomous sensor development offering. Its current sensor integration work covers cameras, lidar, radar and perception software testing. In addition to the HFSS solvers, SPEOS is used for illumination studies and VRXPERIENCE Sensors lets users build a virtual sensor array to simulate an experience.
One of the case studies available on the HFSS site shows how Autoliv uses the solvers to design automotive components. The radar sensor mounting bracket in the bumper fascia is studied to optimize the sensor’s position and direction. The bumper’s physical design already works within constraints of safety, manufacturability, durability and aesthetics; the software lets designers reconcile the geometric requirements of the bumper with the field-generating requirements of the sensor across multiple designs and iterations.
Oculii Uses AI to Attack Hardware Problems with a Software Solution
Oculii takes traditional radar systems and gives them an upgrade with artificial intelligence (AI). The company says that one of the consistent problems for radar systems has been the need for more antennas to get better sensing results. This is a hardware problem, it says, but there can be a software solution.
Oculii’s Virtual Aperture Imaging paints two pictures, one using traditional radar methods and one using AI. The need for multiple antennas comes from the qualities of conventional radar waves themselves: they are single-frequency, repeating and non-adaptive. Resolution is fixed by the number of antennas, and while that number can grow linearly, the cost, power and size of the hardware grow exponentially. Building more and more antennas into a system quickly reaches the point where adding just one more no longer delivers a good cost-to-benefit ratio.
Moving to AI creates a dynamic waveform that generates different phase responses at any given time. These responses are aggregated into the virtual aperture. The software decides how much resolution and sensitivity is needed at any given moment and adjusts accordingly; this, the company says, is its software solution in practice. By riding Moore’s Law, the goal is that over time the software will offer higher resolution at longer range while becoming more cost-effective.
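A rough back-of-the-envelope sketch shows why expanding the effective aperture in software is attractive. Using the common rule of thumb that a uniform linear array’s angular resolution is roughly wavelength divided by aperture (generic radar math, not Oculii’s algorithm), the beamwidth narrows as the element count grows:

```python
# Rule-of-thumb sketch (not Oculii's method): angular resolution of a uniform
# linear array is roughly wavelength / aperture, so doubling the number of
# physical (or virtual) elements roughly halves the achievable beamwidth.
import math

WAVELENGTH_M = 3e8 / 77e9        # ~3.9 mm at 77 GHz automotive radar
SPACING_M = WAVELENGTH_M / 2     # half-wavelength element spacing

def angular_resolution_deg(num_elements: int) -> float:
    aperture = num_elements * SPACING_M
    return math.degrees(WAVELENGTH_M / aperture)

for n in (4, 8, 16, 64, 192):
    print(f"{n:4d} elements -> ~{angular_resolution_deg(n):5.2f} deg resolution")
```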
Finding a Long and Winding Path to Level 5
SAE’s J3016_202104 standard, “Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles,” defines six levels to frame vehicle autonomy. Level 0 means that a car is completely controlled by the driver, while Level 5 sees the car driven completely autonomously. The automotive industry is currently sitting around Level 2, and arguably Level 3 in some cases, as increasingly advanced driver-assistance systems are added to vehicles. This is an opportunity for Oculii to push its AI into more vehicles.
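For reference, here is a minimal sketch of the six J3016 levels as a Python enum; the level names are paraphrased from the standard and the class itself is purely illustrative.

```python
# Minimal illustrative enum of the SAE J3016 driving-automation levels.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # driver performs the entire driving task
    DRIVER_ASSISTANCE = 1       # steering OR speed support
    PARTIAL_AUTOMATION = 2      # steering AND speed support, driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions, driver on standby
    HIGH_AUTOMATION = 4         # no driver fallback needed within its operating domain
    FULL_AUTOMATION = 5         # drives everywhere, under all conditions

print(SAELevel.PARTIAL_AUTOMATION)  # roughly where much of the industry sits today
```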
Automakers are still deciding what to do with the various sensing systems available. General Motors has announced an investment of millions of dollars in Oculii, and Oculii has a separate agreement with Geely to integrate its technology into the Chinese automaker’s vehicles. Oculii is quick to point out that it does not want to manufacture a product for the OEMs; instead, it wants to license its software to automakers for their autonomous offerings. Meanwhile, Tesla has discounted the possibility of using radar in its vehicles, for now, due to quality issues.
Oculii and Ansys Together
Oculii used HFSS software to optimize antenna and sensor placement, achieving 80 to 90 percent predictive accuracy. Understanding where electromagnetic fields are generated and how strong they are makes the tool Oculii offers its customers stronger. The hope is that the AI-empowered sensors will soon deliver more compact and cost-effective solutions to vehicle manufacturers.
“The way to improve radar technology is through software, because software fundamentally improves with data,” said Steven Hong, CEO of Oculii. “Ansys HFSS has been invaluable because it allows us to make high-fidelity measurements and predictions around how design optimizations will impact real world performance. Using HFSS, we can close design cycles more quickly, and build confidence in what we do. We have also been impressed by Ansys’ support team—they helped us overcome a number of challenges to get the most out of the solution, which helps us get the most out of radar systems.”
The Oculii and Ansys partnership is interesting because it doesn’t follow the usual Ansys simulation case study. This is not just a company using a simulation model early in the design process to make things more cost-effective, stronger or faster. This is an AI software company taking the HFSS tool and weaving it into its software offerings, almost as though each vehicle equipped with the Virtual Aperture technology will have a little piece of Ansys rolling down the road with it.