Manufacturers like BMW are seeing big benefits from GPU-accelerated workstations.
Manufacturers are finding new ways to generate synthetic data and test robotic solutions by combining AI-guided analysis with unstructured data collected from recordings such as video. The data, hardware and software have the potential to do more than assist in automated environments such as automotive assembly lines. They may also enhance human interaction with robots; for example, the information and tools could visualize scenarios for using wearable robotics technology to amplify human strength and endurance.
NVIDIA’s recent collaboration with BMW showcases how GPU-accelerated workstations, such as Dell Precision workstations powered by NVIDIA’s RTX technology, are being used to model, train and test robots for the factory floor, shaving off an enormous amount of time between concept and production.
BMW is in the process of building its newest plant, a 400-hectare facility in Debrecen, Hungary, dedicated entirely to electric vehicle (EV) production. The plant will contain a press shop, body shop, paint shop and assembly area. According to a BMW blog post, it should be completed by 2025 and produce 150,000 vehicles a year. Much of the work will be performed by robots.
“With Omniverse, we’re making factory planning like an e-sport. Officers from the BMW Group’s 31 existing factories around the world can join together to walk through the model of the new factory, in the same 3D model. They meet together in real time to visualize and discuss various use cases, optimize planning and conduct collision checks,” Mike Geyer, Product Manager of Omniverse Manufacturing for NVIDIA, said to engineering.com.
Focal points include improving the passive motion of complex equipment, like the transfer of a car from a sling to a conveyor belt. BMW is also determining how human workers at the plant can maximize efficiency. For example, finding the best spot to locate lineside material will allow operations to run smoothly and safely.
Customizing Metaverse Tools Improves Outcomes
A manufacturer can ensure platforms like Omniverse meet their needs by building custom applications and extensions. The purpose of the customized software is to allow Omniverse to create a more accurate and detailed digital version of the manufacturer’s facilities and operations. BMW has taken this initiative by collaborating with NVIDIA on modifications that improve the primary use case and process flow on the factory floor. Better simulations developed with NVIDIA Isaac Sim, a robotics simulation and synthetic data generation toolkit within the Omniverse platform, allow organizations to train models faster.
One of BMW’s custom Omniverse applications, Factory Explorer, is based on Omniverse’s Universal Scene Description (USD) Composer, a customizable foundation application in Omniverse. BMW used core components of USD Composer and added extensions tailored to the needs of the factory planning team, including finding, constructing, navigating and analyzing factory data. BMW is also making modifications to Omniverse to improve space planning layouts, integrate supplier data, heighten worker safety and map point clouds.
Over the next 10 years, NVIDIA expects manufacturers across all industries to invest trillions of dollars in new factories—and that Omniverse will help that money go further.
“Building a new factory is a three-to-five-year process. Omniverse allows companies like BMW Group to optimize different components in factories and the overall sites before construction,” says Geyer.
It is a challenge to make sure that the factory’s design is flexible enough to incorporate emerging technologies. Typically, the biggest driver of changes to an automotive production facility is engineering changes to the vehicle. BMW meets regularly to discuss how altering vehicle designs could affect material flow at Debrecen. Team members typically ask questions such as whether it is necessary to add a robot to the floor to perform an additional task.
BMW can also use Omniverse to explore the likelihood of collisions between robots, or between robots and people. The platform can also show how adding more machinery would affect cycle time for vehicle production.
How AI Changes the Roles of Engineers
The advancement of AI software helps a wide variety of engineers by allowing them to concentrate on defining the problem space. This is a higher-level task than solving each problem that a factory faces. With AI, the role of a manufacturing engineer who designs and operates a manufacturing system will increasingly come to involve setting up the constraints and creating prompts.
For example, many tools now used in automotive manufacturing are fixed tools. These are permanently sited at designated spots. Manufacturing engineers currently have to figure out how to improve operations with tools that do not move.
With AI, manufacturing engineers can assign this work to the algorithm. The engineer starts by defining constraints for operation of the fixed tool, such as scope, time, cost and risk, using their company’s desired parameters for each factor. They then determine which data to feed into the AI algorithm, such as video of virtual interactions of fixed tools in a plant. With repeated iterations, the algorithm refines its results.
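The workflow above can be sketched in code. This is a minimal, hypothetical illustration of encoding constraints (time, cost, risk) and letting a search routine, standing in for the AI algorithm, pick a feasible tool configuration. All names and figures are invented for the example and do not reflect any actual BMW or NVIDIA tooling.

```python
# Hypothetical sketch: the engineer encodes constraints for a fixed
# tool, then a search routine scores candidate configurations against
# them. Data and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Constraints:
    max_cycle_time_s: float   # time budget per operation
    max_cost_eur: float       # cost ceiling for the configuration
    max_risk_score: float     # acceptable risk, 0 (none) to 1 (high)

@dataclass
class Candidate:
    name: str
    cycle_time_s: float
    cost_eur: float
    risk_score: float

def feasible(c: Candidate, limits: Constraints) -> bool:
    """A candidate is feasible only if it meets every constraint."""
    return (c.cycle_time_s <= limits.max_cycle_time_s
            and c.cost_eur <= limits.max_cost_eur
            and c.risk_score <= limits.max_risk_score)

def best_candidate(candidates, limits):
    """Return the fastest feasible candidate, or None if none qualify."""
    ok = [c for c in candidates if feasible(c, limits)]
    return min(ok, key=lambda c: c.cycle_time_s) if ok else None

limits = Constraints(max_cycle_time_s=60.0, max_cost_eur=250_000,
                     max_risk_score=0.3)
candidates = [
    Candidate("weld-cell-A", 55.0, 240_000, 0.25),
    Candidate("weld-cell-B", 48.0, 310_000, 0.20),  # over budget
    Candidate("weld-cell-C", 62.0, 200_000, 0.10),  # too slow
]
print(best_candidate(candidates, limits).name)  # weld-cell-A
```

In a real deployment, the scoring step would be driven by simulation data rather than hand-entered numbers, but the engineer's role, setting the constraints rather than solving each placement by hand, is the same.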
All of these actions free up the engineer’s time to design a flexible tool. This machinery will later take the place of the fixed tool. When the engineer has perfected the flexible tool, they can input it into Omniverse. This will allow them to see how the flexible tool may alter cycle time and safety needs.
The improvement of machine vision, coupled with AI, will further accelerate and optimize factory planning, development and operations. Right now, machine vision assists automotive manufacturers with tasks such as quality and defect inspection and product identification. With better machine vision, jigs could perform tasks such as grabbing a piece of sheet metal and moving it into exactly the right position. This will allow the factory to produce different models on the same line.
Machine vision needs more refinement, both in real life and in simulations developed with platforms like Omniverse. Current technology can tell the difference between a coffee mug and a soup bowl. It can visually inspect the alignment between a car’s chassis or frame and its body (the outer shell) at the point of insertion. However, machine vision cannot yet easily spot changes that involve geometry.
For example, say a body panel has been dented, rendering it defective. Current machine vision struggles to spot the defect and determine whether the panel is still usable.
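The gap between image recognition and geometric inspection can be illustrated with a small sketch: instead of classifying a photo, geometry-aware inspection compares a scanned height map of the panel against its reference surface and flags points that deviate beyond tolerance. The flat-panel data and the 0.5 mm tolerance here are invented for the example; a production system would compare real 3D scan data against the CAD model.

```python
# Hypothetical sketch of geometry-aware inspection: flag a dent when a
# scanned surface deviates from the reference beyond tolerance.
# All data and the tolerance value are illustrative.
import numpy as np

def find_dents(reference: np.ndarray, scan: np.ndarray,
               tolerance_mm: float = 0.5) -> np.ndarray:
    """Return a boolean mask of points deviating beyond tolerance."""
    deviation = np.abs(scan - reference)
    return deviation > tolerance_mm

reference = np.zeros((4, 4))   # idealized flat panel section (heights in mm)
scan = reference.copy()
scan[1:3, 1:3] -= 1.2          # simulated 1.2 mm dent over a 2x2 region

mask = find_dents(reference, scan)
print(int(mask.sum()))  # 4 points flagged as out of tolerance
```

The point of the example is that the decision rests on measured shape, not on whether an image classifier has seen enough dented panels, which is exactly the capability Geyer describes wanting from the next generation of machine vision.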
“What we want to do with machine vision is model thousands of permutations of 3D models. We want to be able to see many changes in shapes, not just work with basic image recognition,” says Geyer.
Improvements in machine vision should also assist electronics manufacturers. With heightened machine vision, Omniverse could help enterprises to simulate contact-rich assembly processes like threading a bolt on a nut or grasping small pieces for pick and place. Then the AI algorithms could find ways for robots to improve their performance. This would allow companies to reduce worker stress and fatigue and speed time to market.
Incorporating Lift-Assist Devices and Cobots
Going forward, BMW is looking at using Omniverse to visualize how lift-assist devices and collaborative robots (cobots) will change automotive production. This will require engineers to input human-related data into Omniverse. Examples include parameters to ensure ergonomic norms are met and accommodations of devices for different body types. Grasping how human-related technologies will alter factory operations is expected to require an enormous amount of evaluation.
“Giving the workers ‘super powers,’ more strength, more endurance, means engineers will have to program the AI algorithm in a multitude of new ways. For example, if a worker tries to lift more than they can handle while wearing a lift-assist device, how should a robot interacting with them respond?” says Geyer.
One of the advantages of Omniverse is that it enables simulations that should give companies and national governments confidence that conditions at a factory will be safe for workers. Each country is likely to have unique compliance rules. Visual simulations that incorporate country-specific regulations can help a company share its vision and progress.
As automotive manufacturers, particularly in the field of EV production, complete new factories, they have an incentive to move toward more open communication and collaboration. Cooperation can discourage “twin drift,” the phenomenon of virtual models gradually diverging from real-world conditions.
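Twin drift can be monitored quantitatively. The sketch below, a made-up example rather than any vendor’s actual tooling, compares the cycle times a digital twin predicts against measurements from the factory floor and flags stations whose relative deviation exceeds a threshold, signaling that the twin needs recalibration.

```python
# Illustrative sketch of a "twin drift" check: flag stations where
# measured cycle times have drifted too far from the digital twin's
# predictions. Station names, times and threshold are invented.

def drift_ratio(predicted_s: float, measured_s: float) -> float:
    """Relative deviation of the real process from the twin's prediction."""
    return abs(measured_s - predicted_s) / predicted_s

def stations_needing_recalibration(twin: dict, floor: dict,
                                   threshold: float = 0.05) -> list:
    """Return stations whose drift exceeds the threshold (default 5%)."""
    return [name for name, predicted in twin.items()
            if drift_ratio(predicted, floor[name]) > threshold]

twin  = {"press": 42.0, "body": 58.0, "paint": 75.0}  # predicted seconds
floor = {"press": 42.5, "body": 63.0, "paint": 75.8}  # measured seconds

print(stations_needing_recalibration(twin, floor))  # ['body']
```

Keeping such a check running continuously is one way the shared, enriched dashboards Geyer describes could keep virtual models honest against the real factory.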
“It’s important for companies that work at the edge like the factory floor to share insights internally and externally. Creating an enriched, immersive dashboard has the power to take predictive maintenance to the next level,” says Geyer.
As an example, say automotive manufacturers in a certain area noted a power spike and shared how they were affected. They could then determine what machines were damaged and work together to get replacements quickly. Geyer says all manufacturers have a common need, which is to determine how AI can help them.
“If a company does not yet know how to apply generative or predictive AI, that’s where its peers could share ideas about potential. The common ground is the hunger to do things differently, using more automation and intelligence. These companies know that now is the time to make that change,” says Geyer.