Start warming up your vocal cords—natural language processing may soon change how engineers interact with design software.
Every engineer dreams of having a virtual personal assistant like Jarvis, the disembodied voice that carries out Iron Man’s orders. Smart assistants like Apple’s Siri and Amazon’s Alexa are a step in the right direction, but they can’t help an engineer design a new car.
Or can they? Recent progress in voice-controlled design software hints that Jarvis may be just around the corner.
Natural Language Processing (NLP) Comes to Design Software
Natural language processing, or NLP, is a subfield of artificial intelligence (AI) that allows computer programs to understand human language as it is spoken and written. NLP has progressed in leaps and bounds since the advent of the internet, and by now most of us take it for granted that our smart assistants can decipher what we say and respond appropriately (most of the time, anyway).
You’ve probably never asked Siri to create a 3D model of an F-16, but an AI research project called CLIP-Forge wants to change that. Developed by researchers at Autodesk’s AI Lab, CLIP-Forge generates 3D models of objects as specified by users in plain language: an F-16, a monster truck, a circular table. For now, the models are rudimentary, made of coarse voxels and lacking texture. But it’s a start.
Another example of NLP enhancing the design experience is NVIDIA’s Project Mellon, an attempt to improve the usability of 3D modeling software through voice control. In a design review, managers and others who may not be intimately familiar with the software can use their voice to tour a model and make changes. For example, while reviewing a sports car, they could command the software to change the color of the exterior paint, open and close the car doors, or modify the interior—all without having to dig through various menus.
Autodesk is also collaborating with Microsoft to add a natural language interface to Maya for 3D workflows. It will allow users to make changes with natural text input such as “Duplicate bot0 as bot2 and move the copy to its left” or “make torso2 pink,” saving time for beginner and expert users alike.
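To make the idea concrete, here is a rough, hypothetical sketch of what such a mapping might look like behind the scenes, using Maya’s existing Python API (maya.cmds). The parsing step is omitted and the hard-coded values simply stand in for whatever a language model would extract from the request; this is not how the Autodesk and Microsoft integration actually works.

```python
# Hypothetical sketch: translating "Duplicate bot0 as bot2 and move the copy
# to its left" and "make torso2 pink" into Maya Python (maya.cmds) calls.
# The fixed values below stand in for the output of an NLP parser.
import maya.cmds as cmds

def duplicate_and_offset(source, copy_name, offset_x):
    """Duplicate `source` as `copy_name` and shift the copy along X."""
    cmds.duplicate(source, name=copy_name)
    cmds.move(offset_x, 0, 0, copy_name, relative=True)

def tint_object(obj, rgb):
    """Assign a simple colored Lambert shader to `obj`."""
    shader = cmds.shadingNode("lambert", asShader=True, name=obj + "_tint")
    cmds.setAttr(shader + ".color", rgb[0], rgb[1], rgb[2], type="double3")
    cmds.select(obj)
    cmds.hyperShade(assign=shader)

duplicate_and_offset("bot0", "bot2", offset_x=-5.0)   # "move the copy to its left"
tint_object("torso2", (1.0, 0.6, 0.8))                # "make torso2 pink"
```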
There are also a variety of third-party plug-ins for different design applications that allow users to execute basic functions with voice control, such as showing/hiding items, drawing basic shapes, zooming in or out, or mirroring objects.
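A minimal dispatcher of this kind is not hard to imagine. The sketch below uses the open-source SpeechRecognition package for Python; the handlers are placeholders standing in for calls into a CAD application’s API, and the phrase matching is deliberately simplistic.

```python
# Minimal voice-command dispatcher sketch, in the spirit of the plug-ins
# described above. Requires the SpeechRecognition package (and PyAudio for
# microphone input); the handlers are placeholders for real CAD API calls.
import speech_recognition as sr

def zoom_in():
    print("zooming in")        # placeholder for the real CAD call

def hide_selection():
    print("hiding selection")  # placeholder for the real CAD call

COMMANDS = {
    "zoom in": zoom_in,
    "hide selection": hide_selection,
}

def listen_once():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    phrase = recognizer.recognize_google(audio).lower()
    handler = COMMANDS.get(phrase)
    if handler:
        handler()
    else:
        print("Unrecognized command:", phrase)

if __name__ == "__main__":
    listen_once()
```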
Truly Intuitive 3D Design
There are many ways NLP-enabled systems can make design work more efficient and effective. Natural language input provides designers with a highly intuitive way to interact with complicated software. Using voice commands for common functions reduces the clutter of dialog boxes and could save designers time.
Voice tools can also facilitate collaboration between designers. It’s impractical to share a keyboard and mouse, but two voices could work together on a design much more easily. This can also enable remote collaboration by allowing team members to work on a project together even from different locations. Voice is also an excellent tool for creating annotations on a design that can be recorded and transcribed for collaborators to access. Adding, removing, querying, or updating information in a design database can also be accomplished with voice commands.
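As a simple illustration of that last point, the sketch below stores a transcribed voice note against a part in a design database. The schema, part names, and wording are invented for illustration; a real PLM/PDM system would expose its own API instead.

```python
# Hypothetical sketch: saving a transcribed voice annotation to a simple
# design database. Schema and part IDs are made up for illustration.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("design_notes.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS annotations (
           part_id TEXT,
           author  TEXT,
           note    TEXT,
           created TEXT
       )"""
)

def add_annotation(part_id, author, transcribed_note):
    """Insert a transcribed voice note for a given part."""
    conn.execute(
        "INSERT INTO annotations VALUES (?, ?, ?, ?)",
        (part_id, author, transcribed_note, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

def notes_for_part(part_id):
    """Query all annotations attached to a given part."""
    return conn.execute(
        "SELECT author, note, created FROM annotations WHERE part_id = ?",
        (part_id,),
    ).fetchall()

add_annotation("bracket_7", "reviewer", "Wall thickness looks too thin near the bolt holes.")
print(notes_for_part("bracket_7"))
```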
Design software can be difficult for beginners to navigate. Voice control lowers the barrier to entry, allowing non-experts to use the software without the need to browse hundreds of icons, memorize menu scripts, or switch between various command panels.
The Future of Voice-Controlled Design Software
Unlike computer languages, which are highly structured and precise, natural language is inherently ambiguous and can be interpreted in many ways depending on context. This makes it difficult to design software that can understand and respond correctly to natural language commands. To overcome this challenge, NLP algorithms need to be highly sophisticated and consider various factors such as context and user intent.
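A toy example shows why context matters. In the rule-based sketch below, a command like “make it bigger” can only be resolved if the system remembers what was last selected; the object names and phrasing are hypothetical, and a production system would use a trained language model rather than string matching.

```python
# Simplified, rule-based sketch of contextual interpretation: the pronoun
# "it" in "make it bigger" only makes sense if the session remembers what
# was last selected. Object names are hypothetical.
class DesignSession:
    def __init__(self):
        self.last_selected = None   # conversational context

    def handle(self, command):
        command = command.lower().strip()
        if command.startswith("select "):
            self.last_selected = command.removeprefix("select ").strip()
            return f"selected {self.last_selected}"
        if "make it bigger" in command:
            if self.last_selected is None:
                return "Which object do you mean?"   # ask for clarification
            return f"scaling {self.last_selected} up by 10%"
        return "Sorry, I didn't understand that."

session = DesignSession()
print(session.handle("make it bigger"))       # ambiguous: no context yet
print(session.handle("select door_panel"))
print(session.handle("make it bigger"))       # now "it" resolves from context
```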
But NLP is getting better by the day. While early examples apply voice to basic commands and rudimentary models, improvements to the technology could make much more complex instructions available via natural language. For example, imagine using your voice to control generative design: “Jarvis, make this bracket 15 percent lighter but make sure it can withstand a load of 500 pounds.” It will be like talking with your very own personal design assistant.
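One hypothetical way to picture that request is as a translation from free-form speech into structured constraints a generative design solver could consume. In the sketch below, the regular expressions and parameter names are invented purely for illustration.

```python
# Hypothetical sketch: parsing "make this bracket 15 percent lighter but make
# sure it can withstand a load of 500 pounds" into structured constraints.
# The regexes and parameter names are invented for illustration only.
import re

def parse_design_request(text):
    constraints = {}
    lighter = re.search(r"(\d+(?:\.\d+)?)\s*percent lighter", text)
    if lighter:
        constraints["mass_reduction_target"] = float(lighter.group(1)) / 100.0
    load = re.search(r"withstand a load of\s+(\d+(?:\.\d+)?)\s*pounds", text)
    if load:
        constraints["min_load_lbf"] = float(load.group(1))
    return constraints

request = ("make this bracket 15 percent lighter but make sure "
           "it can withstand a load of 500 pounds")
print(parse_design_request(request))
# -> {'mass_reduction_target': 0.15, 'min_load_lbf': 500.0}
```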
Having voice control integrated into design software is already of immense benefit. While there are many frontiers that still need to be conquered, as they say in natural language, the sky is the limit.