How hardware fits into your AI journey

Engineers have always expected high performance from their workstations, and that’s good news for AI adoption.

Lenovo has sponsored this post.

(Image: Technology Innovation Institute.)

Artificial intelligence is growing rapidly within the engineering space, and AI functionality is quickly becoming ubiquitous in applications from design to manufacturing and beyond. The utility and potential of AI mean that companies of all sizes need to invest in the technology to stay ahead of the curve. Since AI relies heavily on computing power, investment in computing hardware is key. Fortunately, the familiar engineering workstations used for CAD, BIM and CAE are evolving to accommodate the computing demands of AI.

Engineering software for CAD, CAE and more is already seeing AI enhancements. In CAD, machine learning can predict designers’ needs and suggest tools or features to use, while generative design tools help streamline design iteration. In CAE and simulation, AI models can augment traditional solvers (and may one day replace them), reducing solve times and accelerating tedious tasks such as data preparation and meshing.


Large language model (LLM) chatbots are also appearing in engineering software of all stripes, answering users’ technical questions or acting as “AI copilots” for tasks like writing G-code or designing schematics.

Some engineering companies are developing their own AI tools and workflows that leverage proprietary data to deliver results tailored to their needs. But many companies don’t have the budget or hardware to develop their own machine learning models, which can involve billions of parameters and demand intensive compute.

“You should never train a model just once,” says Mike Leach, senior manager for Lenovo workstations. “You need to constantly fine-tune or train to make sure it’s accurate, that it’s up to date and learns as it goes.”

The Lenovo ThinkStation P7, pictured here, is “the world’s fastest and most powerful workstation for AI workloads,” according to Leach. (Image: Lenovo.)

The solution might be a pre-trained model, such as Meta’s Llama 3 LLM or the open-source Falcon LLM from the Technology Innovation Institute, which can then be customized with a company’s own data.
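As a rough illustration of what that customization path can look like in practice (this sketch is not from the article, and the model ID, prompt and settings are illustrative placeholders), an open pre-trained model such as Falcon can be loaded on a workstation with the Hugging Face Transformers library and then prompted or fine-tuned against a company’s own data:

# Hypothetical sketch: load an open pre-trained LLM locally so it can be
# customized on a workstation. Model ID, prompt and generation settings
# are assumptions for illustration, not from the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-7b"  # Falcon, released by the Technology Innovation Institute

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread model layers across the workstation's available GPUs
    torch_dtype="auto",  # use the precision stored in the checkpoint
)

prompt = "List three checks to run before releasing a CAD assembly for manufacturing:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

From a starting point like this, lightweight fine-tuning libraries such as Hugging Face PEFT can adapt the base model to proprietary data without retraining all of its billions of parameters.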

All of these AI applications need powerful hardware. The GPU is key: today’s top-of-the-line GPUs from providers such as NVIDIA and AMD offer dedicated AI processing cores alongside the compute power needed for 3D modeling and rendering. Many engineering workstations can combine multiple GPUs for greater throughput, and they offer advantages over cloud resources because desktop hardware can be configured with the latest-generation technologies and the fastest processor clock speeds.
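To make that hardware dependence concrete, here is a minimal sketch (not from the article) of how an AI framework such as PyTorch enumerates the GPUs a workstation exposes before training or inference work is distributed across them; a multi-GPU configuration is assumed:

# Hypothetical sketch: enumerate the CUDA GPUs a workstation exposes to PyTorch
# before deciding how to distribute AI training or inference across them.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1e9:.1f} GB of memory")
else:
    print("No CUDA-capable GPU detected; work would fall back to the CPU.")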

While workstations are critical on their own, much of their value comes from being part of a larger hardware ecosystem. Leach points to the hybrid AI concept, in which workstation hardware, on-premises servers and cloud infrastructure work together to deliver enterprise AI solutions. Lenovo offers a wide portfolio of ThinkSystem servers and ThinkStation workstations that can be deployed as part of an organization’s hybrid AI ecosystem. These AI-optimized platforms are certified for NVIDIA AI Enterprise software and enable organizations to develop custom LLMs and generative AI applications and to deploy production AI across their systems.

Though the industry is still learning how AI will reshape the engineering space, it is clear that AI is here to stay and to grow. AI is a computing revolution, and companies that begin investing in AI-capable hardware now will stand the best chance of success. With workstations more powerful and versatile than ever before, there are plenty of options to suit every business’s needs.

“AI is a journey, not a destination,” Leach says.

For an in-depth look at hardware for AI in engineering, check out Lenovo’s white paper Workstations for AI in the Modern Engineering Workflow.