Generative Design Pushes Workstation Performance

The design process requires the right hardware to support higher data and simulation workloads.

The Dell Precision 5490 mobile workstation was introduced earlier this year as one of several new workstations in the Precision line with specific AI optimization technology, including a neural processing unit. Image courtesy of Dell Technologies.


The process known as generative design (GD) has been around for many years, primarily as an early design exploration tool for geometry optimization and aesthetics based on selected constraints. In recent years, its use has grown as vendors added artificial intelligence (AI) capabilities, making the software more powerful.

Today, it is not uncommon for new projects to start with GD even before specifications are set. This is especially true if the product is produced using additive manufacturing (AM), if there is a high aesthetic element, or if the part or product will require extensive CAE work.

Autonomously and Optimally Yours

As CAD vendors recognized the value of GD, senior management did the traditional buy-or-build math. Autodesk’s architecture, engineering and construction (AEC) division bought its way into GD, for example, but then the mechanical CAD unit borrowed and adapted from it. PTC bought the fast-growing GD startup Frustum in 2018, and now includes the technology in its Creo platform.

GD’s roots go back to topology optimization (TO), but the two are not synonymous. TO is used to converge on a single solution using functional objectives. By contrast, GD creates and compares many possible solutions, then sorts through them for the best examples that meet both functional and nonengineering requirements.

As researchers from the University of Florence note, TO is about removing material from the design volume, while GD “maintains the possibility of adding material, and generally, to deviate from the initial [starting shape] provided.”
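To make the distinction concrete, here is a minimal Python sketch of the GD side of that comparison: generate many candidate designs, then rank them against a blend of functional and nonengineering criteria. The scoring functions and candidate generator are hypothetical placeholders for illustration, not any vendor's actual algorithm.

```
import random

# Hypothetical placeholder scores: one functional objective
# (e.g., hitting a mass target) and one nonengineering objective (cost).
def functional_score(design):
    return -abs(design["mass"] - 1.0)   # prefer mass near a target value

def nonengineering_score(design):
    return -design["cost"]              # prefer cheaper designs

def random_candidate():
    # Stand-in for a geometry generator; real GD tools vary topology and shape.
    return {"mass": random.uniform(0.5, 2.0), "cost": random.uniform(10, 100)}

# GD-style exploration: create many candidates, then rank them on a
# combination of functional and nonengineering requirements.
candidates = [random_candidate() for _ in range(500)]
ranked = sorted(
    candidates,
    key=lambda d: functional_score(d) + 0.01 * nonengineering_score(d),
    reverse=True,
)
top_designs = ranked[:10]   # present several alternatives to the designer
print(top_designs[0])
```

A TO-style loop, by contrast, would iteratively remove material from one starting volume until a single converged shape remains.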

Some researchers, including Frustum founder Jesse Coors-Blankenship, now senior vice president of technology at PTC, refer to the merging of GD and AI as augmented reasoning. Thinking of GD as augmenting the design process, he says, optimizes “designs for multiple objectives simultaneously” while “providing a designer with several novel design alternatives, which enable companies to substantially reduce engineering cycles.”

The software augments the designer’s expertise by optimizing multiple objectives simultaneously, offering up several novel alternatives. AI can also create and evaluate unintuitive constraint options, which an experienced engineer might know but someone with less experience would not consider.

Great Responsibility Requires Great Power

GD is much more computationally intensive than traditional two- or three-dimensional design work. A computer that might be fine running AutoCAD or SOLIDWORKS can slow to a crawl when GD is added. Also, most CAD vendors offer GD features as a separate software module. It may look like part of your basic Creo or Fusion 360 desktop, but behind the scenes it is generating extra work that must be processed either locally or in tandem with cloud computing. Either way, there is a mountain of computation and data movement taking place.

Part of that data is being created by new synergies between the various forms of simulation.

Prith Banerjee, chief technology officer of Ansys, says we are entering a new era of digital engineering.

“Digital engineering is how you engineer future products, which includes hardware, mechanical and electrical parts, as well as software,” he says.

The recent acquisition of Ansys by Synopsys is one example of this trend. Another is the new approach to providing software tools in the AI era. Ansys SimAI “lets you take any of our black-box solvers and train an AI model,” Banerjee notes. “Once trained, it will run a simulation hundreds of times faster.”
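The underlying idea is that of a surrogate model: train a fast regression model on input/output pairs generated by a slow solver, then query the model instead of rerunning the solver. The sketch below illustrates that concept with scikit-learn; the solver function and data are hypothetical placeholders, not the SimAI API.

```
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def slow_solver(params):
    # Placeholder for an expensive black-box simulation.
    x, y = params
    return np.sin(x) * np.cos(y) + 0.1 * x * y

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 2))        # sampled design parameters
Y = np.array([slow_solver(p) for p in X])     # expensive training data

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X, Y)

# Predictions on new design points cost a tiny fraction of a full solve.
new_points = rng.uniform(-3, 3, size=(5, 2))
print(surrogate.predict(new_points))
```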

NASA uses GD software for a variety of applications. NASA Research Engineer Ryan McClelland shows a structural mount designed using GD and 3D printed in titanium. Image courtesy of NASA.

“Employing AI and machine learning in CAE not only enables process automation but also accelerates the development of simulation tools accessible to non-experts,” says Jon Peddie, president of Jon Peddie Research. “New business models are emerging to transform product development processes.”

“Every company that wants to remain competitive will have to implement AI in some way, and AI PCs will be central to that,” says Sam Burd, president, client solutions group at Dell Technologies. “From running complex AI workloads on workstations to using day-to-day AI-powered applications on laptops, the AI PC will be an important investment that pays dividends in productivity and paves the way to a smarter, more efficient future.”

Adding new technology such as AI-enhanced GD to the engineering workload has a spillover effect: more advanced initial solutions drive the need for more advanced simulation of the proposed design.

Zihan Wang, high-tech strategy and operations manager for NVIDIA, notes that as AI speeds up processing, engineers will expand the use of simulation.

“With AI-embedded HPC [high-performance computing], you can process more data points to make a better decision,” says Wang. “If you want to simulate the motion of granular materials in the mining industry, like particles on a conveyor belt in a mixer, double-precision [graphics processing units (GPUs)] are required to speed up the simulation.”

Workstation vendors are lining up with new models to address these computational challenges. For example, in addition to announcing several new models in its Precision workstation line, earlier this year Dell announced it is working with NVIDIA to introduce a rack-scale high-density liquid-cooled architecture based on the NVIDIA Grace Blackwell superchip.

Every Component Counts

When considering the purchase of engineering workstations that will be used often or primarily for GD, keep in mind that every subsystem will come under additional stress. For example, HP recommends the following as a minimum configuration for any workstation running generative design in either product design or architecture, engineering and construction:

• 64-bit, 6- to 12-core processor running at a minimum of 2.6GHz

• 32GB or 64GB of RAM

• 512GB to 2TB NVMe storage

• Mid- to high-end graphics card certified for the application(s)

What follows is a system-by-system overview of what needs to be considered.

Ansys Discovery is a simulation-driven design tool that differs from other CAD products in that it treats generative design and topology optimization as one and the same in practical use. Image courtesy of Ansys.

Central Processing and Graphics Processing

The CPU and GPU are collaborators in GD. Many generative solutions divide the workload between the CPU and the GPU. In addition, many AI products are now written to take advantage of GPUs. A mainstream GPU that runs typical CAD just fine may not have the capability to run GD at a respectable speed.

Altair and NVIDIA have published benchmark data showing an 8x speed improvement for training physicsAI models when using an NVIDIA RTX A4000 GPU, compared to an eight-core laptop CPU.
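To see why the gap is so large, one might time the same small training workload on CPU and then on GPU. The sketch below is an illustrative PyTorch timing harness under assumed model and batch sizes, not Altair's benchmark.

```
import time
import torch
import torch.nn as nn

def train_once(device):
    # Small dense network and synthetic data; sizes are arbitrary assumptions.
    model = nn.Sequential(nn.Linear(256, 512), nn.ReLU(), nn.Linear(512, 1)).to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(8192, 256, device=device)
    y = torch.randn(8192, 1, device=device)
    start = time.time()
    for _ in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    if device.type == "cuda":
        torch.cuda.synchronize()   # wait for queued GPU work before timing
    return time.time() - start

cpu_time = train_once(torch.device("cpu"))
print(f"CPU: {cpu_time:.1f}s")
if torch.cuda.is_available():
    gpu_time = train_once(torch.device("cuda"))
    print(f"GPU: {gpu_time:.1f}s ({cpu_time / gpu_time:.1f}x faster)")
```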

Memory and Storage

Datasets can become gigantic when using GD. The workstation must store not just one design in progress, but hundreds of them as the GD algorithms bounce, so to speak, from one design to the next, looking for the optimal solution. This puts extra stress on both the RAM and storage. A total of 64GB of high-speed RAM provides enough space for typical applications. The more RAM available, the larger the design space GD and simulation software can explore simultaneously, leading to faster convergence on a solution.
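A back-of-envelope calculation shows how quickly the working set grows. All the figures below are illustrative assumptions, not measurements from any specific tool.

```
# Rough memory estimate: how many candidate designs can stay resident in RAM?
bytes_per_element = 64              # nodes, connectivity, result fields per mesh element (assumed)
elements_per_candidate = 2_000_000  # assumed mesh size per candidate
candidates_in_flight = 100          # assumed number of designs held at once

total_gb = bytes_per_element * elements_per_candidate * candidates_in_flight / 1e9
print(f"Approximate working set: {total_gb:.1f} GB")
# Roughly 12.8 GB for the candidates alone, before the OS, the CAD session,
# and solver scratch space -- which is why 64GB is a comfortable floor.
```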

Many GD workflows involve handling a large number of small files. NVMe (nonvolatile memory express) storage is more efficient in managing this kind of data than traditional storage solutions such as SATA solid-state drives or hard disk drives.

A software process known as checkpointing is common in AI workflows. The model’s state is periodically saved to allow resuming from that point if needed. NVMe storage offers higher write speeds, enabling faster checkpointing.
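In code, checkpointing typically amounts to writing the model and optimizer state to disk at a fixed interval so a long job can resume after an interruption. The following is a minimal sketch assuming a PyTorch training loop; the model, interval and filenames are placeholders.

```
import torch
import torch.nn as nn

model = nn.Linear(128, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(1, 1001):
    x = torch.randn(64, 128)
    loss = nn.functional.mse_loss(model(x), torch.zeros(64, 1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    if step % 250 == 0:   # checkpoint every 250 steps (assumed interval)
        torch.save(
            {"step": step,
             "model": model.state_dict(),
             "optimizer": optimizer.state_dict()},
            f"checkpoint_{step}.pt",
        )
# Faster NVMe write speeds shorten the pause each of these saves causes.
```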

Cooling

All this extra algorithmic dancing generates a substantial heat load. A robust cooling subsystem is a necessity, not a luxury. There are three types of cooling subsystems: air, liquid and hybrid. Liquid cooling offers superior heat dissipation compared to air cooling. Hybrid solutions include self-contained liquid GPU coolers, immersion systems that submerge components in dielectric fluid, and direct-to-chip liquid cold plates.

Motherboard Throughput

The GD algorithms work with the CAD algorithms and other software modules. The newest peripheral component interconnect express (PCIe) bus standards are designed for these intensive workloads. It takes years from the release of a PCIe specification to its widespread adoption by workstation vendors. PCIe versions 3 and 4 are currently shipping on new workstations. Version 4 provides up to 16 giga-transfers per second (GT/s) per I/O lane, or roughly 32GB/second across a 16-lane connection. The latest specification is PCIe 6, introduced in 2022; the first workstations using this version are expected later this year. Version 6 offers 64 GT/s per lane. New PCIe specifications are always backwards compatible, allowing existing devices to work in workstations built to the newer standard.
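As a rough guide to what those transfer rates mean in practice, the sketch below converts per-lane GT/s figures into approximate one-direction bandwidth for a 16-lane slot. The encoding-efficiency values, particularly the PCIe 6 figure, are approximations.

```
# Rough conversion from per-lane transfer rate to usable one-direction
# bandwidth on a 16-lane slot. PCIe 3-5 use 128b/130b encoding; PCIe 6
# uses PAM4 signaling with FLIT framing (efficiency approximated here).
generations = {
    "PCIe 3": (8,  128 / 130),
    "PCIe 4": (16, 128 / 130),
    "PCIe 5": (32, 128 / 130),
    "PCIe 6": (64, 0.98),       # approximate FLIT efficiency
}

lanes = 16
for name, (gt_per_s, efficiency) in generations.items():
    gb_per_s = gt_per_s * lanes * efficiency / 8   # bits -> bytes
    print(f"{name}: ~{gb_per_s:.0f} GB/s per direction (x{lanes})")
# PCIe 4 works out to roughly 32 GB/s, matching the figure cited above.
```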

Visualization Issues

GD is often used in combination with contemporary visualization equipment, including virtual reality or augmented reality. Acquiring the hardware for these visualization solutions is generally not a part of the workstation specification process, but should be considered to make sure there are no bottlenecks from idea to visualization due to a slow subsystem.




About the Author

Randall Newton

Randall S. Newton is principal analyst at Consilia Vektor, covering engineering technology. He has been part of the computer graphics industry in a variety of roles since 1985.
