In today’s environment of doing more with less, designers and engineers are constantly pressured to increase productivity, especially in small- and medium-sized businesses where a few people perform the tasks of many. As a result, simulation becomes not only a productivity tool, but also a competitive advantage, and as some of our executives put it, a competitive necessity.

“Simulation is not only a necessity, it is a do-or-die requirement for survival in many industries today,” said Dale Berry, Senior Director of SIMULIA Product Experience Technical for Dassault Systèmes SIMULIA. “What chance does a smaller company have to compete? They can compete well through faster innovation. The key to faster and cheaper innovation is understanding which design alternative to go with, which ones to avoid, how to improve that choice, and how to ensure it will work the first time. All of these questions can be answered through simulation.”

Meeting demands of tight schedules, increasing product variety, and demands for higher quality is impossible without utilizing simulation to speed up the process, according to Jim Rusk, Senior Vice President of Product Engineering Software for Siemens PLM Software. “While these demands may originate with larger organizations, they quickly cascade to small- and medium-sized companies who are part of the value chain. The more complex the product, the more critical it is for the company to invest in simulation technology,” he added.

Through simulation, analysts and designers can establish a smoother communication process as well, said Barry Christenson, Director of Product Management for ANSYS. “Simulation, once it has been properly democratized through customization and automation, is becoming the common language between analysts and designers.” But while small- and medium-sized companies have adopted simulation, not all are getting the full benefit of it, he added. “Those skeptical about the real impact of simulation use it mostly during the last stages of product development. As a result, the recommended changes may be too complex to carry out by deadline.”

A key to adoption for many companies is ease of use. If engineers are faced with steep learning curves, they are less likely to embrace simulation as a productivity tool. Bruce Klimpke, Technical Director for Integrated Engineering Software, believes the complexity of simulation software has decreased to the point that it is now less of a factor. “If you are designing products and are not using simulation tools, chances are your designs are not optimal. Given the decreasing learning curves to do simulation, and the cost of computing, businesses of any size have to do simulation,” he stated.

There are three main reasons to use simulation, according to Svante Littmarck, President and CEO of COMSOL, Inc. First is “decreased costs, because designers can do in one month what would otherwise have taken two or several months, increased revenue because time to market is much shorter, and [third] every now and then, the designer will be able to do something completely new that no one else did before.”

Physical Testing and Simulation

While companies understand that simulation has to be part of their process, it becomes more important to smaller businesses because physical testing costs can be prohibitive. Late-stage redesign efforts or product field failures pose significant financial risks to a program and the company, explained Mike Kidder, Senior Vice President of Corporate Marketing for Altair. “Exploring and evaluating design concepts, material choices, what-if scenarios, and methods of manufacture through simulation is far more efficient and cost effective than a ‘make and break’ development process.”

Although reducing “make and break” through simulation has benefits in many industries, the question of completely eliminating physical prototyping and testing raises quality and safety issues. “Simulation will enable dramatically less testing and can give manufacturers the confidence to go to production tooling before a physical test ever takes place,” said Dominic Gallello, President and CEO of MSC Software. “What will happen more and more is that certain aspects of certification will happen by virtual testing, but we are a very long way from the whole plane or the whole car being certified virtually.”

Boris Marovic, Product Marketing Manager for Mentor Graphics Mechanical Analysis Division, agrees. “Simulations can be very accurate, but someone who relies completely on pure simulation might experience serious problems from effects of the real physics that were not considered in the simulation.”

“While simulation technologies have significantly matured over the last 20 years, it’s hard to imagine a time when performance validation through physical testing will not be necessary,” stated Kidder. “When practical, there will always be the need for a physical test, but the product is going to pass and be on target due to more and more intelligent use of simulation.”

Cost and time considerations are prompting the use of more simulation earlier in the design and development process. So while simulations are usually more cost-effective and timely than physical tests, and can deliver insights not possible with physical tests, a combination of the two is optimal, according to Rusk. “We see a trend towards hybrid approaches that bring simulation and test together to validate product performance. There will always be a role for physical testing in areas where confidence in simulation is low, for validation, or to improve the fidelity of simulations,” he said.

“Besides the cost and time implications of physical prototyping and testing, one of the biggest downsides is that testing often only communicates if a design will pass or fail, not how to make it perform better,” stated Seth Hindman, Senior Manager of Simulation at Autodesk.

Berry agrees that physical testing leaves too many questions unanswered. “Testing can tell you ‘what,’ but it cannot tell you ‘why.’ Simulation can tell you both. Simulation will continue to expand and testing will continue to reduce in scope. But it is not likely that in all cases, testing will be eliminated, nor should we want that.”

The complete elimination of physical testing is hard to predict, but with the increase in digital prototyping, testing is certainly being reduced, although it will continue to play a part in mission-critical areas, particularly those in which lives are at risk. According to Littmarck, there are two sides of the coin. “One side is modeling and simulation, and the other is physical experiments. One can’t live without the other.” Some model parameters, he explained, cannot be acquired by simulation. “For example, material properties like elasticity, conductivity, and retention rates can only be determined through experimental testing.”

High-Speed Hardware and Other Trends

Among the trends our executives cited for 2014 is the role of high-performance computing, including multicore and parallel computing. According to David Vaughn, Vice President of Marketing for CD-adapco, the advances in software and hardware are proceeding on the same trajectory. “Perhaps I have a skewed perspective, but I believe that parallel computing was invented for engineering simulation. So in terms of technology development, simulation software and parallel hardware development are lockstep,” he said. “What is far more interesting today is the manner in which commercial software vendors apply their licensing model to parallel computing.”

Kidder agrees that consumers continue to be dissatisfied with most software licensing models, as costs have not mirrored the rapid commoditization of hardware. “With the advent of cluster and now cloud technical computing, software and licensing models must adapt to these high-performance computing platforms to fully capitalize on their value to drive product innovation,” he said.

“Cluster and high-performance computing allows most providers the ability to provide parallel processing by tasking local machines or clusters with simulation jobs that free up the computational demand on the user’s desktop,” explained Hindman. “The most conservative, traditional, and expensive approach is to enable multicore processing, but to base licensing on each core engaged.”

Klimpke believes that to be competitive, software vendors have to include multicore computing. “Having internal clusters within an organization is certainly a more complicated proposition. As the tools to manage these environments become more effective, clusters will become more commonplace.”

Simulation has always required significant computing capacity. What has changed is the amount of data the software can produce using high-performance hardware. “Certainly, this is not new, as simulation takes many CPU cycles,” said Gallello. “The trick in simulation is not to take advantage of 10,000 or 20,000 core clusters, but rather how to make the human more efficient at interpreting the results.” Yesterday, he said, “The computer was the bottleneck. Today, it is the engineer who has to interpret all the data.”

Not surprisingly, for 2014, our executives echoed their prediction for this year that simulation will see higher adoption rates and increased usage in more industries. “CAE will become the center of the conversation in manufacturing organizations,” predicted Kidder. “We’ll be seeing it more and more at the C-level, being discussed as an innovation tool, with performance optimization as a differentiator.”

Design optimization will continue to gain importance. Industry has struggled to work the technology into its mainstream engineering processes, but according to Vaughn, “2014 is the year that design optimization grows a beard. It is the year that the engineering community will realize that design optimization is all grown up and ready to go to work.”

Another trend expected to continue through 2014 is an increase in simulation capabilities offered within traditional CAD programs. “The natural path will be for CAD companies to extend their capacities for analysis and simulation,” said Klimpke. “Of course, there will be a limit to this, as many simulations are very specific.”

Christenson agrees that even as CAD companies extend simulation functionality, one challenge still exists for users who need to do very specific tasks. “There is often the need to work with multiple CAD sources, and even multiple CAD applications for the geometry of the design. Simulation is becoming such a critical and strategic part of the design process that companies simply cannot afford the inefficiencies and risk of having their simulation process change based on the type of CAD model they are working with. They need to have a consistent CAD-neutral approach to performing best-in-class simulation, regardless of the other design process variables.”

Marovic sees CAD companies continuing to expand their simulation offerings, especially in industries such as automotive, and predicts “a strong growth in upfront and CAD-embedded simulation as well as more CAD companies acquiring software companies to extend their offerings in that area.”

This overlap between CAD and simulation will continue to affect both areas of the software world. “As the sophistication of customer challenges has increased, our industry has invested in additional areas of the overall product development solution,” explained Hindman. “One of the most significant changes over the years is that simulation is evolving from being a stage in the process to a central aspect of what we consider the design process.”

Added Gallello, “If you consider the big three of CAD, PLM, and CAE, CAE is really the last frontier. Making it easier to put balloons on a drawing, getting a fillet just right on a model, or hanging one more artifact off of the design bill of materials is not going to impact a manufacturer as much as better predictability of product and process performance with lower cost of physical testing.”


NASA Tech Briefs Magazine

This article first appeared in the December 2013 issue of NASA Tech Briefs Magazine.
