In our annual poll of executives at leading analysis and simulation software vendors, we found that high-performance computing is changing the way software is written and sold, and that in a competitive economic market, simulation is not just an advantage, but a necessity.

Cost and time considerations are prompting companies to use more simulation earlier in the design and development process. While simulations are usually more cost-effective and timely than physical tests, and can deliver insights not possible with physical tests, a combination of the two is optimal, according to Rusk. “We see a trend towards hybrid approaches that bring simulation and test together to validate product performance. There will always be a role for physical testing in areas where confidence in simulation is low, for validation, or to improve the fidelity of simulations,” he said.

“Besides the cost and time implications of physical prototyping and testing, one of the biggest downsides is that testing often only communicates if a design will pass or fail, not how to make it perform better,” stated Seth Hindman, Senior Manager of Simulation at Autodesk.

Berry agrees that physical testing leaves too many questions unanswered. “Testing can tell you ‘what,’ but it cannot tell you ‘why.’ Simulation can tell you both. Simulation will continue to expand and testing will continue to reduce in scope. But it is not likely that in all cases, testing will be eliminated, nor should we want that.”

The complete elimination of physical testing is hard to predict, but with the rise of digital prototyping, testing is certainly being reduced in scope, though it will continue to play a part in mission-critical areas, particularly those in which lives are at risk. According to Littmarck, there are two sides to the coin. “One side is modeling and simulation, and the other is physical experiments. One can’t live without the other.” Some model parameters, he explained, cannot be acquired by simulation. “For example, material properties like elasticity, conductivity, and retention rates can only be determined through experimental testing.”

High-Speed Hardware and Other Trends

Among the trends our executives cited for 2014 is the role of high-performance computing, including multicore and parallel computing. According to David Vaughn, Vice President of Marketing for CD-adapco, the advances in software and hardware are proceeding on the same trajectory. “Perhaps I have a skewed perspective, but I believe that parallel computing was invented for engineering simulation. So in terms of technology development, simulation software and parallel hardware development are lockstep,” he said. “What is far more interesting today is the manner in which commercial software vendors apply their licensing model to parallel computing.”

Kidder agrees that consumers continue to be dissatisfied with most software licensing models, as costs have not mirrored the rapid commoditization of hardware. “With the advent of cluster and now cloud technical computing, software and licensing models must adapt to these high-performance computing platforms to fully capitalize on their value to drive product innovation,” he said.

“Cluster and high-performance computing allows most providers the ability to provide parallel processing by tasking local machines or clusters with simulation jobs that free up the computational demand on the user’s desktop,” explained Hindman. “The most conservative, traditional, and expensive approach is to enable multicore processing, but to base licensing on each core engaged.”

Klimpke believes that to be competitive, software vendors have to include multicore computing. “Having internal clusters within an organization is certainly a more complicated proposition. As the tools to manage these environments become more effective, clusters will become more commonplace.”

Simulation has always required significant computing capacity. What has changed is the amount of data the software can produce using high-performance hardware. “Certainly, this is not new, as simulation takes many CPU cycles,” said Gallello. “The trick in simulation is not to take advantage of 10,000 or 20,000 core clusters, but rather how to make the human more efficient at interpreting the results.” Yesterday, he said, “The computer was the bottleneck. Today, it is the engineer who has to interpret all the data.”

Not surprisingly, for 2014, our executives echoed their prediction for this year that simulation will see higher adoption rates and increased usage in more industries. “CAE will become the center of the conversation in manufacturing organizations,” predicted Kidder. “We’ll be seeing it more and more at the C-level, being discussed as an innovation tool, with performance optimization as a differentiator.”

Design optimization will continue to gain importance. Industry has struggled to work the technology into its mainstream engineering processes, but according to Vaughn, “2014 is the year that design optimization grows a beard. It is the year that the engineering community will realize that design optimization is all grown up and ready to go to work.”

Another trend expected to continue through 2014 is an increase in simulation capabilities offered within traditional CAD programs. “The natural path will be for CAD companies to extend their capacities for analysis and simulation,” said Klimpke. “Of course, there will be a limit to this, as many simulations are very specific.”

Christenson agrees that even as CAD companies extend simulation functionality, one challenge still exists for users who need to do very specific tasks. “There is often the need to work with multiple CAD sources, and even multiple CAD applications for the geometry of the design. Simulation is becoming such a critical and strategic part of the design process that companies simply cannot afford the inefficiencies and risk of having their simulation process change based on the type of CAD model they are working with. They need to have a consistent CAD-neutral approach to performing best-in-class simulation, regardless of the other design process variables.”

Marovic sees CAD companies continuing to expand their simulation offerings, especially in industries such as automotive, and predicts “a strong growth in upfront and CAD-embedded simulation as well as more CAD companies acquiring software companies to extend their offerings in that area.”

This overlap between CAD and simulation will continue to affect both areas of the software world. “As the sophistication of customer challenges has increased, our industry has invested in additional areas of the overall product development solution,” explained Hindman. “One of the most significant changes over the years is that simulation is evolving from being a stage in the process to a central aspect of what we consider the design process.”

Added Gallello, “If you consider the big three of CAD, PLM, and CAE, CAE is really the last frontier. Making it easier to put balloons on a drawing, getting a fillet just right on a model, or hanging one more artifact off of the design bill of materials is not going to impact a manufacturer as much as better predictability of product and process performance with lower cost of physical testing.”
