In our annual poll of executives at leading analysis and simulation software companies, we asked about the economic situation’s effect on the market, the pros and cons of virtual prototyping, and how software vendors are helping customers do more with less. Here’s what they had to say about market trends for 2011, and maintaining competitive advantages in a challenging business market.

Doing More With Less

Last year, the executives we polled discussed the challenging economic environment and how their customers were being forced to do more with less, including a reduced staff of analysts and qualified simulation experts. But today, while the economy is still struggling, companies are using more simulation and it is being performed by both experts and designers/engineers.

“Engineers have historically been expert in a specific area of simulation. They would run one case at a time with time available to double-check their work,” said Mike Peery, president and CEO of Tecplot. “Today, simulations are being run more and more by generalists – the engineers who are a direct part of the design process – and they are tasked with doing many things.” As a result, added Peery, the easier the software is to use, the better. “We worked to make our tool easy to learn and apply. After all, specialists appreciate fast learning curves as much as any professional with a heavy workload.”

Jim Spann, vice president of marketing for Blue Ridge Numerics, agrees that experts and “generalists” are collaborating more to produce more and faster simulations. “Experienced analysts are taking advantage of new collaboration and customization tools that allow them to share knowledge, participate in design reviews, and build templates for use by design engineers. This removes a lot of the learning curve for the designer and helps ensure every simulation follows pre-defined guidelines that will yield consistent results.”

This broader community of simulation users also has benefits for managers. “The economic situation has focused organizations on several needs, including not only efficiency, but on their competitive situation,” said Dale Berry, director of technical marketing for Dassault Systèmes SIMULIA. “What we see is not that designers are taking up analysis tasks instead of dedicated analysts. Rather, designers are now being asked to leverage and re-use methods previously developed and validated by expert groups.” As a result, said Berry, “The work is not shifting to designers from analysts, but is expanding as designers discover the value of the experts’ methods.”

“Innovation is enabled by using simulation up-front in conceptual analysis and design, as well as by broadening the user community beyond a core group of specialized analysts,” added Dipankar Choudhury, vice president of corporate product strategy and planning for ANSYS. “As the simulation community expands, there is a continued need for experts who understand deep, comprehensive physics concepts and how to best deploy the simulation tools to a broader base. At the same time,” Choudhury explained, “not every user will require the same level of technology because he or she may not have the need to model complex physics.”

So while the simulation user base is expanding and evolving, so is the software. According to Bruce Klimpke, technical director of Integrated Engineering Software, “There is a trend by senior R&D managers to acquire software that the design engineer can learn quickly. This requires the results of the simulation to match reality with very little room for error. This is especially true in organizations where simulation results are followed by production.” While the current economy is speeding up this trend, said Klimpke, it is the natural evolution of simulation software.

This evolution means there are more diverse users of simulation and analysis software, which should, in turn, help both the software vendors and their customers. “The primary effect of the current economy is on the amount of simulation being done,” said David L. Vaughn, vice president of worldwide marketing for CD-adapco. “It seems everyone has realized that simulation is the key to reducing product development costs by reducing or eliminating physical prototyping and testing.”

“Over the past several years, managers looking to streamline workflows and engineers interested in quicker turnaround for results have moved from dedicated environments for simulation and analysis to their desktop,” said Jon Friedman, aerospace and defense industry manager at The MathWorks. “Engineers visualize and analyze data, develop algorithms, and share results, allowing the industry to adapt to changes in both the economy and market needs.”

Dave Weinberg, president and CEO of NEi Software, continues to see new users of finite element analysis software from a diverse group, including those involved in the earliest stages of product design. “We offer several ways to support engineers who are new to using simulation or analysis tools. Although engineers may have different requirements to ramp up on the solution, they generally seek training and some degree of mentoring.”

Bill Chown, product line director of the system modeling and analysis division at Mentor Graphics, agrees that multi-disciplinary engineering is increasing. “Where organizations traditionally had dedicated simulation departments, today’s designs can’t wait for such departments to turn around simulation predictions. The need for concurrent-based simulation by staff engineers is critical.”

Added Svante Littmarck, president and CEO of COMSOL, “I don’t see design engineers replacing dedicated analysts in our market. If the current economy has changed anything, I’d say analysis is more popular than ever before.”

Pros and Cons of Virtual Prototyping

As both simulation software and the computers it runs on become more sophisticated, virtual prototyping also becomes more widespread. Explained Choudhury, “As engineering simulation progresses, we’re seeing the definition of virtual prototyping take on a new and expanded meaning. Where once it meant using engineering simulation to study the behavior of an individual component, it now means testing the entire system—not just a piece of it—in the environment in which it will operate to reach an optimal design.” Such an approach, he added, involves more than engineering software — the computer hardware must advance at the same pace.

Chown explained that a virtual prototyping infrastructure, in which models from different domains can be integrated at each stage of the design lifecycle, allows system integration issues to be identified and addressed earlier in the process. “During the verification phase of the design, simulation can again be employed to verify intended system operation. It is a common mistake to completely design a system and then attempt to use simulation to verify whether or not it will work correctly,” Chown said. “Simulation should be considered an integral part of the entire design phase and continue well into the manufacturing phase.”

An advantage to virtual prototyping is that by identifying repetitive simulation methods, organizations can realize substantial return on their simulation software investment. “Unlike machine tools, simulation methods never wear out. So once developed, the ROI continues to grow with time,” said Berry. “That’s the true value of virtual prototyping.”

Added Klimpke, “If you are designing lower-cost products, it may well be easier to just build and test. But you still lose a lot of design insight that only simulation can give you. You can’t see temperature distribution in a part until you simulate it.”
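Klimpke’s point about temperature distribution can be illustrated with a minimal sketch (not from the article, and far simpler than commercial tools): a 1D steady-state heat conduction model of a rod with fixed-temperature ends, solved by Jacobi iteration on the discrete Laplace equation. The function name `steady_state_temps` and all parameters are hypothetical, chosen only for this illustration.

```python
def steady_state_temps(t_left, t_right, n=11, iters=20000):
    """Return temperatures at n evenly spaced points along a rod
    whose ends are held at t_left and t_right, at steady state."""
    # Initial guess: boundaries fixed, interior at zero.
    temps = [t_left] + [0.0] * (n - 2) + [t_right]
    for _ in range(iters):
        new = temps[:]
        # Discrete Laplace equation: each interior point relaxes
        # toward the average of its neighbors.
        for i in range(1, n - 1):
            new[i] = 0.5 * (temps[i - 1] + temps[i + 1])
        temps = new
    return temps

profile = steady_state_temps(100.0, 20.0)
print([round(t, 1) for t in profile])
```

For this simple case the converged answer is just a linear temperature ramp between the two ends, but even a toy model like this exposes the interior temperature field that, as Klimpke notes, no build-and-test cycle can show directly.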

But the benefits of virtual prototyping are tempered by the need for real-world physical prototypes. “While the value of complete virtual prototyping is dependent on the situation, we, as an industry, are approaching a level of capability that will allow a consensus in favor of complete virtual prototyping,” according to Vaughn. “I do want to be cautious in the use of this terminology, because at the same time, I also believe that only in very few instances can physical testing be eliminated.”

“It depends on what you’re doing. Any product that’s either very expensive to manufacture or to operate, and is sensitive to small variations in design, is a great candidate for virtual prototyping,” according to Peery. “Another limitation is accuracy. There are cases where the physics are not well modeled by the equations solved in the simulation.”

“Virtual prototyping can reduce the risk of field product failure and allow testing of multiple scenarios, reducing the number of prototypes needed to validate designs,” stated Weinberg. However, he added, “Virtual prototyping cannot negate the need for a physical prototype that can reveal design flaws that were not observed in the virtual environment.”

Friedman agrees that organizations must carefully weigh risk and benefit when looking at complete virtual prototyping. “There is no single, hard rule that can be applied. Engineers must always ask themselves what testing can move up front to the virtual world, and where the diminishing returns for the effort are.” At some point, Friedman cautioned, “a physical prototype is needed to perform final system-level testing, because no one wants to fly in a plane that hasn’t been tested as a physical prototype.”

Trends for 2011

While virtual prototyping continues to be a trend in the analysis and simulation software arena, other important trends also will become prominent in 2011, according to our panel of experts. One of those trends is the growing interest in the use of simulation data management (SDM) or product lifecycle management (PLM) by analysts and general users. This is causing vendors to create new capabilities for SDM and PLM.

For example, with the latest version of its multiphysics software, COMSOL has included database features that let the analyst document and track the creation of, and changes to, every piece of the model and simulation, according to Littmarck. “For documentation, production, future enhancements, and maintenance purposes, PLM-type features are necessary in a package for modeling and simulation. Such features shorten time to market, increase quality control, and aid investigation into critical failures.”

Peery explained that Tecplot also has added workflow integration and management features to their products. “We’re very aware of how important this is. We’ve created layout files so work can be easily saved, restored, and archived in their systems, and decisions can be traced back to the original idea.” But PLM systems are still not easy to navigate, added Peery. “There is a very heavy overhead associated with getting data into existing PLM systems. So, there is a lot of work we can do to make it easy and compatible.”

Added Vaughn, “Our observation is that most organizations are not sure yet how, why, or when to include simulation data in their PLM system. There is also growing interest in and development of simulation lifecycle management (SLM). Our approach has been to listen carefully to our customers in terms of their future requirements and to position our products to be ready if and when the customers require a PLM or SLM interface.”

The most important trend for next year is an increase in the use of simulation, and in the number and types of engineers who will be using the software. “Software and embedded systems will continue to grow in impact, importance, complexity, and power through 2011,” stated Friedman.

Littmarck agreed that “analysis tools will become increasingly more powerful, easier to use, and more connected to – and even embedded in – other software systems used by complementary groups of specialists or managers.”

The demand for simulation will also result in software being used in new applications. “Simulation software will continue to find new and specialized applications that will expand the demand from both traditional users and new users in industries such as off-shore, alternative energy, and medical,” predicted Weinberg. “The demand will be supported by the increases in advanced analysis product offerings at every price point.”

Ensuring that both analysts and generalists have the right tool for the job is an important consideration for vendors. Said Vaughn, “I see a trend that engineers and their management are realizing that, in fact, they need a toolbox, and that one tool does not fit all jobs. My prediction is that what remains of the 1990s trend to reduce software vendors will turn back in favor of encouraging competition and making sure that engineers have the right tool for each job.”

Added Klimpke, “Given the reluctance of business to increase hiring, some designers will have to get involved in parts of designs that were previously not in their realm.” As a result, said Klimpke, the tools they use must be easy to learn and supported well by the vendor.

High-performance computing enhancements also will impact simulation and analysis next year. “As the industry makes strides in making high-performance computing more prevalent and more affordable, organizations will want to leverage engineering simulation tools to find not just a good design, but the optimal design,” explained Choudhury.

“Social networking for simulation, simulation on the cloud, and high-performance computing are all trends – evolving or developing – that vendors will increasingly need to pay attention to,” said Berry.


NASA Tech Briefs Magazine

This article first appeared in the November 2010 issue of NASA Tech Briefs Magazine.
