President and CEO
With the birth some 50 years ago of computers based on integrated circuits and semiconductors, engineers had a tool that could potentially produce numerical solutions to differential equations based on the laws of science: equations that realistically modeled the physics at hand, not just a simplified version valid only in a perfectly ideal case. At that point, numerical analysis became of great practical importance.
In order to solve large algebraic systems of linear equations and eigenvalue problems with finite precision, using floating-point arithmetic, rounding errors and cancellation errors had to be managed and kept at reasonable levels. Numerical analysis and numerical methods were needed. Solving differential equations numerically required replacing the differential equation with a corresponding difference equation whose solution converged to the solution of the differential equation; in effect, "simulating" a differential equation that can't be solved exactly with a difference equation that can be solved to a specified accuracy.
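As a minimal illustration of the idea above (a sketch, not anything from this article), the forward Euler method replaces the differential equation y'(t) = -y(t) with the difference equation y_{k+1} = y_k + h(-y_k), whose solution approaches the exact one as the step size h shrinks; the function names and step counts here are illustrative:

```python
import math

def forward_euler(f, y0, t_end, n_steps):
    """Advance the difference equation y_{k+1} = y_k + h*f(t_k, y_k),
    a discrete stand-in for the differential equation y' = f(t, y)."""
    h = t_end / n_steps
    t, y = 0.0, y0
    for _ in range(n_steps):
        y = y + h * f(t, y)
        t += h
    return y

# Model problem y' = -y, y(0) = 1, with exact solution y(t) = exp(-t).
f = lambda t, y: -y

# Halving the step size roughly halves the error: the difference
# equation's solution converges (first order) to the exact solution.
for n in (10, 20, 40):
    err = abs(forward_euler(f, 1.0, 1.0, n) - math.exp(-1.0))
    print(n, err)
```

Doubling the number of steps roughly halves the error here, which is the sense in which the difference equation can be "solved to a specified accuracy": one keeps refining the step until the error estimate is small enough.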
This started a rapid development of numerical methods and of computer programs implementing such solvers. Initially, one needed to implement the physics directly in the program code, and deep knowledge of all the bits and pieces (the physics, the equations, the solvers, and the software code) was needed to produce useful results. That is now history. Today's ultrafast computers, with graphical output and with software user interfaces that give us access to efficient, well-tested solvers, let engineers and scientists focus on the physics and skip the programming and numerical analysis. Because of this, modeling and simulation has become one of the most important tools for product design and development, and it has resulted in almost all of the technology we see around us today.
Despite the rapid development in computer hardware, numerical analysis, and software, we still cannot solve all the differential equations we want to solve. Fluid flow problems are a class that is particularly demanding in terms of computational power, especially in 3D and time-dependent simulations. Moore's law, which predicts that computing capacity doubles every 18 months, will not hold up much longer; the speed of light sets the limit. Fortunately, over the past 50 years, applied mathematicians have been able to double the speed of calculations at the same rate by devising better methods. Combined, the two have increased computation capacity by a mind-boggling factor of 10¹⁶ over the past 50 years!
At this point, software developers are working to make their software packages easier to use and, more importantly, to make it possible for physics experts and engineers to package their simulation models so that non-experts can use them for their own purposes, whether it's a product developer optimizing some aspect of a design or a consumer making a purchase decision, such as how much noise reduction material to invest in to protect a room from unwanted noise. We can look forward to an exciting future in which analysis and simulation will be for everyone!
Read more Executive Perspectives.