In celebration of the 30th Anniversary of NASA Tech Briefs, our features in 2006 highlight a different technology category each month, tracing the past 30 years of the technology, and continuing with a glimpse into the future of where the technology is headed. Along the way, we include insights from industry leaders on the past, present, and future of each technology. This month, we take a look at the past 30 years of Test & Measurement.

As is the case with many of the other industries we’re covering this year in our anniversary features, test and measurement was profoundly affected by the introduction of the personal computer (PC) in the early 1980s. Prior to that, test equipment consisted of large, bulky meters and boxes, as well as mammoth standalone automated test systems cobbled together with a variety of analog instruments and cables.

Keithley Instruments introduced the Model 148 nanovoltmeter (left) in the 1960s; the Model 2182A nanovoltmeter followed in 2004.

The evolution of the PC shares a common path with that of instrument controllers, and according to Dr. James Truchard, president, CEO, and cofounder of National Instruments, the PC proved to be an ideal platform for instrumentation control. With the PC, said Truchard, “There was a tremendous amount of labor savings, we could be more accurate in moving the data from the instrument to the PC, we could compare theoretical results with actual measurements, and we could create archives of the results for later retrieval.”

The additional functionality that PCs provided enabled networked measurements. “This had the effect of shifting the design emphasis away from the front panel only, to a product where the PC interface was also important, and how the instrument cooperated with a PC,” explained Linda Rae, executive vice president and chief operating officer of Keithley Instruments. “Users could now easily move test data back and forth between standalone instruments or between instruments in a rack, something that wasn’t possible before the rise of the PC and improved networking capabilities,” Rae added.

In 1976, a new standard interface for instrumentation emerged. Called GPIB (General Purpose Interface Bus), the technology enabled instrument controllers to communicate with test equipment. “Gradually,” said John Stratton, aerospace/defense program manager for Agilent Technologies, “PCs equipped with plug-in cards for both Ethernet and GPIB began to replace specialty instrument controllers.” Up until that point, he added, “automated test systems were isolated islands of technology.”
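To give a concrete sense of what PC-based GPIB instrument control looks like in practice today, here is a minimal sketch using the open-source PyVISA library; the GPIB address and the SCPI commands shown are illustrative assumptions, not details drawn from any system described in this article.

```python
# Minimal sketch: a PC acting as a GPIB instrument controller via PyVISA.
# The GPIB address (22) and the SCPI commands are illustrative assumptions.
import pyvisa

rm = pyvisa.ResourceManager()                  # enumerate installed GPIB/USB/LAN interfaces
dmm = rm.open_resource("GPIB0::22::INSTR")     # hypothetical multimeter at GPIB address 22

print(dmm.query("*IDN?"))                      # standard IEEE-488.2 identification query
dmm.write("CONF:VOLT:DC 10")                   # configure a 10 V DC voltage measurement (SCPI)
print(dmm.query("READ?"))                      # trigger the measurement and read the result

dmm.close()
rm.close()
```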

Two other trends in the evolution of the PC also affected measurement instrumentation: increased computing power and lower prices. According to Rae, “The availability of powerful microprocessors, memory chips, and digital signal processors (DSPs), combined with computing power, enabled customers to get more capability at a lower price. These changes continue to be felt down to the present day.”

“In a nutshell,” said Wolfgang Schmittseifer, CEO of Rohde & Schwarz, “the personal computer completely changed test equipment of all kinds. Not only the way test equipment is designed and manufactured, but the way users interact with it, and the number of measurements it can perform have been vastly improved thanks to the PC,” he explained. “This technology enables cost-effective, compact, powerful instruments to be created that simply could not be realized otherwise.”

LabVIEW enables a range of users with minimal or no programming skills to quickly configure measurement icons and virtual test systems.

Virtual Instruments

Over the past 30 years, the science of measurement has been transformed by the migration from analog to digital instrumentation. According to Martyn Etherington, vice president of worldwide marketing for Tektronix, “We have seen countless electronic applications migrate from analog to digital implementations — including voice, data, video, RF communications, and oscilloscopes. We have seen digital signaling speeds migrate from slow speeds where signal paths behaved as simple lumped circuits, to faster speeds where the signal paths are transmission lines, to blinding speeds where signal paths are loss-constrained electrical and optical transmission lines.”

“Clearly, microprocessors have become ubiquitous inside measuring instruments, moving instruments from purely analog to primarily digital in nature,” said Brad Byrum, general manager of Yokogawa Corporation of America. That, in turn, he said, has “improved measurement stability, analytical processing power, data presentation, as well as data communication interfaces.”

Said Loofie Gutterman, CEO of Geotest, “The biggest change has probably been the evolution of card modular instrumentation. Thirty years ago, almost all instrumentation was in a box. The desire for more compact instrumentation, coupled with an increasing interest to tie automation to test and measurement functionality, helped fuel the interest in modular instrumentation.”

As new buses, or interfaces, became available for PC-based measurement and control, modular card — or virtual — instruments became more popular. According to Gutterman, “Early-generation products in the late 1980s took advantage of the PC’s ISA bus, with later generation products transitioning to PCI. Today, over 80% of our product revenue can be attributed to products that are based on PC standards — specifically PCI and PXI instrumentation, software, and systems.”

Today, PCs, Web access, USB connectivity, and local area networks (LANs) are commonly used in test applications. “Rather than invent everything ourselves, as we often did in the 1980s, test and measurement manufacturers have learned to leverage the price and volume advantages of computer components, bus structures, and display technologies,” said Stratton. “This shift in thinking lets us focus on what we do best — the measurement science of signal acquisition, generation, and calibration.”

“The PC has clearly established its role as the most important computing element in our society,” stated Truchard. “This power is available for building virtual instrumentation. Along with that, very high-performance analog-to-digital and digital-to-analog converter technologies used in communications applications have made it possible to build very powerful measurement capabilities to work with PC technology, and this technology now exceeds that of traditional instrumentation.”
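As a rough illustration of the virtual-instrument idea Truchard describes, in which an analog-to-digital converter on a plug-in card supplies raw samples and ordinary PC software does the rest, the sketch below acquires a block of voltage readings with the nidaqmx Python package and analyzes them in software. The device name, sample rate, and block size are assumptions made for illustration.

```python
# Sketch of a software-defined "virtual voltmeter": a PC-hosted ADC card
# acquires raw samples and plain PC software performs the analysis.
# "Dev1/ai0", the 10 kS/s rate, and the 1000-sample block are assumptions.
import statistics
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # analog input channel 0
    task.timing.cfg_samp_clk_timing(
        rate=10_000,                                       # sample clock: 10 kS/s
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=1000,
    )
    samples = task.read(number_of_samples_per_channel=1000)

    # Analysis that once required dedicated hardware is now a few lines of software.
    print("mean (V):   ", statistics.mean(samples))
    print("std dev (V):", statistics.stdev(samples))
```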

“The Software is the Instrument”

In the early 1980s, Jeff Kodosky, cofounder of National Instruments, was inspired by the spreadsheet, which was created to mirror the way financial analysts worked. He thought that the same type of application could improve the way scientists and engineers performed tests and obtained measurements. In 1986, Kodosky and NI introduced LabVIEW, a graphical development environment for creating virtual test systems.

LabVIEW, according to Truchard, combined the concepts of drawing block diagrams and a user interface, or front panel. “You could quickly build a front panel and program it with block diagrams, so we were able to solve the problem of programming instrumentation in a very unique way,” he said. “We used the mantra, ‘To do for test and measurement what the spreadsheet did for financial analysts.’”

Introduced for the Macintosh platform, LabVIEW initially received few kudos from the test and measurement community. Most scientists were using text-based programming languages such as BASIC.

But LabVIEW was successful. “In fact,” said Truchard, “We coined the phrase, ‘The software is the instrument.’”

Other instrument vendors also introduced their own test and programming software, including ATEasy from Geotest, an application development framework for functional test, automated test equipment, data acquisition, process control, and instrumentation systems. It lets users create everything from instrument drivers to complex test programs.

Software also has played an important role in how test systems are calibrated. According to Schmittseifer, “The ability to calibrate instruments with internal software rather than external hardware has made this task far easier for the user. The user interface has changed from scales and meters to a multifunction, menu-driven display with soft keys, and the measurements themselves have evolved from simple physical tests to complex protocol and signaling measurements.”

Software — including Web-based software — will continue to be a key technology in the future of test and measurement. Said Schmittseifer, “It took years for Ethernet and the World Wide Web to transform the way we work, and LAN-based connectivity, together with Web-enabled instruments, is transforming the future of test and measurement.” Network connectivity enables test system users and developers to set up, configure, and debug systems much more quickly, he explained.
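For a sense of the LAN-based connectivity Schmittseifer points to, the following sketch again uses PyVISA, this time addressing an instrument by an assumed IP address over Ethernet so it can be configured and debugged remotely; the address and commands are illustrative assumptions.

```python
# Sketch of remote setup and debugging of a LAN-connected (e.g., LXI) instrument.
# The IP address and SCPI commands are illustrative assumptions.
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP0::192.168.1.50::INSTR")   # hypothetical networked instrument

print(inst.query("*IDN?"))        # confirm which instrument answered on the network
inst.write("*RST")                # put the instrument into a known state before configuring it
print(inst.query("SYST:ERR?"))    # remote debugging aid: read back the instrument's error queue

inst.close()
rm.close()
```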

Added Byrum, “As users gain confidence and comfort using test instruments connected to the Internet, a new generation of distributed testing and/or engineering portals will come into existence.”

And as computing power continues to increase, so will the capabilities of PC-based instruments and the software that controls them. “I believe the PC will still be a key technology driver,” predicted Truchard. For the next 30 years, he said, “I see a convergence of design and test. The increased complexity of devices and the pace of change have required design engineers to seek higher levels of abstraction in creating test systems.”