As I approach retirement after 40 years as CEO of National Instruments (NI), I am reminded of the great progress and innovations the test and measurement industry has witnessed over that time. We have gone from an industry driven by vacuum tube technology in the era of General Radio, to a time where the transistor ruled with Hewlett-Packard, to today where software truly is the instrument — a transition that NI helped shepherd.
Moore’s Law has taken us for a wild, fast ride to say the least, and just when you think it has run its course, process innovations extend into new dimensions (literally) and push performance even further. Allow me to reminisce on what the past 40 years have taught me, and where I see this market heading as I shift into the next phase of my career.
When Jeff Kodosky, Bill Nowlin, and I started NI in 1976, we saw tremendous room for innovation in how engineers and scientists interacted with and built test and measurement equipment. We founded the company on the premise that there had to be a better way to meet the test and measurement needs that we, as scientists and engineers, could not satisfy with off-the-shelf equipment — and that meeting them did not mean starting from scratch.
The general-purpose interface bus (GPIB, IEEE-488) was our gateway. Our vision, as stated in 1983, was to “do for test and measurement what the spreadsheet did for financial analysis.” Stated today, the sentence loses some of its power, but think about the early 1980s. At the time, the tools for financial analysis were ‘locked-up’ and too expensive for anyone without a big budget. The early incarnations of spreadsheets turned this situation on its head, and that is exactly what we wanted to do. We wanted to make it so that any scientist or engineer had the same tools or platform at their disposal as the R&D teams of the leading technology companies. It was a radically empowering view at the time, and in many ways, it still is.
Software is the Instrument
While others might have seen GPIB as a hardware play, we recognized it for what it enabled in terms of software. As the PC industry evolved (as well as Apple’s Mac, which we had a special affinity for, given its graphical user interface), the GPIB cable made it easy to analyze and present data in a customized way. You were no longer confined to the front panel of an instrument, or to a pencil and notepad for data acquisition. The opportunity to innovate then shifted to the software world, where programming languages needed instrument drivers for the connected boxes. Our strategy of writing and supporting those drivers offered a critical service, and one that continues today.
But that world still left scientists and engineers with the burden of using tools designed for computer science to perform engineering, test, and measurement tasks. Our answer was twofold: LabWindows™/CVI for engineering-specific tools in ANSI C programming, and LabVIEW, a graphical programming paradigm that took the way we think about solving a problem (in flowcharts and pictures) and turned it into compiled code. The story was simple — Acquire, Analyze, and Present. Do it in software tools designed for your use case that were easy to learn, yet extremely powerful. We coined the phrase, “The software is the instrument,” to describe this approach, and seeing scientists and engineers save valuable time and get to results faster was all the market validation we ever needed.
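The Acquire, Analyze, Present flow described above can be sketched in a few lines of textual code. The sketch below is illustrative only — the function names are hypothetical and a simulated signal stands in for a real instrument; it is not an NI API.

```python
import math
import random

def acquire(num_samples, sample_rate_hz):
    """Simulate acquiring a noisy 50 Hz sine wave from an instrument."""
    return [math.sin(2 * math.pi * 50 * n / sample_rate_hz)
            + random.gauss(0, 0.1) for n in range(num_samples)]

def analyze(samples):
    """Compute a simple RMS measurement from the acquired samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def present(rms):
    """Present the result; a real application might plot or log it."""
    print(f"RMS amplitude: {rms:.3f}")

samples = acquire(num_samples=1000, sample_rate_hz=10_000)
present(analyze(samples))
```

In a graphical environment like LabVIEW, each of these stages would be a node on the diagram, wired together by the flow of data.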
Evolving with Moore’s Law
People talk about Moore’s Law like it’s about hardware, but computational hardware only exists to run software (and maybe firmware). Once we made test and measurement all about software, we had effectively enlisted Intel, Xilinx, and many other billion-dollar companies into our R&D staff. With customers and partners building proficiency with our software tools, we just had to follow the chips to deliver increasing value to test and embedded systems. This has happened, so far, along two key dimensions: multicore processors and field-programmable gate arrays (FPGAs).
Because LabVIEW is graphical (and therefore not inherently sequential), it was tailor-made for parallel processing. LabVIEW users were among the first programmers to migrate easily from single-core processors to multiple threads and multiple cores, and to see almost instant speed improvements. The pace of change in modern electronics means you can’t waste time doing by hand what a tool can easily do for you.
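The dataflow principle at work here is that branches of a diagram with no data dependency between them are free to run concurrently. A rough textual analogue uses a thread pool to dispatch independent branches; this is an illustration of the principle under stated assumptions, not how LabVIEW’s compiler actually schedules work.

```python
from concurrent.futures import ThreadPoolExecutor

def measure_channel(channel_id):
    # Stand-in for one independent acquisition/analysis branch
    # (a hypothetical per-channel computation).
    return sum(i * i for i in range(100_000)) % (channel_id + 7)

# In a dataflow diagram these branches share no wires, so a scheduler
# is free to execute them in parallel; here a thread pool plays that role.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(measure_channel, range(4)))
print(results)
```

The programmer never writes explicit thread management; independence in the diagram is what exposes the parallelism.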
And that goes to an entirely different level with FPGAs. Some problems are just better solved in the highly parallel, deterministic world of silicon, but the toolchains and programming constructs were inaccessible to most mechanical engineers or medical researchers who were experts in their measurements and the problems to be solved (not digital design). It became a quest to unlock the power of FPGAs for LabVIEW programmers, and we’ve done that.
When you think about software as uniquely as we have, it’s hard not to think differently than everyone else about hardware as well. Modular, PC-based plug-in boards were a natural byproduct. Make the hardware as lightweight and cost-effective as possible (no dedicated screens, power supplies, fixed buttons/knobs, etc.) and put the focus on ADCs, DACs, signal conditioning, and data movement. I have yet to see a T&M vendor design a user interface for a specific task that makes a customer more productive than the one that customer would design. Even the best front panels on box instruments leave you staring at the clutter of unused buttons and menu structures.
We moved as many technical problems into the FPGA as we could, and Moore’s Law (along with Xilinx) delivered us a vehicle capable of handling the computation. We, in turn, hand the keys to that vehicle to our customers by allowing them to customize that FPGA.
We are seeing glimpses of the future everywhere we look. Look inside a modern factory — there are what we call “cyber physical systems” combining software-centric computing technology with electromechanical systems and human operators to improve safety, efficiency, and cost structures. The acquire, analyze, and present concept is still valid, but now we add “sense, compute, connect” as a parallel flow for Internet of Things (IoT) devices. Wireless technology in general is pervasive — we’ve been saying this a while, but if you aren’t an RF engineer today, you will be. And the more you connect things together, the more you’d be crazy not to take advantage of the data you can collect from billions of sensor nodes. For us, this is Big Analog Data, and it’s the richest set of data in the world.
But even as our capabilities grow more advanced and the scale of the problems we tackle grows vaster, the tools we use must become even easier to navigate. Just as machine language gave way to assembly and then to object-oriented languages, other paradigms (including graphical dataflow programming) are critical to offering the right level of abstraction.
No great innovation happens alone. All of the great platforms we use today are great because they have fostered an ecosystem. The rise of mobile devices and apps is possible only because of a healthy ecosystem built on developer-friendly platforms. Before long, team-based development, code sharing, and community support will no longer be novel or best-in-class — they will be expected.
It would be impossible to have witnessed what I’ve witnessed in our industry for the past 40 years and not be excited about where all these technologies and trends are leading us. My advice to any new engineer is simple: develop a vision for the future, and pursue it with intensity. And, at the end of the day, don’t be afraid to have fun.
For more information on National Instruments, visit http://info.hotims.com/61068-424.