Throughout 2006, we’ve been celebrating the 30th anniversary of NASA Tech Briefs with special feature articles highlighting the past 30 years of a different technology category each month. This month, we conclude our anniversary coverage by tracing 30 years of computer technology.
We’ve all heard about inventions that will “change the world,” but how many of them actually do? An invention called the personal computer did, in fact, change the world, and continues to shape the way we live, work, and play. Computers, in various forms, have been around for more than 30 years, but it wasn’t until the 1970s that “microcomputers” — desktop-sized computers — emerged and changed the computing industry.
Mainframes & Supercomputers
Before the microprocessor, computers were essentially large, room-sized systems used by labs, universities, and large companies. In many cases, users never interacted with the computers directly — they prepared their tasks on punch cards, and the computations were performed in batches, often taking hours or days to complete. These mainframe systems handled bulk data processing well, but they were not fast enough for time-critical calculations or for the most demanding scientific problems.
In 1972, Seymour Cray, a computer architect at Control Data Corp., left the company to start his own. He envisioned a computer faster and more powerful than any other system available — a “supercomputer.” In 1976, Cray Research delivered its first system, the Cray-1, to Los Alamos National Laboratory for a six-month trial. The original Cray system featured a famous horseshoe shape, which helped shorten the circuit paths, and offered vector processing, in which a single instruction operates on whole vectors of data — a fundamental processing method for supercomputing.
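The vector-processing idea can be illustrated with a short sketch. This is not Cray code — just an analogy in Python, with NumPy’s whole-array operations standing in for hardware vector registers: one operation is applied to entire vectors of operands instead of looping element by element.

```python
import numpy as np  # NumPy's array operations mimic vector hardware in software

def scalar_add(a, b):
    # Scalar style: one addition per loop iteration, as on a
    # conventional (non-vector) processor of the era.
    result = []
    for x, y in zip(a, b):
        result.append(x + y)
    return result

def vector_add(a, b):
    # Vector style: a single operation applied to whole arrays at once,
    # loosely analogous to one instruction operating over a
    # 64-element vector register on the Cray-1.
    return np.asarray(a) + np.asarray(b)

a = [1.0, 2.0, 3.0]
b = [10.0, 20.0, 30.0]
print(scalar_add(a, b))           # [11.0, 22.0, 33.0]
print(vector_add(a, b).tolist())  # [11.0, 22.0, 33.0]
```

Both functions compute the same result; the difference is that the vector form expresses the work as one bulk operation, which is what vector hardware exploits for speed.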
The Cray-1A was delivered a year later to the National Center for Atmospheric Research (NCAR) at a price of $8.8 million. The supercomputer weighed more than 5 tons, arrived in two refrigerated vans, and required more than 30 workers to move it into the computer room. It had a 12.5-nanosecond clock, eight 64-element vector registers, and 1 million 64-bit words of high-speed memory, and could execute more than 80 million floating-point operations per second (80 megaflops). In 1989, after 12 years of service, the Cray-1A was powered off. In 1996, Silicon Graphics Inc. (SGI) acquired Cray Research; the Cray business later re-emerged as the independent Cray Inc., which continues to provide high-performance supercomputers.
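The quoted figures hang together arithmetically: a 12.5-nanosecond clock period corresponds to an 80 MHz clock rate, and at one floating-point result per cycle — an assumption made here purely for illustration; the Cray-1’s separate add and multiply pipelines could exceed this at peak — that rate yields 80 megaflops. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the Cray-1A figures.
# Assumption (for illustration only): one floating-point result per clock cycle.

clock_period_s = 12.5e-9              # 12.5-nanosecond clock period
clock_rate_hz = 1.0 / clock_period_s  # cycles per second, i.e. about 80 MHz

results_per_cycle = 1                 # assumed sustained rate
megaflops = clock_rate_hz * results_per_cycle / 1e6

print(round(clock_rate_hz / 1e6))  # clock rate in MHz
print(round(megaflops))            # millions of flops per second
```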
In 1968, Gordon Moore and Robert Noyce left Fairchild Semiconductor to form Intel, which focused initially on memory. Their first product, the 3101 Schottky bipolar random access memory (RAM), was released in 1969. A Japanese calculator manufacturer named Busicom then asked Intel to build a chip set it had designed for a desktop calculator; the design called for 12 separate chips. Engineer Ted Hoff, assigned to the project, decided that a single programmable chip would work better. At Intel, Hoff had access to a new silicon-gate process capable of higher densities than other semiconductor manufacturing techniques. By combining the calculator application with the new process, he and his team were able to build the world’s first microprocessor. By 1971, the 4004 microprocessor was in production and on the market.
The microprocessor market soon grew competitive: AMD, founded in 1969, emerged as a viable rival to Intel, and Sun Microsystems later developed the SPARC chip for its workstations.