Throughout 2006, we’ve been celebrating the 30th anniversary of NASA Tech Briefs with special feature articles highlighting the past 30 years of a different technology category each month. This month, we conclude our anniversary coverage by tracing 30 years of computer technology.

We’ve all heard about inventions that will “change the world,” but how many of them actually do? An invention called the personal computer did, in fact, change the world, and continues to shape the way we live, work, and play. Computers, in various forms, have been around for far more than 30 years, but it wasn’t until the 1970s that “microcomputers” — desktop-sized computers — emerged and changed the computing industry.

Mainframes & Supercomputers

Before the microprocessor, computers were essentially large, room-sized systems used by labs, universities, and large companies. In many cases, users never interacted with the computers directly; they prepared their tasks on punch cards, and the computations were performed in batches, often taking hours or days to complete. These mainframe systems handled bulk data processing well, but they were not fast enough for time-critical calculations and struggled with the most complex scientific problems.

In 1972, Seymour Cray, a computer architect at Control Data Corp., left his position to start his own company. He envisioned a computer that was faster and more powerful than any other system available — a “supercomputer.” In 1976, Cray Research delivered its first system, the Cray-1, to Los Alamos National Laboratory for a six-month trial. The original Cray system featured a famous horseshoe shape, which shortened the circuit paths, and offered vector processing, a fundamental processing method for supercomputing.

NASA's Glenn Research Center in Cleveland, OH, took delivery of an early Cray supercomputer in 1982. The early Crays featured a horseshoe design that enabled the integrated circuits to be closer together. (NASA-GRC)

The Cray-1A was delivered a year later to the National Center for Atmospheric Research (NCAR) at a price of $8.8 million. The supercomputer weighed more than 5 tons, arrived in two refrigerated vans, and required more than 30 workers to move it into the computer room. It had a 12.5-nanosecond clock, eight 64-element vector registers, and 1 million 64-bit words of high-speed memory, and it could sustain more than 80 million floating-point operations per second (80 megaflops). In 1989, after 12 years of service, the Cray-1A was powered off. In 1996, Cray Research merged with Silicon Graphics Inc. (SGI); the Cray business was later spun off and continues today as Cray Inc., still providing high-performance supercomputers.
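To make the idea of vector processing more concrete, the short C sketch below (an illustration only, not actual Cray code) shows a SAXPY loop, the kind of kernel vector machines were built to accelerate. A scalar processor issues a separate multiply and add for each element; a vector machine such as the Cray-1 loads up to 64 elements into a vector register and applies a single instruction to all of them at once.

```c
#include <stdio.h>

#define N 8   /* tiny for illustration; the Cray-1's vector registers held 64 elements */

/* SAXPY ("a times x plus y"): the classic vector kernel.
 * On a scalar machine each loop iteration is a separate multiply and add;
 * vector hardware performs the same operation on whole registers of
 * elements with one instruction, which is where the speedup comes from. */
void saxpy(float a, const float *x, float *y, int n)
{
    for (int i = 0; i < n; i++) {
        y[i] = a * x[i] + y[i];
    }
}

int main(void)
{
    float x[N] = {1, 2, 3, 4, 5, 6, 7, 8};
    float y[N] = {0};

    saxpy(2.0f, x, y, N);          /* y becomes 2, 4, 6, ..., 16 */

    for (int i = 0; i < N; i++) {
        printf("%.1f ", y[i]);
    }
    printf("\n");
    return 0;
}
```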

The Microprocessor

In 1968, Gordon Moore and Robert Noyce left Fairchild Semiconductor to form Intel, which focused initially on memory. Their first product, the 3101 Schottky bipolar random access memory (RAM), was released in 1969. A Japanese calculator manufacturer named Busicom then asked Intel to build a chip set it had designed for a desktop calculator. The design called for 12 separate chips; engineer Ted Hoff, assigned to the project, decided that a single programmable chip would work better. At Intel, Hoff had access to a new silicon-gate process capable of higher densities than other semiconductor manufacturing techniques. By combining the calculator application with the new process, he was able to build the world’s first microprocessor. By late 1971, the 4004 microprocessor was in production and on the market.

The microprocessor market soon became a crowded one, with AMD (founded in 1969) emerging as a viable competitor to Intel, and Sun Microsystems later developing the SPARC chip for its workstations.

Launching a Revolution

In 1974, Micro Instrumentation and Telemetry Systems (MITS) introduced a computer “kit” called the Altair, which sold for less than $400. Shortly afterward, a Harvard student named Bill Gates and his childhood friend Paul Allen developed a version of the programming language BASIC for the Altair. Gates would leave Harvard to form Microsoft with Allen in 1975, guided by a belief that the computer would become a valuable tool on every office desktop and in every home.

As the decade continued, other companies followed MITS into the personal computer market, including Tandy Corporation (Radio Shack), which introduced its first model, the TRS-80, in 1977. The computer featured a keyboard and a cathode-ray display terminal; it could be programmed, and users could store data on cassette tape. Also in 1977, Commodore International released the Commodore PET (Personal Electronic Transactor), which came completely assembled. Five years later, the company would introduce what would become one of the best-selling computers ever, the Commodore 64, which could be plugged into a television set for video gaming and helped revolutionize home entertainment.

Then, in 1976, Steve Jobs and Steve Wozniak decided to form a company to sell a computer that Wozniak had hand-built in Jobs’ parents’ garage. The machine was first shown at the Homebrew Computer Club, where Wozniak frequently attended meetings. Eventually, they built about 200 of the computers, which were sold as assembled motherboards with a CPU, RAM, and basic text and video chips; buyers had to supply their own case, power supply, keyboard, and display. A local computer store bought 50 units, paying $500 each. The computer was called the Apple I, and Apple Computer was incorporated in 1977.

The Apple II was introduced that year, featuring color graphics and an open architecture and selling for about $1,300; it helped Apple dominate the home computer market through the end of the 1970s. In 1983, Apple introduced a new computer called Lisa, the first commercially available computer to use a graphical user interface (GUI), which had been inspired by Xerox Corporation’s Alto computer. The GUI let users interact with their computers by selecting graphical symbols on the display screen instead of typing commands. Lisa, however, had a hefty price tag and a slow operating system, which kept it from becoming a commercial success.

Then, on August 12, 1981, IBM released the Personal Computer (PC), complete with a brand new operating system from Microsoft called MS-DOS (Microsoft Disk Operating System) 1.0. Both the computer and its operating system would become industry standards and revolutionize computing for mainstream users and consumers.

The IBM PC was based on an open, card-based architecture, which enabled third parties to develop add-on hardware and applications for it. It used an Intel 8088 CPU, a 16-bit processor with an 8-bit external data bus, and could accommodate up to 64 KB of RAM on the main board and up to 640 KB in total with expansion cards. With the success of both the PC and the Apple II, the personal computer market soared. Time magazine named the personal computer its “Machine of the Year” for 1982.

In 1981, Xerox drew on its original Alto computer to introduce the Xerox Star workstation, the first commercial system to incorporate technologies that are common in today’s personal computers: a bit-mapped display, a GUI, icons, folders, Ethernet networking, file servers, and a mouse. The Xerox system again would inspire Apple to create its next system — and its most successful — the Macintosh.

Mac and Windows

The skyrocketing PC market spawned a parade of companies in the early 1980s offering what were referred to as “IBM clones.” These included Digital Equipment Corp.; Compaq Computer, which was founded in 1982; and Gateway Computer, which debuted in 1985 with its distinctive cow-spotted boxes. Hewlett-Packard also released its first personal computer, the HP-85. Many of these companies were fueled by continuing improvements to the microprocessor from Intel, AMD, and Motorola, including Intel’s 16-bit 8086, introduced in 1978.

In 1984, a 19-year-old college student named Michael Dell started a company called PC’s Limited to sell IBM clones. Today, Dell ranks among the world’s largest computer companies; it pioneered the practice of selling personal computer systems directly to customers, bypassing the then-dominant model of selling mass-produced computers through resellers.

The next revolution in personal computing came in 1984, during Super Bowl XVIII. A commercial aired once during the game — and never aired again — depicting a dystopian world inspired by George Orwell’s 1984, equating the conformity of “Big Brother” with the IBM PC. It showed that world being destroyed by a new machine. It was the world’s first glimpse of the Apple Macintosh.

The Macintosh — or Mac, as it became known — was the first commercially successful mouse-driven computer with a GUI. The all-in-one system retailed for $2,495 and shipped with 128 KB of RAM. It had no internal hard drive, relying on a single 3.5" floppy drive, which helped reduce the system’s cost. Even so, the Macintosh was not a top seller until the arrival of desktop publishing in 1985, enabled by Apple’s LaserWriter printer, Adobe’s PostScript technology, and Aldus’ PageMaker publishing software. Desktop publishing took advantage of the Mac’s graphical capabilities and helped propel Mac sales to new heights.

In 1985, two more technologies helped propel the PC market even higher. Intel introduced the 386, its first 32-bit x86 microprocessor, and Microsoft released Windows 1.0, the first version of its Windows operating system and the company’s own GUI for IBM PCs and compatibles.

The rest of the 1980s and the 1990s saw a flurry of new releases from most of the computing leaders of the time, but the new buzzword was the “portable” computer. The term had been applied years earlier to Compaq’s portable PC in 1982 and Radio Shack’s TRS-80 Model 100 in 1983. Form factors and weights continued to shrink into the early 1990s, when IBM introduced the ThinkPad line of portables and Apple released the Macintosh PowerBook, which established the modern form of the “laptop” computer. It was light, had a longer-lasting battery than previous systems, and included a built-in pointing device.

Again, the computing giants followed suit, releasing new laptop computers that grew steadily smaller throughout the 1990s, helped in part by the advent of the CD-ROM drive, which was already being built into most PCs. In the late 1990s, the Apple iBook sold for $1,599, a relatively low price for a laptop at the time. IBM countered with the ThinkPad R30, which sold for $929 and featured a 13.3" screen, a 900-MHz Intel Celeron processor, a 10-GB hard drive, and 128 MB of RAM.

But computers would continue to get smaller thanks to the 1992 introduction of a product that had been in development for five years. The Apple Newton was the first PDA, or “personal digital assistant,” a term coined by Apple’s then-CEO John Sculley. The Newton never caught on, and Apple discontinued the line in 1998. By then, a new PDA had already arrived. Introduced in 1996, the Pilot from Palm Computing, soon renamed the PalmPilot, fit in a shirt pocket, stored thousands of addresses and appointments, and was inexpensive enough to appeal to a mass market. Palm succeeded where Apple did not, selling more than 1 million Pilots in the first 18 months; the device outsold cell phones, pagers, and even TVs, becoming the fastest-selling computer product up to that time.

Computing Today and Tomorrow

Alongside the development and evolution of the personal computer came the widespread use of operating systems such as UNIX, developed at Bell Labs (later part of Lucent), and Linux, begun by Finnish student Linus Torvalds as a freely distributed operating system.

This decade has seen the merger of Hewlett-Packard and Compaq, the emergence of carbon nanotubes as a transistor technology, dual-core processors, Wi-Fi, blade servers, iPods, wireless supercomputers, and Macintosh computers with Intel microprocessors. While space prohibits detailed descriptions of the various computer interfaces developed in recent years, such as USB and PCI, or the myriad peripherals and input devices, they too played an important role in the evolution of personal computing.

Today, we’re playing video games on computer consoles that are faster and more powerful than anything we could have imagined even 20 years ago. The computer age continues, with more inventions sure to come that will “change the world.”