30 Years of Computer Technology
- Thursday, 04 January 2007
Throughout 2006, we’ve been celebrating the 30th anniversary of NASA Tech Briefs with special feature articles highlighting the past 30 years of a different technology category each month. This month, we conclude our anniversary coverage by tracing 30 years of computer technology.
We’ve all heard about inventions that will “change the world,” but how many of them actually do? An invention called the personal computer did, in fact, change the world, and continues to shape the way we live, work, and play. Computers, in various forms, have been around for more than 30 years, but it wasn’t until the 1970s that “microcomputers” — desktop-sized computers — emerged and changed the computing industry.
Mainframes & Supercomputers
Before the microprocessor, computers were essentially large, room-sized systems used by labs, universities, and large companies. In many cases, users never interacted with the computers directly — they prepared their tasks on punch cards, and the computations were performed in batches, often taking hours or days to complete. These mainframe systems handled bulk data processing, but they were too slow for time-critical calculations and the most demanding scientific problems.
In 1972, Seymour Cray, a computer architect at Control Data Corp., left his position to start his own company. He envisioned a computer that was faster and more powerful than any other system available — a “supercomputer.” In 1976, Cray Research delivered its first system, the Cray-1, to Los Alamos National Laboratory for a six-month trial. The original Cray system featured its famous horseshoe shape, which helped shorten the circuit paths, and offered vector processing, a fundamental processing method for supercomputing.
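The vector-processing idea mentioned above can be sketched in a few lines of code. This is purely an illustration of the concept, not how the Cray-1 was actually programmed: a single "vector" operation is applied to entire arrays of operands at once, rather than one element at a time as in a scalar loop. The `Vector` class below is a hypothetical toy, standing in for a machine's vector registers.

```python
class Vector:
    """Toy stand-in for a vector register: element-wise arithmetic
    is expressed as one operation over whole arrays."""

    def __init__(self, data):
        self.data = list(data)

    def __add__(self, other):
        # One "vector add" covers every pair of elements.
        return Vector(x + y for x, y in zip(self.data, other.data))

    def __mul__(self, other):
        # Likewise, a single "vector multiply".
        return Vector(x * y for x, y in zip(self.data, other.data))


a = Vector([1.0, 2.0, 3.0])
b = Vector([4.0, 5.0, 6.0])

# Scalar style: one addition per loop iteration.
scalar_sum = [x + y for x, y in zip(a.data, b.data)]

# Vector style: the whole addition is a single expression,
# mirroring a single vector instruction on the hardware.
c = a + b
print(c.data)  # [5.0, 7.0, 9.0]
```

On a real vector machine, the payoff is that the hardware pipelines the element-wise operations, so the per-element cost drops dramatically compared with issuing one scalar instruction at a time.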