CTA Centennial Part 5c: A Decade of Disruption – Personal Computing
This year marks the 100th anniversary of the founding of the CTA (Consumer Technology Association), which started out as the RMA (Radio Manufacturers Association). This is the fifth in a series of essays exploring and celebrating CTA’s and our industry’s first century of invention, innovation, and entrepreneurship, assembled from technology history research and writing I have done over more than 20 years, including an annually updated industry history for CTA’s now-defunct Digital America, two decades of CTA Hall of Fame inductee biographies, and numerous tech history articles for a variety of publications.
Here are the previous chapters:
- Part 1: Founding
- Herbert H. Frost: CTA’s “George Washington”
- Part 2: David Sarnoff
- Part 3: The TV Age
- Part 4: The Sixties
- Jack Wayman, Our Industry’s Indispensable Exec
- Part 5a: Decade of Disruption – Home Video
- Part 5b: Decade of Disruption – Personal Audio
While the VHS and Beta camps battled for control of consumers’ living rooms, a bunch of nerds in the Pacific Northwest embarked on what would become global domination of every other aspect of our lives. Except that these nerds had no idea their work would one day upend how we conduct nearly everything we do, since no one, including them, could conceive that any civilian would want or need a computer at home.
Before we delve into these historical PC development details, however, I’d like to acknowledge the importance of the year 1955 to the PC revolution. I was born in 1955, which means nothin’ to no one except me. But 1955 was coincidentally a huge year for the birth of an inordinate number of home computer industry giants. Born in 1955 were Steve Jobs, Bill Gates, World Wide Web creator Tim Berners-Lee, former Google CEO and Apple board member Eric Schmidt, Palm CEO Donna Dubinsky, Cisco cofounder Sandra Lerner, Pac-Man creator Toru Iwatani, Sun Microsystems cofounders Vinod Khosla and Andy von Bechtolsheim, Java “father” James Gosling…and Albert Einstein died.
In any event, while debates raged in the early 1980s about whether we needed a personal computer to work at home, there was no debate about whether we wanted a computer to play at home.
Computer at Play
As I related in Chapter 4, in September 1972, Magnavox introduced the first home videogame console, the Odyssey, which came with six video game cartridges, including Ralph Baer’s original Tennis game.
An Ampex engineer named Nolan Bushnell saw a demo of the Odyssey and Baer’s Tennis game and was impressed enough to quit his job and found a videogame company with just $250. Bushnell created a coin-operated arcade version of Baer’s paddle-and-ball game that he called Pong, manufacturing the arcade consoles through his new company, Atari, a term from the Japanese board game Go roughly equivalent to “check” in chess. Bushnell later would have to pay Magnavox royalties for Pong.
In 1975, Atari launched a home version of Pong, a controller with two paddle control dials that you jacked into your TV. The system was first sold in Sears stores that Christmas under the retailer’s Tele-Games label, then released under its own Atari brand the following year.
Pong’s insanely simple stick-figure graphics – a vertical dotted line bisecting the TV screen to represent the net, a short vertical line on either side of the screen for the paddles, which you moved up and down via the dials on the console, a square dot “ball” that traveled in straight lines and ricocheted off the “paddles” and the screen borders, and each player’s score at the top – look, by today’s sophisticated video game standards, like something a child sketched with a pencil on a piece of paper. But in 1976, Pong was not only the height of cool tech but also the center of attention at any social gathering. On the heels of Pong’s success, Atari began to market other games, such as video pinball and a racecar game called Speedway, all of which instigated a tsunami of home video game hardware and game entries.
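For readers who want to see the mechanics described above in modern terms, here is a minimal, hypothetical sketch in Python of the paddle-and-ball logic Pong popularized. The screen dimensions, function name, and paddle size are illustrative assumptions, not Atari’s original (hard-wired) design.

```python
# A toy version of Pong's core rule: the ball travels in a straight line each
# frame and reverses direction when it meets the top/bottom borders or a paddle.
WIDTH, HEIGHT = 80, 24          # hypothetical "screen" size in character cells

def step(ball_x, ball_y, vel_x, vel_y, left_paddle_y, right_paddle_y, paddle_h=4):
    """Advance the ball one frame and return its new position and velocity."""
    ball_x += vel_x
    ball_y += vel_y
    # Bounce off the top and bottom screen borders.
    if ball_y <= 0 or ball_y >= HEIGHT - 1:
        vel_y = -vel_y
    # Bounce off a paddle if the ball reaches that paddle's column and sits
    # within the paddle's vertical span (scoring is omitted in this sketch).
    if ball_x <= 1 and left_paddle_y <= ball_y < left_paddle_y + paddle_h:
        vel_x = -vel_x
    if ball_x >= WIDTH - 2 and right_paddle_y <= ball_y < right_paddle_y + paddle_h:
        vel_x = -vel_x
    return ball_x, ball_y, vel_x, vel_y

# Example: one frame of play with both paddles near mid-screen.
print(step(40, 12, -1, 1, left_paddle_y=10, right_paddle_y=10))
```

A real game would call a routine like this once per frame, move the paddles from the dial inputs, and redraw the screen.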
Videogame chip technology developed rapidly, and the devices from Atari, from Magnavox’s Odyssey line, and from new entry Coleco and its Telstar system became increasingly sophisticated in a short period of time. Soon after Pong was released, Bushnell tasked a young Atari technician named Steve Jobs with designing a game called Breakout, and Jobs enlisted his friend Steve Wozniak, then at Hewlett-Packard, to help build it. But Bushnell was over-extended and needed cash to keep Atari’s momentum going. In October 1976, Bushnell sold the still relatively new Atari to Warner Communications for an insane $28 million.
In 1977, Atari unveiled the $200 Atari VCS (Video Computer System), better known as the Atari 2600. Instead of playing a single built-in game, the Atari 2600 let you play multiple games that came on plug-in cartridges. The 2600 was an instant sensation and became the prototype for all videogame systems to follow. In the succeeding years, the Atari 2600 and its increasingly sophisticated successors were fed by home versions of popular arcade games such as Pac-Man, Asteroids, Donkey Kong, and Space Invaders. By the end of 1980, Atari accounted for more than a third of Warner Communications’ income and was the fastest-growing company in U.S. history.
That same year, Activision, formed by disgruntled Atari game programmers, became the first third-party videogame developer and soon was followed by dozens of others. New videogame hardware platforms were launched as well, such as Milton Bradley’s handheld Microvision (1979), Mattel’s Intellivision (1980), and Coleco’s ColecoVision (1982).
By the early 1980s, video gaming had become one of the biggest boom industries in history. By 1983, sales had risen to $3.2 billion ($8.5 billion today), a quarter of all U.S. households had a video game system, and Atari alone had sold 12 million 2600 systems.
Boom-Bust-Boom
But the euphoria didn’t last. Too much product, too much competition, and too many bad games caused the collapse of Atari’s leadership, and the other game console manufacturers toppled like dominoes. On December 7, 1982, Atari announced that sales of its systems would not meet expectations, and Warner Communications’ stock lost a third of its value. After rushing out an awful videogame version of the movie E.T. for that Christmas, Atari ended up plowing thousands of unsold game cartridges into an Alamogordo, New Mexico, landfill the following year.
With Atari’s actual and psychic collapse, the entire videogame industry followed suit. Coleco’s doomed Adam personal computer, released in 1983, helped push that company toward eventual bankruptcy. Mattel discontinued Intellivision. Atari was sold to Commodore founder Jack Tramiel, who had just left Commodore and who steered Atari into the personal computer business with disastrous results. With the coming of games such as Zork for more powerful personal computers and increasing competition for TV time from the VCR in the early 1980s, the video game balloon burst.
But video games were too compelling to disappear as CB radio had. A new video game boom banged big in 1985 with the U.S. introduction of the Nintendo Entertainment System (NES), known in Japan, where it had debuted in 1983, as the Family Computer, or Famicom. Nintendo was a nearly 100-year-old Japanese company founded as a playing card manufacturer. But by the early 1980s, now run by the founding family’s more forward-looking Hiroshi Yamauchi, Nintendo was a leading coin-op arcade console maker. Its eight-bit NES, spurred by the popularity of Super Mario Bros. and its sequels, raised the stakes for all game system makers. The NES soon was out-selling all its competitors 10-to-1.
The introduction of the NES ignited a series of can-you-top-this videogame products. Each game system manufacturer attempted to create systems and games more graphically spectacular than those that preceded them, a cascade of competing innovations that produced geometric advances in gameplay technology seemingly minute by minute. Like the lead in a championship basketball game, videogame leadership changed hands almost annually with every new system release.
The NES was followed by the Sega Master System (SMS) from a company called Sega. Originally an American coin-operated game machine maker that moved to Japan after World War II, the company merged with a jukebox firm in 1965 and was renamed Sega, a contraction of “Service Games.”
But Nintendo had superior distribution, and NES game developers were restricted from creating games for Sega, dooming Sega’s first entry into the consumer hardware market. By 1987, thanks to Super Mario Bros., The Legend of Zelda, and, later, Tetris, the NES had become the most popular game system in the world.
An Impersonal Computer
Video games were the first computers to get into homes, although consumers likely never considered their fun video game consoles as “computers.”
When someone said “computer” in the late 1970s and early 1980s, the image conjured up was of an impersonal machine with ill intentions, such as the perfection-seeking Nomad in the 1967 Star Trek episode “The Changeling,” the paranoid, murderous HAL 9000 refusing to open the pod bay doors for Dave in 2001: A Space Odyssey (1968), the pre-Skynet world-conquering computer in Colossus: The Forbin Project (1970), the deadly game-playing WOPR computer in WarGames (1983), or the unstoppable Terminator (1984). Even if a computer were useful in some way, why would we want one of these potentially killer devices in our homes? It was bad enough that computers had to be dealt with at the office, assuming they didn’t replace you and leave you jobless.
How the mainstream world viewed the computer had no impact on young computer aficionados. Most tech tinkerers in their teens and early 20s resembled the radio hobbyists of the post-World War I era – they simply wanted to see what they could accomplish with this new microchip technology, suddenly available and affordable outside huge companies and universities.
This new generation of hobbyists viewed their tinkering as an almost purely academic exercise with a tinge of one-upmanship, with little thought of commercialization, much less riches. Maybe their efforts would get them a job at IBM or HP, a computer science professorship at a prestigious research university like Stanford, a post with a military contractor or the FBI or the CIA if their morals and ethics allowed, or even a spot at the ultimate computer science employer – NASA.
The founding generation of the personal computer industry was largely inspired by a magazine cover: the January 1975 issue of Popular Electronics, which pictured what is considered the first personal computer, the Altair 8800. Powered by the just-released Intel 8080 chip, only the second 8-bit microprocessor, the Altair was created by ex-Air Force officer Ed Roberts and manufactured by his Albuquerque, N.M., company, MITS.
The Altair didn’t resemble anything like what we think of as a PC today – it looked more like a prop from a low-budget 1950s sci-fi film. The Altair was a rectangular box the size of a stereo receiver covered with toggle switches and flashing lights. It had no screen – it had to be connected to a display terminal, for which you first had to buy what was essentially a video card. It lacked a keyboard – you programmed it by flipping the toggle switches. It came assembled, or, if you were a true hobbyist, you could buy the Altair as a less-expensive DIY kit.
Whatever its practicality or drawbacks, the Altair sold better than its inventors expected – thanks in no small part to that Popular Electronics cover story. More importantly, the Altair inspired a fertile hobbyist culture, especially in the San Francisco Bay Area in the shadow of Stanford University and Palo Alto, resulting in the creation of several ad hoc organizations such as the Homebrew Computer Club in 1975. It soon seemed that every hobbyist with a garage in and around what became known as Silicon Valley was assembling some sort of personal computer.
Not every computer geek was looking to get a job working for someone else, however. The Altair inspired two long-time prep school friends from Seattle, Paul Allen and Bill Gates, to form their own software company, which they dubbed Micro-Soft. The pair moved to Albuquerque, N.M., to write programs for the Altair 8800, including a version of the BASIC programming language.
On April 1, 1976, two other Homebrew hobbyists, the aforementioned Breakout creators Steve Wozniak and Steve Jobs, also formed a computer company. To fund their new entity, the pair raised $1,300 by selling Jobs’ VW Microbus and Wozniak’s HP calculator, then begged for credit from local electronics suppliers.
Wozniak and Jobs named their company Apple in honor of Jobs’ work in an apple orchard in Oregon while attending college. At least that’s one story. Apple also could have been named for The Beatles’ record label – Jobs was a big Beatles fan. Or maybe it was an homage to math genius and World War II codebreaker Alan Turing, who was rumored to have died after biting into a poisoned apple, hence, supposedly, the bite mark in the Apple logo. Or maybe the name came from a combination of some or all of these sources, or none of them.
Anyway, Apple’s first computer was the Apple I, hand-built by Wozniak. Sold for $666.66 starting in July 1976, the Apple I essentially consisted of a single motherboard with a video interface but no display, ROM, and a small amount of RAM into which programs could be loaded. The Apple I sold well, earning Jobs and Wozniak $774,000, which the pair plowed into their next product, the Apple II, introduced in April 1977.
Considered one of the first true mainstream personal computers, the Apple II comprised the now-familiar setup of CPU, keyboard, and, eventually, disk drives, but it still needed to be connected to a color TV for a screen. Within a year, Jobs and Wozniak were unable to keep up with the overwhelming demand.
For the first and certainly not the last time, Apple’s sudden and surprising success sparked a bandwagon. The first Apple II follow-up was the complete Tandy TRS-80 – it included a monochrome CRT display – in August 1977 for $599.95 ($3,100 today), followed by the Commodore PET 2001 ($795, $4,100 today) in October. The PET was unironically named after another ’70s fad, the pet rock, then retrofitted as an acronym: Personal Electronic Transactor. These three machines were dubbed the “1977 Trinity” by Byte magazine and set off a PC development race.
By 1980, Gates and Allen had fixed the spelling of their company’s name, dropping the hyphen. In July 1981, Microsoft bought an operating system from a small outfit called Seattle Computer Products and started to convert it into what it called the Microsoft Disk Operating System, or MS-DOS.
Around the same time Microsoft was developing MS-DOS, several companies, including Atari, Tandy (again), Sinclair, and Commodore, had each introduced small personal computers. None of these nascent PC entries was very powerful. All required some level of comfort with programming, since all instructions had to be typed in, and many ran software exclusive to that particular PC. While these early PCs enjoyed some popularity, they were largely aimed at schools, small businesses, and hobbyists. Mainstream consumers were certainly a desired market, but there was still that nagging question of what practical use these personal computers would have at home, assuming you could figure out all that coding to make them useful.
Home purchase prospects improved somewhat when several entrepreneurs introduced more sophisticated personal computers that ran more universal software, such as MicroPro’s WordStar (1979), which let folks compose documents more easily and more creatively than they could on even an IBM Selectric typewriter, and the SuperCalc spreadsheet program, which enabled accounting and fancy finance calculations without paper, an adding machine, or a calculator.
Charlie Chaplin Sells Computers
But big business wasn’t buying, primarily because these early personal computer makers lacked an important brand imprimatur: IBM.
IBM, number 8 on the Fortune 100 at the time and the top non-oil/non-car maker on the biggest companies list, had dominated the business computing marketplace for decades. These new-fangled personal computers didn’t seem like serious business machines to corporate customers, and they originated from new or unfamiliar companies that lacked a business track record or even a corporate sales staff. The old saying around corporate IT offices was – and continued to be for some time – you never got fired for buying IBM.
At first, IBM ignored what it considered to be annoying PC gnats. Finally, the Armonk, N.Y., giant realized it needed to get into the personal computer business or be swamped by the rising tide of smaller, more nimble personal computer companies – and get in quickly, because many of its corporate competitors had introduced or were readying their own PCs. IBM’s PC resistance turned to panicked haste, which would eventually come back to haunt the company.
IBM decided it didn’t have the time to develop its own proprietary hardware and software. So, ill-advisedly, the world’s largest computer company used off-the-shelf – and easily replicated – components and technologies, including Intel’s new 8088 chip and, most importantly, the MS-DOS operating system, provided on a non-exclusive basis by Microsoft, alongside the more established CP/M operating system from Digital Research, offered as an option (which is a whole other story, told here).
Unlike Apple and the other new PC makers, IBM aimed its IBM PC squarely at the market it knew better than anyone: the office.
On August 12, 1981, IBM introduced the IBM PC.
Reams have been written about the birth, rise, spread, domination, and fall of the IBM PC – and how Microsoft brilliantly latched on, turning its ability to license MS-DOS to any other PC maker into the foundation of the giant it is today – so I won’t spend an inordinate amount of space on this story here. However, here is IBM’s own story on its PC, and here’s a more objective take on the creation of the IBM PC.
Suffice it to say, the IBM PC quickly conquered the office PC market, with ads featuring, for some reason, an ersatz Charlie Chaplin supposedly meant to represent Everyman.
Latching onto IBM’s coattails, a host of other established corporate computer companies such as DEC, NEC, Xerox, Epson, AT&T, and HP quickly produced what came to be known as “IBM clones” since these competitive PCs essentially contained the same components and technologies found inside the IBM PC, including MS-DOS. I myself ran an HP PC user magazine called Personal Computer during this period.
Just as the personal computer revolution started with a magazine cover, another magazine cover validated it. Instead of choosing its usual Man of the Year, Time magazine’s January 3, 1983, cover named the personal computer its Machine of the Year.
In late 1983, IBM itself tried to capitalize on the market it had created by unveiling the IBM PCjr, this time designed for the home market. But the PCjr was poorly designed, too underpowered compared to IBM’s office PCs, too expensive compared to the PCs offered by the new crop of home computer makers, and saddled with a derided, hard-to-type “chiclet” keyboard. Not only was the PCjr IBM’s home computing Edsel, but it left the company unable to compete in the growing home PC market and, finally, by extension, even the office market. IBM had been hoist by its own “clone” petard.
“Portable” PCs
Some entrepreneurs believed that computers shouldn’t and didn’t need to be desk-bound. A few months before the IBM PC was unveiled, the first “portable” PC was introduced, Dr. Adam Osborne’s Osborne 1, a 24-pound device with a five-inch CRT screen. Only 8,000 were sold in 1981, but sales jumped to 110,000 the following year. At one point, Osborne reported an order backlog of 25 months, but the company declared bankruptcy in September 1983.
The Osborne 1 was followed in 1982 by the Kaypro II “luggable” PC, mockingly nicknamed “Darth Vader’s Lunchbox,” and then by the first IBM-compatible transportable, the Compaq Portable, announced in late 1982. Similar luggage-sized machines also enjoyed sales success, including updated Compaq transportables through 1986.
Consumers’ only computing alternatives to these desktop or transportable hulks were cheap but still immensely popular computers such as the various Timex Sinclair models, including the original $100 Timex Sinclair 1000, and the $300 Commodore VIC-20, the first PC to sell a million units.
Then there was the Commodore 64, the best-selling PC of its era. Announced at CES in January 1982 and priced at $595, the Commodore 64 was often compared with the Ford Model T. It was marketed as a computer for everyone and was sold through mass retailers such as department stores rather than by specialist electronics or hobbyist dealers. Numbers depend on the source, but we’ll postulate that nearly 20 million Commodore 64 units were sold over its lifetime – it wasn’t discontinued until 1994.
One of the first commercially viable “laptop” PCs was announced in 1985 – the 12-pound Kaypro 2000. This hinged-top but still bulky “clamshell”-style portable offered a half-VGA-height 640 x 200 pixel monochrome LCD screen and ran MS-DOS. Several evolutionary steps followed, none nearly powerful or flexible enough to double as a home or workplace PC replacement, and none truly comfortable enough to prop on your lap.
A race to build smaller and lighter laptops soon ensued.
An Easier Interface
As popular as they had become, especially in the office, PCs required users to memorize and continually type a series of complex commands to operate them. In January 1983, Apple altered this awkward operational equation with the Lisa, named for Steve Jobs’ daughter. The Lisa deployed what was called a graphical user interface (GUI), building on the concepts Doug Engelbart demonstrated in his December 1968 “Mother of All Demos” and later developed at Xerox’s Palo Alto Research Center (PARC).
A GUI graphically mimicked a physical office desktop, complete with pictograph “icons” designed to represent filing cabinet drawers, manila folders, and paper documents. Instead of navigating your system via a keyboard and arcane typed commands, you used Engelbart’s mouse to “point and click” – you freely steered an arrow cursor around the screen and clicked on these icons to open programs or files, a far more visual and user-friendly way to conduct computer activities.
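To make the point-and-click idea concrete, here is a minimal, hypothetical sketch using Python’s standard tkinter toolkit – not Apple’s or Xerox’s actual code – in which clicking an on-screen “icon” opens a document window instead of the user typing a command. The window and document names are invented for illustration.

```python
# A toy point-and-click "desktop": clicking the icon opens a document window.
# This only illustrates the GUI concept described above; names are hypothetical.
import tkinter as tk

def open_document():
    # In a real GUI this would launch an application; here we just pop a window.
    doc = tk.Toplevel(root)
    doc.title("Quarterly Report")  # hypothetical document name
    tk.Label(doc, text="Contents of the document would appear here.").pack(padx=20, pady=20)

root = tk.Tk()
root.title("Desktop")  # the desktop metaphor
# A button standing in for a document "icon" the user can point at and click.
icon = tk.Button(root, text="Quarterly Report (document)", command=open_document)
icon.pack(padx=40, pady=40)
root.mainloop()  # the event loop waits for mouse clicks instead of typed commands
```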
While the Lisa was too expensive – $10,000 – and few were purchased, it provided Apple with the experience to develop a more affordable GUI-based PC.
In January 1984, Apple combined all it had learned about point-and-click mouse technology and announced the Apple Macintosh, the first mass-market consumer personal computer with a GUI, introduced by the now famously audacious Orwellian Super Bowl ad directed by Ridley Scott. In late 1985, Microsoft brought a GUI to IBM-compatible machines with the oft-derided first version of its Windows operating system, setting up a Mac vs. Windows GUI battle that would dominate the PC arena for the next decade-plus.
Both the Mac and Windows GUI operating systems introduced the concept of WYSIWYG – What You See Is What You Get, pronounced WHIZ-ee-wig. What you saw on the screen was what your printer would produce, a startling concept at the time. GUIs and WYSIWYG made creating documents such as invitations, flyers, and posters far easier using new GUI-based software such as Microsoft Word, which found its first big success on the Mac before being ported to Windows. Applications compelling enough to justify the hardware purchase came to be called “killer apps,” and they finally began to build a more convincing case for buying a PC for the home.
Personally, the Mac made me a happy corporate camper. I was running a new media R&D skunkworks at a large NYC publishing house at the time and absolutely HATED the IBM PC clone I was forced to use – and hated DOS even more. Having to remember all those C: prompt commands just drove me nuts. But I was absolutely forbidden by my corporate masters from acquiring a Mac. I didn’t care. When I got my Fat Mac – one with 512K of memory instead of the original’s skimpy 128K – in the fall of 1984, my co-workers jammed into my small office like the crowd admiring Motel’s new sewing machine in “Fiddler on the Roof” to get their first look at it. I wish I’d held onto that piece of PC history.
But I’m getting ahead of myself. In a couple of weeks, I’ll delve into the maturation and some of the consequences of the home video, personal audio, home computer, and other tech revolutions of the late 1970s and early 1980s. But first, we’ll wrap up this innovatively explosive era with a look at the portentous and consequential year of 1984.
See also: CTA Centennial Part 5b: A Decade Of Disruption – Personal Audio Revolution