When a lot of us hear the word “gooey,” we think about sticky buns or creamy sugary fillings (yum). Others think “GUI”, as in “Graphical User Interface.” A GUI is what computer types call the system of icons, taskbars, and other objects that our computers use to display and access information. A few of us may wonder how the GUI came to be. We remember the halcyon days of DOS prompts and command line interactions; some of us then take an aspirin and lie down. Others continue to wonder how exactly we got from esoteric UNIX, CP/M, and DOS commands on green screens to playing with pretty pictures and colorful desktops.
This article tells how the GUI came about. It starts with the (possibly apocryphal) story of how Cro-Magnon Glug accidentally developed the GUI (along with the personal computer). It then takes us through the better-documented days of Ivan Sutherland’s Sketchpad, Xerox’s PARC lab, Alan Kay’s Smalltalk, and the (possibly even more apocryphal) stories of the rivalry between Jobs’ Apple and Bill Gates’ Microsoft that gave us the Windows and Mac GUI-driven OSs of today. Along the way we’ll learn about the memex, the first wooden mouse, “bit-blitting,” the Xerox Star, the Apple Lisa, and what really happened that momentous day in the PARC labs when Steve Jobs and company paid a visit, notepads in hand…
Indeed, some people think the GUI was bestowed upon an eager mankind by Steve Jobs, simply because Apple’s Macintosh line of computers was the first place most of us ever encountered a graphical interface. It was for me. I first worked with a GUI around 1984 when my computer friend proudly introduced me to his brand new Mac, complete with a neato video game involving a guy digging holes to trap predatory spiders and a totally cool program called MacPaint. I remember to this day creating a beautiful work of MacPaint art that another friend promptly dubbed “Mondrian Waffles.” Too bad I didn’t know how to print it out.
But today I work almost completely inside a GUI environment; even while typing this document, I click an iconized button to create a font effect or save the document. Every time I surf the Web, I make use of other people’s GUI creations. I click on the buttons that take me to new places; I admire the graphics that someone created; I snort at the poorly designed GUI structure. You do the same things, whether on a Mac, a Windows PC, or something else. It’s a GUI world after all. But is it a gift from Steve Jobs? Or is the story more complex?
Let’s find out. Oh, and feel free to enjoy a sticky bun while you read. It seems somehow appropriate.
The Origins of the GUI
“What I saw in the Xerox PARC technology was the caveman interface, you point and you grunt. A massive winding down, regressing away from language, in order to address the technological nervousness of the user.” — an IBM technician lambasting the Apple Lisa’s GUI
Once upon a time, way back in the Stone Age, lived two cavemen, Ugh and Glug. Ugh was a handsome, sportsy, outdoorsy type, with a stunning physique and the mental capacity of a waterbug. His pal Glug was just the opposite: nervous, toothy, skinny (except for the little pot belly he’d earned from eating too many delivery pizzas), and smart. Damn smart. One day, while Ugh was out hunting game and posturing for the caveladies, Glug was sitting morosely in his cave, absently scratching under his loincloth and telling himself that he was not jealous of that lout Ugh. Directly over his head, a chunk of rock trembled and shook as it pulled free from the ceiling. It snapped free and dropped directly onto Glug’s pointy head with a thunk.
When Glug awoke, it was to the annoying feeling of water being sluiced into his face, and the equally annoying scent of Ugh’s aftershave – Mammoth Musk. Ugh was bent over him, concern in his perfect blue eyes. “Glug okay? Rock whack Glug.”
Glug shoved his friend out of the way and said, “Give Glug room! Must think!” Glug dashed from the cave and into the grassy meadow beyond the cave entrance. Panting, he snatched up shards of coconut shells, wads of tall grass, smooth flat rocks, and several mammoth bones left over from breakfast. He lugged his haul inside the cave and, almost as an afterthought, picked up the rock that had knocked him unconscious earlier. Ugh looked wonderingly at his friend, then went outside to stare at bugs (he was endlessly fascinated). With a pointy stick, Glug began to draw strange diagrams in the dirt. Ugh peeked inside hours later and, seeing his friend engrossed in his odd cave drawings, shrugged and went to the third cave over to spend the evening with his ladyfriend Oohlala.
Days passed. Ugh quickly learned to stay out of the cave, as Glug snarled and threw bits of rock and coconut shell at him if he entered. Besides, it was more fun staying with Oohlala. On the fourth day, Glug came outside for the first time. He squinted at the harsh sunlight, then motioned for Ugh to come inside. Ugh shoved Oohlala off of his lap and followed his friend inside the cave.
In the corner, hidden by a bearskin, was a squarish object. Dull light seemed to come from underneath the animal skin. Wonderingly, Ugh approached the object, ready to bolt at the first odd noise or threatening motion. Glug said, “Ugh behold!” and whipped the bearskin off of the object.
Whatever it was, it was built of coconut husks and pieces of mammoth bone, with a large hollowed-out coconut shell sitting atop the large, flattish hunk of rock that had brained Glug earlier. Hanks of grasses were twisted together and shoved into the backs of the various objects. A flat rectangular rock covered in tiny river barnacles sat in front of and slightly below the main body of the object, connected to the main housing with a braid of grasses. Dark light poured from the large, centrally placed coconut shell. Ugh stared in fearful awe at Glug’s creation.
“Look, Ugh!” Glug exclaimed. “Glug build wonder thing!” As Ugh gaped, Glug hunched over the flat, barnacle-studded rock and began poking it, apparently at random. But…lines, tiny bright green ones, appeared in the dark light of the coconut shell. Apparently the more Glug poked at the barnacle-covered flat rock, the more the green lines appeared in the shell.
After a few moments, Ugh began to grow impatient. Oohlala was waiting outside, after all. “What good this thing?” he asked. “Not good to eat. No good for wearing in snow. Maybe good for whacking mammoth….”
“No, no, no!” Glug snarled. “Can do things. Count mammoths Ugh kill. Watch.” He began to poke the barnacle board in short, sharp jabs, and one by one, lines appeared in the shell. “One…two…here one you kill last month…here one that you chase over cliff…here one that washed up in river…” He kept poking, and finally he stopped. “See? Ugh kill seventeen mammoths since first thaw. Good to know, uh?”
Ugh was interested, despite himself. “Can Ugh use?” Glug motioned him to sit in front of the shell and placed Ugh’s dirty fingers on the barnacles. “Okay, Ugh pay attention. First, must log in.”
“Log in?”
“Log in…we create Ugh a user name. We call it…Ugh.” Over Ugh’s shoulder, Glug poked at the barnacles, to Ugh’s mounting confusion. “Now Ugh can get in system. Good. Now we go to C: prompt….”
“See prompt? Where? Ugh confused.”
“Ugh wait. Good, we in. Now we access file directory.”
“File directory? User name? See prompt?” Ugh shoved himself away from the strange object and stood up, towering angrily over Glug. “Glug confuse Ugh! Make head hurt! You want Ugh to use strange thing, give Ugh pretty pictures! Bright colors! Point and click, not barnacles and see prompts! Ugh leaving!”
Ugh shoved Glug aside and made his way to the cave entrance and outside. He failed to notice that he had shoved Glug too hard, and poor Glug went reeling headfirst into the cave wall. For the second time in a week, Glug’s skull impacted solid rock.
It was dark outside when Glug finally awoke. Ugh was nowhere in sight. Glug sat up, cradling his injured head in his hands. “Whoof, head hurt big time,” he muttered to himself. “But Glug have better idea!” He found his pointed stick and began drawing in the dirt again. “Him want bright colors, pretty pictures, Ugh, him get bright colors, pretty pictures. Glug call it…Graphical User Interface. But Glug not tell Ugh that. Him hardheaded as cave bear, fancy name scare Ugh. What me call it?” Glug thought and thought. Finally he decided to go for the simple, straightforward approach. “Ugh make baby talk OK,” Glug thought. “Just stick with baby noises. Glug call it…GUI. Gooey. Even Ugh smart enough to say that one. Gooooooeeeeeey.”
“There is no reason anyone would want a computer in their home.”
— Ken Olsen, President, Chairman and Founder of Digital Equipment Corp., 1977
The story is part of the computer world’s mythology. December 1979, an ordinary afternoon: young computer whiz and entrepreneur Steve Jobs leads a band of his homeys into the rarefied enclave of Xerox’s Palo Alto Research Center (PARC). Jobs and friends tour the plant with wide-eyed admiration, doing their best Norman Rockwell gee-whiz kids-in-the-dugout impression (“Golly, Dr. Kay, can we have your autograph, huh?”), while behind guileless eyes and black-framed glasses, mental notes are being taken and schematics memorized.
Jobs leads his friends out of the building, waves bye-bye to the nice lab geeks inside, and dashes back to his shabby warehouse, where he and cohort Steve Wozniak stuff every idea and process they can remember from the Xerox tour into their new product, the Macintosh. Xerox is befuddled, Microsoft’s Bill Gates is enraged, and Apple gets the jump on everyone with a new dance craze, the GUI. “Do the GUI” sweeps the computer world and everybody else scrambles to get on the gravy train. Gates takes the thievery one step beyond Jobs’ own and brings out the Apple-clone Windows, Microsoft does a pas de deux with the local judiciary to dodge an Apple lawsuit, Windows takes over the world, and Apple is relegated to cult status among the renegade hackers and Mac addicts of the computer industry.
Nice story to read your kids to sleep with. It has everything: drama, criminal behavior, ruthless rivalry between former associates – everything except sex (which the stereotyped computer geeks are unfamiliar with, anyway). Hell, it would even make a good David Allan Coe drinking song. The only problem with it is that it isn’t true.
The Real History of the GUI
The real history of the Graphical User Interface is more complex and interwoven than the simplistic “It Takes a Thief” conception.
“So we went to Atari and said, ‘Hey, we’ve got this amazing thing, even built with some of your parts, and what do you think about funding us? Or we’ll give it to you. We just want to do it. Pay our salary, we’ll come work for you.’ And they said, ‘No.’ So then we went to Hewlett-Packard, and they said, ‘Hey, we don’t need you. You haven’t got through college yet.'”
— Apple Computer Inc. founder Steve Jobs on attempts to get Atari and HP interested in his and Steve Wozniak’s personal computer
From Small Seeds…
Apple was founded by Steve Jobs and Steve Wozniak in Jobs’s garage in 1976. Jobs and Wozniak had met as young electronics tinkerers in Silicon Valley (Wozniak was working at Hewlett-Packard at the time) and began their collaborative careers by building (Wozniak) and selling (Jobs) “blue boxes,” illegal devices that scammed free phone calls from Ma Bell. Both shared an interest in the “primitive” computers of the time and enjoyed cobbling together electronic goodies with solder and breadboards. Eventually they decided to start a company and build computers that wouldn’t take up an entire basement, didn’t need supercooling, and didn’t require platoons of guys in jumpsuits to take care of them. In other words, they envisioned building personal computers for the masses. Of course, neither Jobs nor Wozniak was the first to think of personal desktop-sized computers (common wisdom gives that honor to the MITS “Altair,” a 1975 kit-based creation running Microsoft’s BASIC interpreter and built around Intel’s 8080 chip), but that’s another story. They put their heads together and decided to call their company Apple.
In March 1976, Wozniak built the first Apple, the Apple I. It was a cobbled-together curiosity made of circuit boards and LED displays stuffed into a wooden box, but it stirred enough interest in the computing community to inspire Jobs and Wozniak to found Apple on April Fools’ Day, 1976 to sell their little beasties. Jobs sold his VW minibus, and Wozniak his HP scientific calculator, to finance the startup. They only managed to sell about 200 of the Apple I’s, so the fledgling company – now consisting of Wozniak, Jobs, and a few friends/employees – used the money they managed to raise from Apple I sales to start work on the Apple II (Wozniak has reputedly said that a large part of his desire to build the Apple II was due to “Breakout,” a classic video game he had designed for Atari. Wozniak wanted to program it for a PC). In 1977 the Apple II debuted, featuring a sleek plastic case (as opposed to the “orange crates” that housed the Apple I’s), game paddles, and color graphics on the video display. Being descendants of Ugh, people were fascinated by the bright colors and the flickering images, and the Apple II began to move off the shelves.
Jobs realized that he had started something that could mushroom into a serious business concern, so he took on more employees and more workspace, and buckled down to the task of meeting the sudden consumer demand for his goodies. When Apple added the inboard floppy disk drive in 1978 (abandoning the slow and clumsy tape storage facility), the II’s sales really took off, and Apple was suddenly at the crest of a wave of interest in personal computing. Never mind that many novices bought an Apple II without a clear idea of what to do with it… the mere concept of the average Joe being able to own and operate a “personal computer” was catching people’s imaginations.
WILL SOMEONE PLEASE TELL ME WHAT A PERSONAL COMPUTER CAN DO?
— 1982 Apple Computer ad
Which brings us to Jobs’ infamous trek to Xerox’s PARC facility. Actually, we need to look further back in time to set the stage for Jobs’ visit.
The 40s and 60s – GUI Forefathers: Bush, Engelbart, and Sutherland
Let’s back up to 1945 (!) and a visionary named Vannevar Bush. Bush, a scientist and futurist, published an essay called “As We May Think” in The Atlantic Monthly, describing the “memex,” a desk-sized device that would use what we’d now call hyperlinks to bring information to every user’s fingertips.
Bush’s ideas sparked some visionary thinking in a scientist named Douglas Engelbart. As early as 1962, while Jobs and Wozniak were still drinking Ovaltine and watching Saturday morning cartoons in their jammies, Engelbart was creating several items of interest to the personal computing crowd that would follow. He invented the first “mouse,” which he called an “X-Y Position Indicator,” a little gizmo housed in a wooden box on wheels that moved around the desktop and took the cursor with it on the display. Engelbart saw the mouse as an integral part of a “graphical windowed interface,” and built an early windowed GUI that fascinated co-workers but wasn’t considered useful outside the lab. In 1968 Engelbart demonstrated NLS (oNLine System), a hypermedia groupware system that used the mouse, the windowed GUI, hypermedia with object addressing and linking, and even an early version of video teleconferencing to wow its audience of technicians, engineers, and scientific types at the Fall Joint Computer Conference in San Francisco.
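The core trick Engelbart hit upon is still how every mouse works: the device reports relative X-Y movements, and the computer accumulates them into an on-screen cursor position. Here’s a minimal sketch of the idea in Python; the names, screen size, and numbers are illustrative assumptions, not details of any historical system.

    # Toy model of an "X-Y Position Indicator": each movement report
    # (dx, dy) nudges the cursor, clamped to the display boundaries.
    SCREEN_W, SCREEN_H = 1024, 768

    class Cursor:
        def __init__(self):
            # start the cursor at the center of the screen
            self.x, self.y = SCREEN_W // 2, SCREEN_H // 2

        def on_mouse_move(self, dx, dy):
            # accumulate one relative movement, staying on-screen
            self.x = max(0, min(SCREEN_W - 1, self.x + dx))
            self.y = max(0, min(SCREEN_H - 1, self.y + dy))

    cursor = Cursor()
    cursor.on_mouse_move(12, -5)  # wheel motion becomes cursor motion
    print(cursor.x, cursor.y)     # 524, 379

In Engelbart’s prototype, two perpendicular wheels supplied the dx and dy; balls and optical sensors later did the same job, but the basic loop has barely changed since.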
However, Engelbart was not the only visionary in the history of the GUI. In 1963 a grad student at MIT, Ivan Sutherland, submitted as his thesis a program called “Sketchpad,” which let the user directly manipulate objects on a CRT screen with a light pen.
“Sketchpad pioneered the concepts of graphical computing, including memory structures to store objects, rubber-banding of lines, the ability to zoom in and out on the display, and the ability to make perfect lines, corners, and joints. This was the first GUI (Graphical User Interface) long before the term was coined.”
— from a Sun Microsystems biography of Ivan Sutherland
The idea of direct manipulation of objects on a screen is integral to the concept of a graphical interface. In fact, the idea of a GUI derives from cognitive psychology, the study of how the mind perceives and processes information. The idea is that the brain works much more efficiently with graphical icons and displays than with words – words add an extra layer of interpretation to the communication process. Imagine if all the road signs you saw were uniform white rectangles, with only the words themselves to differentiate the commands, warnings, and informational displays. When the “Stop” signs hardly look different from the “Resume Highway Speed” signs, processing the signs’ messages becomes slower and more difficult, and you’d have even more wrecks than you have now.
Combine this with Alan Kay’s concept of “biological computing,” where computer components function like organic “cells,” either independently or in concert whenever appropriate, and you have an idea of the thinking behind both modern computing, and the GUI.
The 70s – Smalltalk and Xerox
“The best way to predict the future is to invent it.”
— informal PARC slogan
The underground buzz stayed underground, but Engelbart’s and Sutherland’s creations were not lost on the creative fellows at Xerox’s PARC facility. PARC was (and is), at least in some respects, a computing “think tank,” where brilliant and brilliantly erratic minds cranked out ideas and tried, with varying success, to implement them on the workbench.
In the early 70s, as part of a (sadly abortive) project called “Dynabook” that envisioned notebook-sized, hyperlinked computers, Alan Kay and others developed an interactive object-oriented programming language called Smalltalk. Kay had previously worked with a team at the University of Utah that developed a programming system called Flex. This was a design for a flexible simulation and graphics-oriented personal computer, with many ideas derived from the Norwegian-developed Simula programming language, another programming language called LISP, and Sutherland’s Sketchpad. Kay also borrowed ideas from a highly graphical language called Logo, which was designed to teach programming to children. Smalltalk featured a graphical user interface (GUI) that looked suspiciously similar to later iterations from both Apple and Microsoft.
Smalltalk didn’t stop with an innovation in user interface: it featured a multi-platform virtual machine years before the folks at Sun came up with Oak/Java; object orientation (an approach that had actually first appeared in the Simula-67 language in the late 1960s); overlapping “windows”; and the first instance of bit-blt, or “bit-blitting” – bit block transfer, which is, in simplistic terms, the operation that copies rectangular blocks of pixels and makes on-screen objects quick to draw and move. The last two were contributed by Dan Ingalls. A lot of observers feel that Smalltalk’s clean, easy-to-use interface has yet to be surpassed even today. The first program to be written under Smalltalk was Pygmalion, which is most notable for demonstrating that computer programming could be graphically based and not restricted to text; the idea of using icons to stand for data was reflected in Pygmalion.
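To make “bit-blitting” a little more concrete, here is a toy sketch in Python, assuming screens and icons are simple row-major pixel grids; a real blitter adds clipping, masks, and logical raster operations (AND, OR, XOR) and runs in hardware or tight machine code.

    # bit-blt ("bit block transfer"): copy a rectangular block of pixels
    # from a source buffer into a destination buffer at (dst_x, dst_y).
    def bitblt(src, dst, src_x, src_y, width, height, dst_x, dst_y):
        for row in range(height):
            for col in range(width):
                dst[dst_y + row][dst_x + col] = src[src_y + row][src_x + col]

    # Draw an 8x8 "icon" onto a 16x16 screen at position (4, 4):
    screen = [[0] * 16 for _ in range(16)]
    icon = [[1] * 8 for _ in range(8)]
    bitblt(icon, screen, 0, 0, 8, 8, 4, 4)

Dragging a window or an icon is then just repeated blitting – copy the block to its new position and restore the pixels at the old one – done fast enough to feel instantaneous.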
The first real-life, usable GUI appeared in Xerox’s Alto computer, which debuted in 1973 and was envisioned as a smaller, much more portable replacement for the mainframes of the time. The Alto didn’t have a GUI as you and I know it, but instead featured graphically driven applications; it was about the size of a Volkswagen (well, not quite, but the thing was big) and certainly not useful for the average user, even though it started its life showing an image of Sesame Street’s “Cookie Monster.” The Alto featured a bit-mapped display, which was essential for displaying graphics and WYSIWYG printing. Kay, David Canfield Smith, Bill Verplank, and others also developed iconic representations for various programs for the Alto, most notably the drawing program “Markup,” the text editor “Bravo,” and the painting program “Superpaint.”
In 1981, the design and concepts which gave birth to the Alto led to the development and production of the much more streamlined and more usable Xerox Star – the first true GUI-driven PC. According to Bruce Horn, an ex-Xerox employee who wound up working for Apple, the software architecture of Smalltalk and the Star was much more sophisticated than the Mac or Windows equivalents. While the Apple machines incorporated much of Xerox’s brainstorms, many of the most innovative and sophisticated ideas never made it into the Apples, mostly due to Apple’s insistence on keeping costs down. The Star featured the first “computer desktop,” as well as overlapping, resizable windows, and the sophisticated PARC mouse, a gee-whiz gizmo that ran with no moving parts and used light beams and a metal grid to track the cursor’s movement (though employees found that the mouse worked just as well on Levis as it did on the metal grid). The interface was known as WIMP – Windows, Icons, Menus, and Pointers. PARC’s consensus was that once these ideas were implemented on a wide scale, computing efficiency would increase dramatically.
1979 – Apple Visits PARC
Jef Raskin, a project manager with Apple, first told Jobs and Wozniak about the research being done at PARC. It’s a mistake to envision this scene as taking place in some deserted parking garage, with Raskin hiding in the shadows and doing his best Deep Throat impersonation. A closer scenario is that Raskin wanted to work more directly on a GUI, and put a bug in Jobs’ ear about the neato keeno work being done at PARC. Jobs was reluctant to go at first, but eventually Raskin – who wrote his master’s thesis on a WYSIWYG graphical interface back in 1967 and was seeing some of his ideas brought to fruition by the folks at PARC – piqued his interest.
At any rate, Jobs, who was first told by Raskin about the fun going on at PARC in 1976, decided that he wanted to bring a team of Appleniks into PARC and see what was causing such a buzz – but again, the idea of Jobs coming in like a kid touring Epcot with a tape recorder hidden under his shirt is mistaken. Apple negotiated a deal with Xerox; in return for a block of Apple stock, Xerox allowed Jobs and his team to tour PARC in December 1979, take notes, and implement some of the ideas and concepts being bounced around at PARC in their own creations. I’m not sure how Xerox felt about Apple subsequently hiring half – perhaps the better half – of PARC’s staff away from them, but the process was relatively above-board; no night kidnappings or bribes under the table at Jack In the Box. Xerox allowed Apple to use their ideas in their machines. As Wozniak says on his Website, “Steve Jobs made the case to Xerox PARC execs directly that they had great technology but that Apple knew how to make it affordable enough to change the world. This was very open. In the end, Xerox got a large block of Apple stock for sharing the technology. That’s not stealing outright.”
“The reason why Jobs got the reputation of being so brilliant in human-centered computing is because he neglected to tell anyone at PARC that his perceptive questions about GUIs and so on were drawn from his discussions of such things with Raskin at Apple a month or two earlier. He masterfully made it appear as though he was encountering bitmapped GUIs for the first time in his life instead of having discussed them with someone who had visited PARC himself.”
— Neil Franklin
At any rate, Jobs and the Apple guys came back from their PARC tour with stars in their eyes. They were entranced with the idea of a “windowing GUI” and loved the flexibility and power of Smalltalk. They had a new vision, and were determined to unleash it on the computing world ASAP. Development immediately began on the Apple “Lisa.”
1979 – Birth of Apple Lisa
Lisa is worth a paragraph or two on her own. Jobs and his buds envisioned Lisa (named, by most accounts, for Jobs’ daughter, and officially standing for Local Integrated Software Architecture) as the first of a new, GUI-based PC family, but developed her primarily for business use. It’s notable that the new product line came on the heels of the 1981 failure of the Apple III line, which was so flawed that it had to be recalled. Apple had some ground to recoup. The Lisa line featured the warhorse Motorola MC68000 microprocessor, which trundled along at 5MHz, boasted 512K of RAM (upgradable to 2MB), had every bell and whistle that the Apple design team could stuff inside her, and cost nearly $10,000. Lisa was rather large and clunky, though many veterans of the PC wars insist that she is still one of the most efficient and usable machines of her type ever built.
Initial development on Lisa began before the 1979 field trip to PARC (Raskin says that Lisa was first envisioned as a text-driven PC along the lines of the Apple II), but she didn’t appear on the market until January 1983. Eventually the cheaper, pared-down Lisa 2 appeared, but neither sibling did well on the market – they were too expensive, and the Apple II family was still riding high, even with the competition from other machines like the Commodore 64 and VIC-20, the IBM PC, and the Radio Shack TRS-80. Even later, after the Macintosh had begun to take the PC market by storm, Apple decided to unload some of their Lisa stockpile by repackaging it as the “Macintosh XL.” The buyers weren’t fooled, and many Lisas ended up in a Utah landfill. Interestingly enough, Lisa featured a set of integrated software called “7/7” that included a word processor, a spreadsheet, a chart builder, an outline manager, a project scheduler, a drawing program, and a modem communication utility. 7/7 may well have been the first integrated “works” package.
“A few months after looking at it [the Xerox Star] we made some changes to our user interface based on ideas that we got from it. For example, the desktop manager we had before was completely different; it didn’t use icons at all, and we never liked it very much. We decided to change ours to the icon base. That was probably the only thing we got from Star, I think. Most of our Xerox inspiration was SmallTalk rather than Star.”
— one of the Lisa development team
Jobs and the Lisa design team worked hard to integrate the Xerox/PARC concepts they had obtained into their own design. Lisa’s GUI was, indeed, based on Smalltalk as it ran on the Alto, but much of Lisa’s design was Apple’s own, including click-and-drag capability, and the pull-down menu — this according to Jef Raskin, who headed the Macintosh design team and should know, but other sources give the credit for click-and-drag and pull-down menus to PARC. Whether this is another example of PARC’s ideas being implemented at Apple, or it’s an example of side-by-side independent development is uncertain. As they say, it steamboats when it’s steamboat time. Apple also worked with psychologists, artists, teachers, and ordinary users to improve their interface. In one famous example, Apple provided a California elementary school with free machines for every student’s use. During the summers, the Apple programmers worked with the teachers and kids to enhance the software and the GUI, because they felt that kids gave the truest reaction to basic interface issues, e.g. “These menu things are cool!” or “That picture sucks!”
“The [Lisa] user will be able to carry out many functions simply by pointing to a picture of what he wants done rather than typing instructions.”
— Time Magazine, 1983
1984 – Mac Arrives
Jobs was no longer the only alpha male in the Apple pack (if he ever was). Apple’s management – soon to include John Sculley, the corporate executive brought in to reshape Apple into a “grown-up” business – took Jobs off the Lisa project because of Jobs’ poor project management skills, and turned him loose on the next Apple project, a slimmed-down and considerably cheaper “daughter” of Lisa, eventually to be known as the “Macintosh.” The Mac was named for team leader Jef Raskin’s favorite variety of apple, but spelled differently to avoid a clash with audio manufacturer McIntosh. Under development since September 1979, the Mac lost much of Lisa’s bulk and price tag (the first Mac sold for $2,500), and was the first popular PC to feature a graphical user interface. The Mac also bundled MacPaint, which brought computer “art” design to the average user (and not unimportantly, sold the average user on the mouse), and MacWrite, a simple word processor that was the first WYSIWYG product of its kind on the consumer market.
Raskin left Apple in 1982, but the Mac team labored on, and the Mac hit the market in January 1984, heralded by the famous “1984” commercial that aired during the Super Bowl and depicted the Apple PC demolishing the gray, faceless world of IBM computing. Prophetic. Many average users fled screaming from the aggravating world of the DOS command line to the friendly Mac GUI, and while power users and DOS fans dismissed the Mac as a Playskool product, the Mac’s user-friendly interface made friends throughout all levels of the computing community. Later iterations of the Mac boosted the underpowered 128K of RAM, giving it the gumption it needed to compete with the button-down IBM machines. In 1985, Aldus released its desktop publishing app, PageMaker, for the Mac, and the Mac suddenly became everyone’s PC of choice for graphic arts and desktop publishing. GUIs were all the rage (later made even more tasty by the addition of color displays in the Mac II), the Mac ruled the PC universe, Microsoft was scrambling to catch up, and all was right with the world. Even though Jobs had been forced out of Apple in May 1985 by no-fun CEO Sculley, Apple was riding the tiger.
There are supposedly reliable sources that claim everything original in the Macintosh was cooked up at PARC and transposed wholesale into the Mac; other equally “reliable” sources claim that the Mac is virtually a homegrown Apple creation, with very little influence from PARC-generated concepts. Both ideas are wrong; it’s plain that the Mac is a product of intense cross-fertilization between both creative sources. As Raskin says, “The years of study, thinking, and experimentation by many talented people on the Macintosh project – and elsewhere – have gone largely unreported, though they led to the breakthroughs that made the Macintosh and the systems that have been built since its introduction so much of an improvement over what went before. Against this complex reality we have the powerful mythological image of Jobs drinking from a Well Of All Knowledge, having an ‘aha!’ experience and coming back at full cry to Apple to create a fantastic project.” In fact, the Lisa owes more of a creative debt to the PARC designs than does the Mac. Many of Lisa’s features were borrowed wholesale from PARC, down to the fonts and their nomenclature. As Raskin notes, “We were somewhat more pure while I was running the Mac project.”
“The future lies with a graphical windowing interface, mouse cursor control, pull-down menus, dialog boxes, and the like [and computers based on such interfaces] are destined to take over the IBM PC and compatible world as well.”
— W.F. Zachmann, 1987
In the Meantime at Microsoft…
Meanwhile, in the Pacific Northwest, a great evil was stirring… Oh, please. To cast Microsoft and its head honcho Bill Gates as the Great Satan, or as Sauron to Apple’s brave little band of hobbits, is ridiculous. The two founders, Jobs and Gates, are much more alike than they are different. Neither one is a lily-white altruist just trying to bring personal computing to the masses, nor is either a black-moustachio’ed villain bent on destruction. While I doubt either Jobs or Gates would recognize a code of ethics if it hit them in the mouth, neither one belongs on the Ten Most Wanted List, either. Both wanted to carve out a place for themselves in the PC market, both were willing to cut corners to get what they wanted, and both were tremendously successful at what they did.
“640K ought to be enough for anybody.”
— Bill Gates, 1981 (possibly apocryphal)
Microsoft began just as small and insignificantly as Apple did. Starting out as a two-man operation out of the backseat of Bill Gates’s car, Gates and cohort Paul Allen saw the MITS Altair and in the span of a month had a BASIC interpreter ready to go for the beastie. The code wasn’t tested until they demonstrated the program for MITS, and Allen’s first time even touching an Altair was when he entered the code into MITS’ machine. MITS bought the product – the first programming language written specifically for a personal computer – and Allen joined MITS as Director of Software. By July ’75, BASIC 2.0, a Microsoft creation, was running the new, more powerful Altairs. The name “Microsoft” wasn’t chosen until November ’75.
ALLEN: “We would almost always overestimate our competitors’ ability to compete.”
GATES: “Or we’d assume that they were going to execute competently.”
— from a 1995 interview with Bill Gates and Paul Allen
1977 – Microsoft and Apple Team Up
Allen rejoined Microsoft in time to christen the company’s new offices in Albuquerque. In early 1977 Microsoft licensed its BASIC (later known as Applesoft BASIC) to Apple for the flat fee of $21,000, which turned out to be a steal of a deal, as Apple sold over a million computers with Microsoft’s BASIC running the show (Wozniak actually wrote the integer BASIC for the early Apples). By the end of 1979, Microsoft had participated in porting both the FORTRAN and COBOL languages to microcomputers, moved to Washington State, entered into agreements with ASCII Corporation of Japan, and expanded into Europe. The two-man operation was now employing 40 people and bringing in over $7 million. Microsoft’s congenial association with Apple continued into the 1980s, with Microsoft releasing the Z-80 SoftCard for the Apple II in 1980. The SoftCard allowed the Apple II to run most of the CP/M programs then featured on most smaller computers.
Interestingly, Microsoft was also working out the details of a secret deal between themselves and Big Blue for a new operating system, which they called DOS (Disk Operating System). MS-DOS (which was spawned from an operating system called Q-DOS, written as a CP/M knockoff by Seattle Computer Products and bought by Paul Allen in 1980) appeared as the operating system for the first IBM machine, the IBM PC, in August 1981. Since Gates had insisted on keeping the rights to MS-DOS for his company, he was able to license the operating system to any number of “clone” computer and application manufacturers. IBM made an effort to keep DOS to themselves by releasing machines that ran their own version, PC-DOS, but with Microsoft’s willingness to license MS-DOS to all comers, PC-DOS never caught on beyond IBM’s own machines. As late as 1993, IBM was still trying to market PC-DOS as a viable alternative to the Microsoft operating systems, but by then DOS was waning in market appeal (mass-market users liked the various GUIs and had little use for further command-line interfaces). At the end of 1981, Steve Jobs paid a visit to Microsoft to give them a look at the embryonic Mac, and authorized Microsoft to develop apps for the new, GUI-based system. From 1981 to 1984, Microsoft folks were all over the Apple labs, working alongside Apple techs to develop applications for the Mac. In the process, Microsoft acquired an intimate familiarity with the inner workings of the Mac design.
A note on the above: Microsoft’s DOS 1.0 code structure was virtually a clone of Digital Research’s CP/M 1.4 operating system… one source calls it a “bug-for-bug” copy. Digital Research (DRI) began working on an updated version for 16-bit computers called CP/M-86, to be used with machines featuring Intel’s 8086 processor; unfortunately for DRI, CP/M-86 wasn’t ready for prime time when IBM came looking for an operating system, and IBM went with Microsoft’s DOS instead. In 1982, Digital Research finally released CP/M-86, and converted it into their own DR-DOS system in 1987. Digital Research raised legal threats against Microsoft over the CP/M–DOS imbroglio, but the dispute fizzled. One source very hostile to Microsoft alleges that Microsoft did their level best to sabotage DR-DOS when it was released, including making spurious claims that Windows would not run under DR-DOS, hustling their own updates to MS-DOS onto the market to cut the legs out from under Digital Research’s product, and using illegal marketing practices to force PC manufacturers to use Microsoft’s own system in lieu of DR-DOS.
Naturally, this isn’t the only version of this story, but the bare facts are that DR-DOS never impacted the market in the way that Microsoft’s competing MS-DOS did, at least partially due to Microsoft’s energetic and possibly underhanded attempts to push their own system over DRI’s. Digital Research later sold DR-DOS to Novell in 1991. After attempting to integrate it into their own networking operating system and releasing versions under the name “Novell DOS,” Novell sold it to Caldera in 1996, almost three years after Novell’s final attempts to work with DR-DOS. Caldera transformed DR-DOS into an open-source product called OpenDOS. Caldera also sued Microsoft for illegal marketing practices over the DR-DOS affair, and Microsoft settled the lawsuit out of court in January 2000. Had CP/M-86 been ready for use when IBM came calling, it’s possible that Microsoft would never have gotten the “in” with IBM that propelled it to glory, and we’d all be cussing Digital Research today, instead of Microsoft. Who can say?
Two months before the Macintosh officially hit the market, in November 1983, Microsoft announced that it was working on its own GUI-based operating system (actually, a “shell” that rode atop the DOS OS) to be known as “Windows” (Gates wanted to call it “Interface Manager,” but slicker heads prevailed). Microsoft had already caused a stir in April ’83 by giving a “smoke-&-mirrors” demo of their prototypical Interface Manager, using overlapping windows to simulate multiple programs running simultaneously. IBM executives were not happy with Microsoft’s little toy, as they were working on their own DOS-based program manager, to be called “TopView.” Gates had tried repeatedly to interest IBM in Windows, and was rebuffed each time; IBM felt that the interest in GUIs was a passing phase. TopView was released in 1985 and discontinued in 1987; its windowing ideas influenced IBM’s much more noticeable OS/2, even though a GUI-driven version of TopView was never made public.
Windows 1.0 made its official debut almost two years after it was announced, in November 1985. Apple was stunned by the similarities between the Mac and Windows interfaces, but as there were almost no applications available for the Windows environment (Aldus’s PageMaker for Windows was a notable exception), Win 1.0 came and went on the consumer market without much fanfare. The failure of Win 1.0 to capture a decent market share, along with plateauing Mac sales, caused some to wonder if the GUI craze was a fad that had peaked. Ironically, in light of the bad blood to come between the two companies, Microsoft’s Excel (a GUI-based spreadsheet that did what earlier spreadsheets like VisiCalc did, but was far easier to use) gave the Mac much-needed viability at this time.
1983 On – Other GUIs Hit the Market
Were the Mac and Windows GUIs the only ones on the market? Hardly. In fact, the first consumer-oriented GUI for the IBM PC was made by neither company, but by VisiCorp, the makers of VisiCalc. Called VisiOn, it debuted in October 1983, shortly before the Mac, but was crippled by the lack of popular software written to run under it. The same story can be told of DRI’s GEM (Graphical Environment Manager), which appeared in September 1984 and disappeared shortly thereafter, partially because it, like VisiOn, lacked the ability to run DOS apps and had no software of its own. Worse luck for GEM: Apple didn’t like GEM’s similarity to the Mac desktop, and threatened to sue. Rather than fight, DRI revamped the GEM desktop to get Apple off its back. Both VisiOn and GEM had their proponents, but neither made a major dent in the consumer market, which continued to be dominated by the twin monoliths Apple and Microsoft.
And there was Quarterdeck’s DESQView, the first program to bring multitasking and windowing capabilities to a DOS environment. DESQView wasn’t a full-fledged GUI OS, but its GUI “shell” over DOS won many fans and intrigued many folks at Microsoft, including Gates, who by some accounts based his first iteration of Windows as much or more on VisiOn, GEM, and DESQView than on the Mac interface (this conflicts with the stories passed around the campfires of the Apple fans, who portray Gates as a petty thief who snarled to his Windows team, “Make it look just like a Mac!”). Berkeley Softworks’ GeoWorks (GEOS) is another GUI OS worthy of note; it was used on the Commodore 64, some Apple IIs, and still survives in an altered form as software for the PalmConnect system. GEOS was lauded as a slick, stable operating system, but the lack of software for it – developer software did not appear for six months after GEOS’ debut – ensured that most PC users never gave it a second thought.
Apple was not happy at all with Windows. Even before the system appeared on the shelves, Apple was threatening Microsoft with lawsuits that alleged patent infringement, intellectual theft, what have you. In an ingenious move, Microsoft signed a licensing agreement with Apple that stated Microsoft would not employ Apple technology in Windows 1.0, but made no such agreement for further versions of Windows. It took a while for Apple to realize that Microsoft had thoroughly skunked them; the realization took longer to hit because of Windows’ dismal failure on the consumer market.
Nevertheless, both Apple and Microsoft forged ahead with their own plans for world domination… er, rather, their plans to expand their niches of the PC market. As always, though, these two were not the only bands marching in the parade. In 1985 Commodore launched its Amiga line of home PCs, and won the hearts of millions of users. The Amiga was the first PC to truly introduce the idea of “multimedia” into PC-dom, although since most users didn’t know what to make of their new multimedia capabilities, they played games on it instead. Great-looking games. The Amiga’s advanced sound and video capabilities went along with its sophisticated GUI-driven OS (which also featured preemptive multitasking, shared libraries, messaging, scripting, multiple simultaneous command-line consoles, a real use for the right mouse button, and other features not found in the Apples and IBMs of the day). To add insult to injury, the Amiga featured Apple/IBM interface emulation: Apple or IBM users who preferred their old interface could have the Amiga mimic that look instead of its own.
So why didn’t Amiga wipe both Apple and IBM/Microsoft off the PC market? As usual, we have a patchwork of reasons. The best guess is that Amiga made the same mistake the Tucker passenger auto made… it got too far ahead of its time too fast, and couldn’t take advantage of its own capabilities. The heated competition between Amiga and Atari worked to Microsoft’s advantage, as did Amiga’s spotty record of keeping dealers and customers happy. Adding to Amiga’s problems was the first machines’ failure to settle on a single GUI (one Amiga user tells me that the early models had different interfaces depending on which program was running). But whatever the reasons, the Amiga was one sharp puppy, and deserved a better fate – though today Amiga is neither gone nor forgotten; a new OS called “The Digital Environment” is being touted as the next step in GUI-driven operating systems. We may hear from Amiga again before all is said and done.
Yet another mid-80s contender in the GUI wars was the Atari ST. Atari, much better known for their video games, produced a PC that featured the GEM OS. Like the Amiga, the ST couldn’t compete with the big boys, nor could it compete with Amiga for gamers, but its sophisticated sound processing capabilities earned it a niche with audio editors and musicians.
Sometime around the debut of the Amiga, the first UNIX GUI appeared as well. Many UNIX heads had long sneered at the simple-minded (yet somehow still convoluted) operating systems and playtoy PCs that were populating the consumer market. But some UNIX users decided to see if they could overlay a GUI on UNIX in the same fashion as Microsoft overlaid Windows atop DOS, and thus X was born. X (sometimes called “X Windows,” and sometimes incorrectly called “X for Windows”) emerged from MIT, fathered by a Stanford University windowing system called W and mothered by Sun’s “SunView” environment. X became the main graphics system for most RISC-based UNIX operating systems. While X was a well-written and easily handled GUI shell, it never settled on a particular “look and feel,” and as a result at least three different interfaces, or “window managers,” floated around for it.
This isn’t the main reason why X never caught on much outside the UNIX community, but it’s certainly one reason. X is still a viable GUI shell, and has a relatively small but vocal following. X is making something of a resurgence among UNIX users: the battle between window managers has shaken itself out, the interfaces are more polished and easier to use, and X is very useful for high-end computer graphics production. X is also the underlying GUI layer for most Linux graphical interfaces. Those “several interfaces” are more correctly known as the various *nix window managers, and users can run desktop environments such as Gnome or KDE on top of them for additional functionality. X-driven interfaces are popping up in such non-PC devices as TiVo, Web pads, and PDAs, and one X user speculates that as these devices become more widespread, we may see X actually being used more than either the Apple or Windows GUIs.
It’s also worth mentioning that Three Rivers Computing Company manufactured a graphics workstation called PERQ in 1981 that incorporated a UNIX-based GUI, and was marketed in the U.K. by ICL. This GUI actually predates all of the above, including VisiOn, but as far as I know, it was never made available for personal computers.
Completists will point out that IBM’s MVS (Multiple Virtual Storage) mainframe system included an optional program known as ISPF (Interactive System Productivity Facility) that allowed split-screen windows to be supported on terminal displays. Considering that ISPF was created in the late 1970s, it’s one of the first “windows-like” systems that became available. Of course, it’s highly unlikely that home PC users would have ever seen this.
It’s worth noting that many, many graphically driven applications were released independently of any of the abovementioned systems. One of the very first was Bill Budge’s Pinball Construction Set, which appeared in 1983 for the Apple II (and soon after for the Atari) and quickly became famous among both gamers and programmers for its sophisticated ability to manipulate objects using click-and-drag. Programs like PCS made their mark on the operating systems that followed them into the PC marketplace.
“I had an enormous reservoir of goodwill towards Microsoft because it and it alone – unlike Xerox, Apple, Amiga and many others who tried before it – was the one that finally delivered a usable graphical interface on ubiquitous, inexpensive hardware. Microsoft often wasn’t the first, and its software wasn’t often the best, but it was inarguably the one that delivered on the early promise of personal computing in a way no other software maker did. Microsoft – more than any other company – opened up computing for ordinary people. I loved Microsoft for that.”
— Fred Langa
Back to the big guys. December 1987 saw the release of Windows 2.0, to the consternation of Apple but the yawns of the consumer. Although Win 2.0 looked more like the Mac than ever, with icons representing files and programs, cascading windows, and the like, Mac users weren’t leaving the Apple flocks to buy the IBM/Windows machines (especially since the hunky Mac II’s were all over the shelves). Apple hemmed and hawed a few more months, and finally sicced the long-threatened lawsuit on Microsoft, claiming that Windows stole the Macintosh’s “look and feel.” 1988 saw buyers all but ignore Apple’s GS/OS for the Apple IIGS, but the Mac continued to dominate the market. By 1989, the general buzz was that Windows was a mammoth flop. Microsoft continued to work with IBM in developing the fully graphical OS/2 system, but kept pounding on Windows, hoping to eventually get one version right. As the cliché says, “even a blind squirrel finds an acorn every now and then.” It was about to be acorn time in Redmond.
Windows in the 90s
“I think Windows 3.0 will get a lot of attention; people will check it out, and before long they’ll all drift back to raw DOS. Once in a while they’ll boot Windows for some specific purpose, but many will put it in the closet with the Commodore 64.”
— John Dvorak, 1990
A great, orchestrated hullabaloo welcomed Windows 3.0 to the market in May 1990. Steve Ballmer led the chant of “Windows! Windows! Windows!” at Microsoft HQ in Redmond, Washington; the great beast that was, and is, Microsoft’s marketing machine took care of the rest of the world. Microsoft unveiled dozens of applications written specifically for Windows at the same time it released the new version, which now featured the OS/2-style “sculpted buttons” (credit to icon designer Susan Kare, who also worked on the Mac, for the much more appealing button styles), more color support, improved multitasking, and a much-improved program manager, among other things. These new features and fresh software releases finally got the market’s attention. Impelled by the popularity of its own Win-compatible versions of Word and Excel, and numerous other third-party apps, Microsoft sold over 3 million copies of Win 3.0 in its first year of release, and Apple was feeling the chill. Win 3.1 (April ’92) added scalable TrueType font support and better multimedia capabilities, and Apple was on the run. For the first time, Windows-equipped PCs were outselling the Macs. Windows 3.11, called “Windows for Workgroups,” did relatively well in the corporate world, as well as bequeathing much of its design to later versions of Windows.
1993 saw the first version of Windows NT (New Technology), which abandoned the crash-prone underpinnings of its predecessors for a new, much more stable kernel. NT started out as a new version of IBM’s OS/2 system, part of Microsoft’s and IBM’s joint venturing. It was originally known as OS/2 3.0 or OS/NT, but during early development Microsoft and IBM split, and Microsoft walked away with the program, combining IBM’s OS concepts with their own, rewriting the code, and eventually releasing it under the Windows umbrella. Problem was, Microsoft’s marketers couldn’t decide what to do with it. Obviously it was most useful for business users, so, being Microsoft, they tried to sell it to anyone but business users. It quickly became known as “Windows No Thanks,” and catty observers decided that Microsoft had shot itself in the foot. Not so fast… it turns out that a lot of people who had been using UNIX decided to give NT a go, and liked what they saw. By early 1995, many European corporations had shifted over to version 3.5, the second “official” version of NT. By mid-1995, NT had established itself among technical and business users, and by the time the “bulletproof” (read: virtually crashproof) version 3.51 was available, Win NT was firmly entrenched. NT worked very well in the corporate and office environment, but less so in the home: it wasn’t engineered to run older DOS-based software, which made it the wrong choice for gamers and folks with less-than-cutting-edge software.
The lawsuit wasn’t going well, either. Apple’s strategy was to prove that Windows had illegally copied the “look and feel” of the Mac GUI, but that strategy sprang some significant leaks after the Microsoft lawyers pointed out that both systems “borrowed” liberally from the original Xerox concepts. To Jobs’ accusation of theft, Gates made the damning retort, “No, Steve, I think it’s more like we both have a rich neighbor named Xerox, and you broke in to steal the TV set, and you found out I’d been there first, and you said, ‘Hey, that’s no fair! I wanted to steal the TV set!'” The fact that Windows’ interface design looked, if anything, more like the old Alto GUI than the Mac designs didn’t help Apple’s case. Suddenly Microsoft was the buzz, and Apple seemed to be yesterday’s news.
Apple’s Torrid Ride
The long-running lawsuit was finally settled in Microsoft’s favor in June 1993, and the doomsayers thought that Apple’s fate was sealed. Wozniak and Jobs were long gone. The company was in financial trouble (though the reports of imminent bankruptcy were wrong). Their long-anticipated “Newton” personal digital assistant was a bust. Management seemed more interested in fighting among themselves than righting the company. Orders went unfilled due to production problems. Some predicted that Apple would fade into complete irrelevance when, in August 1995, Microsoft unveiled its groundbreaking Windows 95 OS. Win 95, the first version of Windows to really take advantage of Intel’s powerful 32-bit chips, and a near-clone of the Mac GUI, seemed to be the irresistible force destined to finally run Apple out of business once and for all. Apple tried to recoup by pushing its “Performa” line of low-end PCs (basically older, repackaged Macs) over its higher-end Power Macintosh line, and failed miserably – Performas sat gathering dust in the Apple warehouses, while buyers found it difficult, if not impossible, to get hold of the Power Macintoshes they wanted. More and more first-time users chose Windows-driven PCs over Apples, in large part because the fierce competition between the Windows-clone manufacturers was keeping the Windows machines’ prices relatively competitive, while Apple’s relentless refusal to let others manufacture clones (only partially loosened in 1994, and yanked again in 1997), its embarrassing quality-control problems, and its comparatively high sticker prices soured many buyers on the Apple name. Microsoft’s decision to slap a modified version of the Win 95 interface onto Win NT 4.0 boosted the NT platform’s popularity, and detracted that much more from the Apple market share.
In October 1994, IBM tried unsuccessfully to yank some of Microsoft’s market share with the third version of its own operating system, OS/2 Warp. OS/2 was originally a Microsoft/IBM joint venture, but Warp was IBM’s own offering, and featured a Windows-like GUI. It managed to stay afloat and win some loyalty, but it never really became anything more than a weak alternative for IBM-machine users who didn’t want to use Windows. And while we’re on the topic of weak alternatives, now’s the time to give a sardonic nod to Microsoft Bob, the “next-generation” GUI that rode atop Windows and “assisted” novice users with happy, chatty cartoon guides. Bob tanked hard, and became a figure of fun among the computer cognoscenti. Its only legacy was the equally annoying Office Assistant, the “dancing paper clip” that currently plagues Microsoft Office.
“Software is getting to be embarrassing.”
— Alan Kay
Meanwhile, Apple wasn’t cored just yet. Steve Jobs had founded a company called NeXT, and while the NeXT computer failed in the marketplace, NeXT’s sleek and sophisticated operating system, NeXTStep (built on the Mach kernel, with UNIX underpinnings and a fabulous GUI), was quite attractive to Apple. Apple’s Mac operating system was showing its gray hairs, and Apple wanted something new and glitzy to throw up against Microsoft’s monolithic offerings. In December 1996, Apple bought out NeXT, thus acquiring NeXTStep, elements of which would turn up in the new Mac OS, Rhapsody. Jobs came along for the ride, and it wasn’t long before he was again at the helm of the company he had founded.
Apple fans weren’t happy with the return of their hero for long. In August 1997, Jobs announced a formal liaison with Microsoft, to the dismay of the rank and file. Microsoft bought $150 million of Apple stock, and both companies agreed to once and for all end the GUI dispute. Many disgruntled Apple users, already disturbed by Apple’s continuing inability to build enough machines to fill orders, along with Apple’s failure to license the Apple system to clone manufacturers, leapt off the Apple bandwagon once and for all; it’s no coincidence that the new surge of interest in “alternative” OS’s such as Be and Linux began about the same time as Jobs’ perceived “sellout” of Apple to Microsoft. Certainly the ex-Apple minions fleeing their former home didn’t start the Linux/Be/etc. buzz, but they made their contribution, especially when Apple yanked its support from the Be platform, originally designed to run on the PowerPC.
Note, though, that most Appleniks didn’t dive overboard solely because of the “sellout” of Apple to Microsoft. According to some stories, the announcement hit Apple fans the way a preacher naming Beelzebub head deacon would hit a church congregation. But the more knowledgeable in the Apple community understood that Microsoft’s commitment meant more stability for Apple, as well as continued development of MS Office for the Mac; Microsoft’s $150 million purchase was a small fraction of Apple’s stock base, and its shares were strictly non-voting. The real reasons why so many Apple fans were disaffected are much more complex.
Apple gained ground with the successful release of Mac OS 8 “Platinum,” a popular and stable OS. Unfortunately, some of that ground was lost in the confusion that followed. The original plan was to give Apple users a “next-generation” system called Copland, due in mid-1996; instead, Apple squelched the project in favor of working with the newly acquired NeXT technology. Apple then announced a new system under development, Rhapsody, which would layer elements of the NeXT OS on top of a UNIX core. The aforementioned OS 8 appeared in July 1997, featuring some of the more touted elements from Copland. Rhapsody, too, failed to materialize, eventually transmogrifying into the OS X project, announced in May 1998. An upgrade to the “Platinum” system, OS 9, was released in October 1999, and the next-generation OS X finally shipped in March 2001, delivering many of the features promised for Rhapsody.
In 1998 Apple reinvented the PC market with the now-ubiquitous iMac. The iMac, with its cutesy color scheme and user-friendly design, won the hearts of many users and the scorn of many reviewers. “Serious” Apple users gravitated to the ever-more-powerful PowerMac line, but millions chose to perch the cute little iMacs on their desks. Along with the PowerMac and the PowerBook laptop/notebook PCs, the iMac gave Apple the recharge it so badly needed; it now ships with either OS 9 or OS X installed. The trend of other folks copying Apple continued, with eMachines drawing Apple’s fire for an all-in-one PC that looked and acted like the iMac. This time Apple prevailed.
Today…
In Redmond, Microsoft barrels along. Its hyped-to-the-max release of Windows 98, launched in June 1998, failed to live up to expectations: users who had been led to expect a revolutionary product were annoyed when Win 98 proved to be more upgrade than revolution. Feelings are much the same about Windows 2000, or “Win 2K,” the latest, and last stand-alone, iteration of the NT line, and even more so about Windows Millennium, which, after all the ballyhoo settled, turned out to be little more than a minor upgrade to Win 98. Microsoft is betting its OS fortunes on the integration of the 9x and NT lines in its upcoming Windows XP system. Microsoft hasn’t fared so well in its latest court outing, with the Department of Justice doing what Apple was unable to do: obtaining a ruling that Microsoft was a monopoly acting against the best interests of the market and of competition (though as I write this, the DOJ has abandoned its attempts to break up Microsoft). Apple, meanwhile, continues to push the latest iteration of the Mac OS, OS X, and promises a new version, 10.1, by the time you read this. Both corporations are poised to continue their dominance of the PC market for the foreseeable future, though UNIX fans remain firm and Linux is steadily gaining ground.
And the future of the GUI? Well, considering that well over 90% of the world’s users employ one GUI-faced OS or another, considering that new GUI-driven OSs such as Be seem to be catching on, and considering that this season’s darling, Linux, usually runs one of several GUIs shipped by distributors such as Caldera, Corel, and Red Hat, the future of the GUI seems secure. Comrade Gates and others have proposed a much more “involved” interface, with voice recognition, touch screens, retinal and fingerprint scans for security, holographic representations, and virtual “avatars” that interact with the user much more directly than, say, that damned MS Office paper clip. Ugh would be pleased. I’m not sure what Glug would think.
Note: I am indebted to the many people who responded with commentary, corrections, and criticism of the original version of this article. In particular, Jef Raskin was of enormous assistance through the revision process; I appreciate both his cooperation and his patience. The many respondents on the Slashdot message boards were also very helpful, as were the dozens of people who took the time to e-mail me with their own commentary and enlightenment. Thanks to one and all.
– MT, September 7, 2001
Frequently Asked Questions (FAQs) about the History of the GUI
What was the first graphical user interface (GUI)?
The first graphical user interface was developed by researchers at the Xerox Palo Alto Research Center (PARC) in the 1970s, on a machine called the Xerox Alto. The Alto was a revolutionary development in computer technology, introducing features such as windows, icons, and menus that could be manipulated with a pointing device. It never reached the mass market, but it laid the groundwork for the GUIs we use today.
How did the development of GUI impact computer usage?
The development of the graphical user interface significantly impacted computer usage by making computers more user-friendly and accessible to non-technical users. Before GUIs, users had to type in commands to perform tasks on a computer. With the introduction of GUIs, users could now interact with computers visually through icons and windows, making computers easier to use and understand.
What role did Apple and Microsoft play in the evolution of GUI?
Both Apple and Microsoft played significant roles in the evolution of the graphical user interface. Apple brought the GUI to the commercial mainstream, first with the Apple Lisa and then, far more successfully, with the Macintosh. Microsoft followed suit with Windows, which became the most widely used GUI in the world. Both companies contributed to the development and popularization of the GUI, making it a standard feature of modern computers.
How has GUI evolved over the years?
The graphical user interface has evolved significantly over the years. Early GUIs were simple and utilitarian, with basic windows and icons. Over time, GUIs have become more sophisticated and visually appealing, with features such as 3D effects, animations, and transparency. Today’s GUIs also support touch input, allowing users to interact with devices through gestures.
What are some examples of modern GUIs?
Modern examples of graphical user interfaces include the interfaces used in Windows 10, macOS, and various Linux distributions. These GUIs feature advanced graphics, animations, and effects, and support a wide range of user inputs, including mouse, keyboard, and touch. Mobile operating systems like iOS and Android also use GUIs, designed specifically for touch input.
What is the difference between a GUI and a command-line interface (CLI)?
A graphical user interface (GUI) allows users to interact with a computer or device using graphical icons and visual indicators, as opposed to text-based commands used in a command-line interface (CLI). While GUIs are generally more user-friendly and intuitive, CLIs offer more control and flexibility to experienced users and are often used for administrative tasks.
How does a GUI work?
A GUI works by providing a visual way for users to interact with a computer or device. It translates users’ actions (like clicking a mouse or pressing a key) into commands that the computer can understand. The GUI then provides visual feedback to the user, showing the results of these commands.
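To make that cycle concrete, here is a toy sketch in Python of the loop at the heart of every GUI. The event names, handler table, and redraw routine are invented for illustration; no real windowing system exposes exactly this API, but all of them run some elaborated version of this pattern.

```python
# A toy event loop: wait for an input event, translate it into a command,
# run the command, then give the user visual feedback.
from collections import deque

state = {"clicks": 0}

def on_click(event):
    # Translate the raw mouse click into a command: bump the counter.
    state["clicks"] += 1

# The handler table maps kinds of user actions to commands.
handlers = {"click": on_click}

def redraw():
    # A real GUI repaints pixels; this sketch just prints the new state.
    print(f"[screen] button label now reads: Clicks: {state['clicks']}")

# Simulated input queue; an OS fills this from mouse and keyboard drivers.
events = deque([{"kind": "click"}, {"kind": "click"}, {"kind": "keypress"}])

while events:
    event = events.popleft()              # 1. wait for the next user action
    handler = handlers.get(event["kind"])
    if handler:
        handler(event)                    # 2. translate it into a command
    redraw()                              # 3. show the user the result
```

Note that the keypress event has no registered handler and is simply ignored, while every pass through the loop ends with a redraw: the user always sees the current state, which is the "visual feedback" half of the bargain.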
What is the future of GUI?
The future of the graphical user interface is likely to involve more natural and intuitive forms of interaction. This could include voice control, gesture recognition, and even virtual or augmented reality interfaces. As technology continues to evolve, we can expect GUIs to become even more immersive and user-friendly.
What are the advantages and disadvantages of GUI?
The main advantage of a GUI is its ease of use, especially for non-technical users: GUIs are visually intuitive and do not require users to memorize commands. The trade-offs are that GUIs consume more system resources, and that repetitive or bulk tasks that a command line can script in seconds may demand many manual steps in a GUI, which is why experienced users often find CLIs faster for such work.
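To make that trade-off concrete, here is a minimal sketch of the kind of bulk task that favors the command line: renaming a whole batch of files programmatically rather than one click at a time. The "reports" directory and the report_*.txt naming pattern are hypothetical, invented for the example.

```python
# Rename every report_*.txt in a folder to archived_*.txt in one pass:
# a few lines of script versus dozens of point-and-click renames in a GUI.
import glob
import os

for old_path in glob.glob("reports/report_*.txt"):
    folder, name = os.path.split(old_path)
    new_path = os.path.join(folder, name.replace("report_", "archived_", 1))
    os.rename(old_path, new_path)
    print(f"renamed {old_path} -> {new_path}")
```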
What impact did GUI have on software development?
The advent of the graphical user interface had a profound impact on software development. It led to the development of new programming languages and tools designed to create graphical applications. It also shifted the focus of software design towards user experience, making software more accessible and user-friendly.
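One example of that shift is the move from sequential, prompt-and-response programs to event-driven ones: the program registers callbacks and then surrenders control to the toolkit's main loop, letting the user decide what happens next. A minimal sketch using Python's standard tkinter toolkit, chosen here purely for illustration:

```python
# Event-driven GUI style: register a callback, then hand control to the
# toolkit's main loop. The user's clicks, not the program's flow of
# control, determine what code runs next.
import tkinter as tk

root = tk.Tk()
root.title("Event-driven demo")

label = tk.Label(root, text="Clicks: 0")
label.pack(padx=20, pady=10)

count = 0

def on_click():
    global count
    count += 1
    label.config(text=f"Clicks: {count}")  # immediate visual feedback

tk.Button(root, text="Click me", command=on_click).pack(pady=(0, 10))

root.mainloop()  # blocks here, dispatching events until the window closes
```

Nothing after root.mainloop() runs until the window closes; all of the program's behavior lives in callbacks. That structural inversion is exactly what GUI toolkits made standard practice.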
Mike is an educator, freelance writer, and self-taught PC user who maintains a Windows resource site at ToeJumper. His hobbies include basketball, politics, and spoiling his cats.