Living History: Retracing the Evolution of the PC and PC Magazine
In 1982, a year of great promise for the budding personal computing market, PC Magazine published its premier issue. A machine introduced by IBM the year before had sent the market soaring into unforeseen territory. Over the next 20 years, PC Magazine chronicled the movements of this constantly evolving market and watched as the personal computer made its way from the basements of hobbyists to the business offices and living rooms of mainstream America. Editor-in-chief Michael J. Miller looks back on the history of this amazing invention.
1982: Hello, World
On August 12, 1981, a star was born. IBM introduced its Personal Computer to a market still in its infancy; little did anyone know that it would trigger the true beginning of the PC revolution. By early 1982, when the first issue of PC Magazine arrived on newsstands, the IBM Personal Computer was clearly a runaway hit.
IBM may have redefined the market, but it certainly wasn't the first to the party. In fact, the company felt it had waited too long. MITS had created the market back in 1975 with the MITS Altair. The Apple II, the Radio Shack TRS-80, and a whole series of computers based on the Z80 microprocessor and CP/M operating system soon followed. But these machines catered mostly to hobbyists and a few brave pioneers who figured out that programs such as VisiCalc and WordStar could make their work much easier.
It was big news when IBM, which dominated the world's mainframe computer market, decided to try its hand at selling to the personal computer market. At the time, its main competitors were Digital Equipment Corp., Hewlett-Packard Co., and other makers of minicomputers. All of these companies relied heavily on direct sales forces to sell their own technology to large business accounts.
As personal computers began to show up in business offices, William C. Lowe, laboratory director of IBM's Entry Level Systems (ELS) unit in Boca Raton, Florida, told IBM executives that the company needed to get into the new market and get there quickly.
The executives agreed, and in August 1980, they gave him approval to move forward with Project Chess. Lowe turned to Philip D. "Don" Estridge and a small team of engineers to create the IBM PC, then code-named Acorn. Because the team was short on time, it used a number of components created by outside companies. It chose the Intel 8088 chip because it wanted to use a 16-bit processor, not the 8-bit processors being used in other PCs on the market. For software, it approached PC software companies, eventually settling on BASIC and an operating system from Microsoft, then a five-year-old company.
These decisions resulted in a product that was both more powerful than its competitors and more open. With a list price of $1,565, the first IBM PC packed an 8088 CPU, 16K of RAM, and a single-sided, 160K floppy disk drive. Perhaps more important, it offered plenty of room for growth.
The PC was unveiled with great fanfare, and it began shipping in October 1981. By the time PC Magazine premiered in February 1982, the PC was a smash hit.
That first issue included a look at all the new PC products and accessories introduced at the third annual Comdex show. Tecmar, a company headed by Martin Alpert, introduced 20 PC accessories. Among them was "an expansion adapter that could pass for the PC System Unit's twin—until you peeked inside and saw a 5-million-character Winchester storage disk, where the PC has its diskette drives," according to PC Magazine. Yep, a 5MB hard drive; computer users could barely imagine such an immense amount of storage. Other new hardware included color displays; at the time, that meant text and crude graphics in various colors.
The growing software market was also reflected in that first issue. It featured a lengthy interview with Microsoft's Bill Gates, under the headline "The Man Behind the Machine." (To read this interview, as well as an interview with Estridge from the second issue, check out www.pcmag.com/20years.) And the first word processor for the PC earned the review title "Not-So-Easywriter."
By the end of the year, enough word processors were on the market—including Volkswriter, WordStar, and WriteOn—for PC Magazine to write a comparative review. And other programs, such as Microsoft Flight Simulator and Norton Utilities, made their debuts at about the same time.
Several competitors took on DOS, which was called PC-DOS when marketed by IBM and MS-DOS when sold by Microsoft. But the IBM PC closed out 1982 having clearly established itself as a product not only to be catered to but emulated. All of the pieces were coming together for the emergence of industry-standard computing as we now know it.
The race was on.
1983: The IBM Standard Takes Over
Most major software developers were writing for the IBM PC by 1983, and the effect on the competition was staggering. Many companies that focused on earlier, non-IBM hardware and software standards eventually went out of business. It was fast becoming an IBM-centric world, and those who rode the wave won. The holdouts lost, with one exception: Apple Computer.
What helped the IBM standard stay on top was the innovative software development that rallied around it. Perhaps the most important early application for the PC was Lotus 1-2-3, the first in a series of integrated software packages that defined the industry. Combining a spreadsheet with graphics, 1-2-3 was more powerful than VisiCalc, the old standard, which was hobbled by a disagreement between its authors and its publisher over new versions. Lotus 1-2-3 became the killer application for the IBM PC, just as VisiCalc had for the Apple II.
On Lotus's heels was a variety of other new IBM-compatible programs, such as word processors MultiMate and WordPerfect, and programs that made the leap from other platforms, such as Ashton-Tate's dBASE II.
On the hardware side, IBM gained even more attention with the introduction of the IBM PC-XT. At a list price of $4,995, it included 128K of RAM, a 10MB hard drive, a double-sided floppy disk drive, a serial port, and DOS 2.0.
Competitors finally conceded to the DOS standard and began offering compatible machines that were like the IBM PC, "only better." The DEC Rainbow offered compatibility not only with the 8088 but also with Z80 software. The AT&T 6300 and later the Texas Instruments Professional offered better graphics than the IBM PC.
But compatibility didn't come easily, as many computer companies soon learned, and they didn't survive. They simply couldn't run all the software that was written specifically for IBM hardware. The most popular applications, such as Lotus 1-2-3, MultiPlan, SuperCalc, and WordStar, were DOS applications but often were written to circumvent DOS and address the hardware directly so they could run faster. Many non-IBM PCs with differing hardware had trouble running these programs. And the burst of generic DOS programs being written didn't satisfy most users. They wanted real IBM compatibility: a guarantee that they would have a variety of software to choose from.
Despite the many failures among the new machines in 1983, one made significant strides in compatibility: the Compaq Portable. At a list price of $2,995, this AC-only machine included just one double-sided floppy disk drive (a second drive cost $595 more). The Compaq Portable wasn't the first IBM-compatible or the first portable (Osborne Computer had achieved that in the CP/M world long before). It wasn't even the first compatible portable; other companies such as Colby, Columbia Data Systems, and Dynalogic had models that preceded it. But the Compaq unit gained a reputation as the most compatible of the new machines, and its success showed there was a market for portable versions of IBM-compatible systems, even though it weighed a hefty 28 pounds.
As early computer networks began to emerge, it became apparent that computers needed to work together. Thousands of people were using CompuServe and The Source to communicate, and e-mail and local area networks (LANs) were beginning to appear.
The typical PC of the day, however, was still a standalone machine. As we all know, it would be a while before e-mail and networking, especially Ethernet, would truly catch on.
1984: The Shape of Things to Come
By early 1984, IBM had firmly established itself as the corporate standard in computing. The unveiling of new PC models, including the significant IBM PC AT (for Advanced Technology), was one of two events that defined the year. The other was the introduction of a new platform from an old IBM rival that would change computing forever.
Compatible computers had begun to take on a larger role in the computer marketplace. Compaq branched out from portables, introducing its first desktop PC, the Deskpro. And IBM finally introduced the 5155 Portable PC, which resembled the original Compaq Portable. But this wasn't enough to blunt the growing acceptance of PC clones—portables and desktops.
IBM battled back for total dominance by introducing two models that defined opposite ends of the PC spectrum. It spent much of the year touting the IBM PCjr, which had been announced at the end of 1983 as a competitor to the Apple II and Commodore 64, but with an important advantage: It could run PC software. Ranging in price from $700 to about $1,300, this 8088-based home computer was expensive compared with other home PCs but still no match for business computers. It became infamously known for its wireless keyboard with Chiclet-style keys. Great expectations surrounded the machine, but it flopped.
IBM's introduction of the PC AT was a different story. Based on the Intel 80286 processor, the PC AT cost nearly $4,000 and included 256K of RAM but no hard drive or monitor. Models with a 20MB hard drive sold for $6,000.
The PC AT was significant, because it moved the industry to the next processor level while maintaining compatibility with almost all the original PC applications. Also, several important standards debuted with the PC AT. It was the first machine to offer the 16-bit expansion bus, eventually called the ISA (Industry Standard Architecture) bus, which lasted as a standard for more than a decade. The PC AT introduced EGA graphics, which supported 640-by-350 resolution in as many as 16 colors.
At the same time, IBM and Microsoft introduced DOS 3.0, which would remain the standard for many years.
Changes were afoot in other areas of the computing world, too. Hewlett-Packard introduced the first desktop laser printer, an invention that would prove significant further down the road; dot-matrix and daisy wheel printers continued to dominate the market for years.
But the big show-stopping announcement of the year came from Apple. And what better show to stop than the Super Bowl? Apple introduced its groundbreaking Macintosh in a TV commercial that it aired only once—during the 1984 Super Bowl. You may remember it: A young runner sprints through a faceless crowd, then throws a sledgehammer at a screen image of Big Brother, shattering the glass. "Macintosh," the ad said. "So 1984 won't be like 1984."
Both the commercial and the product left an indelible mark on the computing industry. The Apple Macintosh was the first computer to prove that users would accept more than a DOS prompt and character-based interface. The Mac's combination of multiple windows, pull-down menus, and a mouse wasn't a first; the Xerox Alto had similar features years before, and Apple had tried and failed with these features earlier in the Lisa. But all these machines were much more expensive than comparable PCs. The Macintosh wasn't.
The effects of the Macintosh on the future of computing would not be understood, however, for years to come, especially by DOS users.
The Macintosh was appealing, but it wasn't compatible with Apple's existing hardware or software, and it initially didn't have the applications Apple wanted. The Mac wasn't expandable, and it looked a little like a toy. The early versions didn't include hard drives. A typical reaction was shown on a 1985 PC Magazine cover: The Mac is shown saying "hello" on the screen, as it did in early Apple ads. An IBM is shown responding, "get lost."
But this scenario wasn't really the case. Just like the screen image destroyed by that sledgehammer, the image of the DOS-based, IBM-compatible microcomputer was about to be shattered as well. Apple pointed the way to graphical user interfaces, and the PC followed its lead. But it took a circuitous route.
1985-86: The Interface Race
The Apple Macintosh made most of the computer industry realize that the graphical user interface was inevitable, though it wouldn't become a major factor for most people for a long time to come. Nevertheless, it immediately triggered a race among PC companies to create the user interface of the future.
One of the first new attempts was IBM's TopView. Though it was only character-based, the interface allowed multiple programs to run on-screen at once. But TopView ultimately failed because of its compatibility problems; developers often needed to adapt their programs to work on it. And by this time, users had realized that compatibility was the key to "standard" computing, so they stayed away.
Microsoft Windows would later triumph, but in those days, Windows was merely one more contender. Microsoft described Windows 1.0 early on as a window manager and a graphic display interface. It looked like the early version of Microsoft Word for DOS, with a single list of commands at the bottom of the screen instead of several pull-down menus. The windows could not overlap but instead stacked as tiles. Still, the basics were there, including a mouse used for selecting menu items, cut-and-paste capability, and a list of 30 hardware companies that would support it. Most companies that made DOS-compatible hardware were on that list, with one notable exception: IBM.
Windows also required a bitmapped display (which was not yet a standard), 192K of RAM, and two floppy disk drives. No hard drive was required.
By the time Windows finally shipped in 1985, two years after it had been announced, its recommended requirements included a hard drive and 512K of RAM. It had evolved to include pull-down menus as well as the Windows Write and Windows Paint applications. But it still didn't take off, mainly because there weren't many Windows applications available, although the first PC version of the desktop publishing system Aldus PageMaker followed in late 1986.
Other attempts at a graphical user interface included Digital Research's GEM, which became best known as the environment used by desktop publishing system Ventura Publisher. VisiCorp, publisher of the VisiCalc spreadsheet, had one last try with an interface called VisiOn, which didn't gain much support.
Ultimately, the most successful attempt of the era may have come from Quarterdeck, a small start-up based in Santa Monica, California. In 1983, Quarterdeck proposed a complicated integrating environment manager called DESQ. In 1985, the program was reintroduced in a stripped-down version, called DESQview, that was compatible with TopView. Many power users ended up running that for years as their multitasking windowing system.
Interface development wasn't the only area making headway during this period; technology for PC networks and communications was advancing as well. Novell had introduced NetWare in 1983, and it was becoming a corporate standard for PC networks. Competition grew among network designs: Ethernet, which was invented in the late 1970s, was just beginning to receive corporate acceptance. IBM was fighting back with its own unique standard, Token Ring, which came out in late 1985. Slowly, applications that took advantage of networking were becoming more popular. For dial-up networking, 2,400-bps modems were taking off. And PC Magazine made its first foray into the online world, unleashing the Interactive Reader Service bulletin board in July 1985. The response to the bulletin board was overwhelming. In fact, demand for one of PC Magazine's famous utility programs caused a phone shutdown in southern California; readers had dialed in to download the program after reading about it in an issue.
On the hardware front, Intel introduced the 16-MHz 80386 processor in 1985, although it didn't make its way into IBM-compatible systems immediately. Perhaps companies were waiting for IBM to make the first move, but IBM had other plans. So Advanced Logic Research (ALR) and Compaq made the leap by introducing the first 386-based PCs in September 1986.
The leap paid off. Compaq's 386 entry defined a new standard for the industry. Suddenly other computer companies were giving IBM a run for its money. "IBM computing" was no longer the catchphrase of the era; it was "PC compatible." To mark this milestone, PC Magazine changed its tagline from "The Independent Guide to IBM Personal Computers" to "The Independent Guide to IBM-Standard Personal Computing."
1987-1989: IBM Strikes Back
By 1987, with so many computer companies entering the market, compatibility was no longer being questioned: It was assumed.
Compaq led the pack of PC-compatible manufacturers, but many other choices were on the market, including AST and Toshiba. Toshiba popularized laptops with the clamshell design used in today's portable computers. Large companies such as NEC and Texas Instruments also entered the market. So did start-ups that typically sold computers via 800 numbers, such as Northgate, ZEOS, and PCs Limited, a small Texas company started in 1984 by an undergraduate student named Michael Dell. The company eventually would become Dell Computer Corp.
All of these companies were able to flourish, in part because they could buy DOS from Microsoft and chips from Intel, then use standard boards that would plug into the buses IBM had introduced with the PC and PC AT models.
For years, rumors had circulated that IBM had something new up its sleeve. In April 1987, IBM finally unveiled its most ambitious offering to date: a complete line of new computers and a new operating system strategy.
The range of the new line, the IBM Personal System/2, or PS/2, is startling in retrospect. It started with the entry-level Model 30, which had an 8-MHz Intel 8086 processor and two 3.5-inch floppy disk drives. Next were the Models 50 and 60, both with 10-MHz 286 processors. Finally, the Model 80 was IBM's first 386-based machine, offered with a 16-MHz or 20-MHz 386, depending on the configuration.
For many reasons, the PS/2 line was the talk of the industry. (To demonstrate this, PC Magazine playfully changed its logo to PS Magazine for its issue of July 1987.) Instead of the standard EGA graphics, these machines introduced the video graphics array (VGA) with 640-by-480 resolution, which remained the standard for many years. Another standard from the PS/2 was the 3.5-inch floppy disk drive. Macintosh had introduced this drive a few years earlier, but the PS/2 was the first IBM machine to feature it.
The most controversial feature was a new bus, called Micro Channel Architecture, for add-in cards. MCA was faster and easier to configure than the expansion bus introduced by the IBM PC AT, and it improved the ability of multiple cards to run at the same time. But it was not compatible with anything that came before it; IBM wanted to regain the technical lead, even if that meant abandoning hardware compatibility.
IBM offered to license the new bus to other computer manufacturers, but it charged significantly more for MCA than for the previous PC AT design. Not surprisingly, the companies balked.
In 1988, AST, Compaq, and others produced an alternative: the Enhanced Industry Standard Architecture (EISA), which offered a 32-bit-wide data path but was still compatible with older 16-bit PC AT boards. The EISA and Micro Channel expansion buses competed for years, but most computer makers kept selling the ISA bus.
The controversy over Micro Channel was nothing compared with the one triggered by a new operating system, which IBM announced alongside its partner in the project, Microsoft. Named OS/2, the system was a graphical, multitasking environment that would take advantage of the 286 processor.
Predictably, the OS/2 marriage between IBM and Microsoft was troubled from the beginning. The first version of OS/2 to ship had only a character-based interface; the graphical version was delayed. On top of that, IBM and Microsoft seemed to be devising independent schemes with OS/2. Separate from Microsoft, IBM added to OS/2 a series of extensions for database and communications functions and called the package the Extended Edition.
Meanwhile, Microsoft marketed a new version, Windows 2.0, which still ran on top of DOS but was positioned as a transition to OS/2. Windows 2.0 added features such as overlapping windows, the ability to resize windows, and keyboard accelerators, or shortcut keys. It worked only in 8088/8086-compatible real mode, not the 286's more sophisticated protected mode. But it was soon followed by Windows/286 and Windows/386, the latter of which added multitasking capabilities, the ability to run applications in virtual machines, and support for up to 16MB of memory.
Perhaps because of the confusion, neither Windows nor OS/2 gained much traction in the intervening years. Rather, DOS applications continued to reign, just as ISA boards withstood the challenges of EISA and Micro Channel. IBM had tried to regain control over the industry, but instead the PC standard proved it had a life of its own.
1990-1992: Windows Wakes Up
IBM and Microsoft still claimed to be working together in 1990, even as it became increasingly clear that they were positioning themselves more as competitors than partners.
In May of that year, Microsoft announced Windows 3.0 on its own, a system that would capture all the momentum that had been expected for OS/2. Windows 3.0 still ran on top of DOS, so it could run all the DOS applications. But it also took advantage of the 386 processor, so it could run multiple DOS applications and new Windows programs at once. More important, Windows applications were finally beginning to appear in some numbers, notably two of the first Windows word processors: Microsoft Word for Windows and Samna's Ami (which eventually became Lotus Word Pro).
IBM and Microsoft were still describing OS/2 as the eventual operating system of the future. In particular, both were promoting the upcoming OS/2 2.0, which was designed for the 386 processor and added a graphical user interface. But though IBM wanted developers to write for OS/2 immediately, Microsoft urged developers to write for Windows first, so it would be easier to port to OS/2. The growing discomfort between the two sides came to a head in early 1991, when they finally agreed to disagree and parted ways.
OS/2 2.0 appeared in 1992 and developed a strong niche in some large corporate applications. It was seen as the more robust, more stable operating system, and IBM's server version also gained some traction at the time. Later, IBM would make one last attempt to make OS/2 mainstream with its more consumer-friendly OS/2 Warp 3.0, which shipped in late 1994. It would sell millions of copies but not slow down the industry's broad move to Windows.
Windows 3.1 was released in 1992, adding better application integration, drag-and-drop, and greater stability. Through the early 1990s it became the dominant standard for PC applications, and Microsoft took a leadership role in defining multimedia specifications. It even began setting the specifications for future hardware.
Suddenly Microsoft was everywhere. Its Visual Basic and Visual C++ overcame big competition from Borland Software Corp. to dominate programming languages. And Microsoft's applications—led by its Office suite of Word, Excel, PowerPoint, and later Access—took the lion's share of the market, defeating Lotus SmartSuite and Borland Office, which continues today as Corel WordPerfect Office.
Microsoft had begun talking about its New Technology (NT) code base (originally known as OS/2 3.0, not to be confused with IBM's later product of the same name). At the time, NT was being promoted as an operating system that could run OS/2, POSIX, and Windows applications, while adding multiprocessor support and the ability to run on platforms such as Alpha, PowerPC, and MIPS, not just the Intel-compatible family.
By the time NT shipped, it didn't support graphical OS/2 applications. Though it initially supported multiple processors, its market was clearly for servers and workstations running Intel and Intel-compatible processors.
Meanwhile, quietly in the background, the Apple Macintosh line continued to grow and expand and found niches in graphic arts, multimedia, and education. But in most corporate and government offices, the primary business system was one that followed the standards of the original PC. Service and support became of prime importance to PC users, so much so that PC Magazine ran its first Service and Reliability issue in September 1990.
By then the term "IBM compatible" had fallen out of favor; it was the processor number that became the primary descriptor of hardware. From 1990 to 1992, that processor number was 486. With 1.2 million transistors, the Intel 486 was effectively a faster, more refined version of the 386, but with an integrated math coprocessor. And it ran all of the applications written for the 386 without a hitch.
This time around, no one waited for IBM or Compaq to go first. Dozens of companies raced to have 486 machines available, and these machines could run certain CPU-intensive applications 50 times as fast as the original IBM PC. The same thing happened when Intel introduced the powerful Pentium processor in 1993. By that point, virtually all of these machines were designed to run Windows.
1993-1994: Everyone Online
Though new operating systems and processors received most of the market's attention in the early nineties, the biggest seeds of change were in the area of networking.
Networks had been around for years, including online services such as CompuServe and The Source. By the early nineties, America Online, CompuServe, and Prodigy were competing to gain customers. America Online promoted ease of use, and CompuServe promoted a more technically detailed set of discussion boards, including PC Magazine's PC MagNet (later ZiffNet). These online services were proprietary and had unique content and unique communications programs.
Change was coming from an unexpected source, however: the Internet. At that time, the Net was really just a collection of different discrete networks linked by a standard communications protocol called TCP/IP. The Internet dates back to 1969, when the first two remote computers were connected to create the nascent ARPAnet, funded by the U.S. Defense Advanced Research Projects Agency (DARPA). By 1972, that network had been used to send electronic mail from one place to another. As the Internet continued to grow, the government passed control of it to the individual sites and technical committees.
All of the online services were working on ways to exchange e-mail using the then-fledgling Internet. Meanwhile, more people were using the Internet, primarily through services such as Gopher (for searching for information) and FTP (for file transfers).
In 1990, Tim Berners-Lee, a British researcher at the CERN particle physics lab in Switzerland, created a program that he hoped would allow researchers to work together on a project, post information about that work, and link to fellow researchers' posted information with ease. He called his program WorldWideWeb. As part of this program, Berners-Lee created most of the Internet standards we now take for granted, including the first HyperText Markup Language (HTML), the Hypertext Transfer Protocol (HTTP), and the Uniform Resource Locator (URL), the familiar Internet address.
All of this is important in retrospect, but it didn't get much attention at the time outside a small circle of researchers. An invention by a group of graduate students changed all of that. In the fall of 1993, students from the National Center for Supercomputing Applications (NCSA) at the University of Illinois created the first graphical Web browser, which let everyday people browse the World Wide Web with ease. Led by Marc Andreessen and Eric Bina, the students called their browser Mosaic.
In 1994, Web browsers moved to the next level when Andreessen and his team received funding from Silicon Graphics founder Jim Clark to create what eventually became Netscape Communications Corp. Netscape Navigator added many new features, including support for plug-ins, which in turn led to multimedia extensions for everything from streaming music to Flash animations. Eventually Netscape Navigator was able to run applications created in Sun Microsystems' portable Java environment, making it almost an operating environment unto itself.
Suddenly everyone seemed to be getting on the Web. PC Magazine urged readers to "make the Internet connection" throughout 1994.
In the business world, corporate networks began to proliferate and grow more powerful. The Compaq Systempro, which was introduced in 1989, had paved the way for a generation of industry-standard servers. Novell NetWare battled upstarts such as IBM's LAN Server and Microsoft LAN Manager (and later NT 3.1) for network operating system prominence. Groupware applications proliferated, including Lotus Notes, which was introduced in 1989. This trend gained speed after IBM acquired Lotus in 1995.
Of course, traditional PC hardware and operating systems continued to evolve. By the time Intel introduced its 60-MHz Pentium processor in March of 1993, processor speed had become the biggest element used to differentiate hardware. Intel began to face more competition in the x86-compatible arena, as AMD and Cyrix began to issue their own chip designs.
Hard drives continued to get bigger and faster. Graphics display technology progressed with graphics accelerators, which worked directly with Windows to increase screen response times and to enhance all graphics. And two competing bus standards, first VESA and then PCI, gave computers peripheral buses that were 32 bits wide and ran at 33 MHz, compared with ISA's 16 bits and 8 MHz.
But the changes in software and connections were what had most people talking. That revolution had just begun.
1995: Windows 95 and Multimedia
By 1995, 32-bit processors had been a major part of the PC landscape for some time. The Motorola 68000 that powered the original Macintosh back in 1984 was a 32-bit processor, and Intel had moved the PC-compatible world into the 32-bit era with the introduction of the 386 in 1985.
But ten years later, most PC users were still running 16-bit applications such as DOS, OS/2, and Windows. It was high time a 32-bit operating system became the standard.
Windows 95 proved to be the operating system that would fill that need. IBM and Microsoft had independently talked about 32-bit OSs for years. Indeed, at one point, Microsoft had promised one would be out by 1992. Instead, the industry kept waiting for the system, code-named Chicago, for years. IBM tried to use the delays to push its latest version of OS/2, which it called Warp, but without much success in gaining the critical mass of applications a new operating system needs.
Following what may have been the largest, most successful marketing campaign in the PC business, Microsoft finally introduced Windows 95 in August 1995. It quickly became the standard for end user computing. Windows 95 allowed for full 32-bit applications and supported Plug and Play, preemptive multitasking, and a variety of new e-mail and communications protocols. It also included the basics of the user interface Windows uses to this day, such as a Start menu and an Explorer window with folders and icons. PC Magazine did extensive testing of Windows 95 and OS/2 Warp, because they were the two main systems to compare. From 1996 on, however, the comparisons to be made were between new versions of Windows and old ones.
Windows 95 changed the way many people viewed operating systems; they realized OSs weren't about loading individual applications but about accessing data easily, a concept Microsoft's Bill Gates had been talking about since 1990.
Microsoft would follow Windows 95 less than a year later with Windows NT 4.0, which incorporated the same user interface and ran most of the same applications using the Win32 programming interfaces. Corporate IT managers embraced Windows NT 4.0's more stable design. Together these two systems became the standards for computing for the rest of the nineties.
Other companies had similar ideas in this period. In the early nineties, Apple and IBM were supporting a joint venture called Taligent. This project resulted in a fairly extensive series of frameworks for developers to use in creating object-based applications, but it proved too complex for typical developers. Steve Jobs's NeXT Computer was promoting an object-oriented OS aimed at corporate customers. New systems such as Java and Linux were beginning to get some attention, but their success would be in the future. Instead, Microsoft Windows would solidify its dominance of the computing world.
That world was also becoming more engaging as multimedia made its way into everyday computing. Almost from their beginning, computers could play audio files and display graphics, but the early capabilities were limited. By the early nineties, PCs included decent sound systems and graphics. All this was helped along tremendously by the appearance of the CD-ROM drive.
With a CD-ROM drive, developers could create programs with up to 660MB of sound and video clips. Initial applications included Compton's encyclopedias and soon after, Microsoft Encarta, as well as a series of CD-ROMs that were somewhat the equivalent of coffee-table books, such as Microsoft Musical Instruments.
In the processor world, speed continued to increase while prices declined. AMD and, for a while, Cyrix, emerged with chips that finally began to compete with Intel's. Graphics chips got faster as well, with companies like ATI Technologies, Matrox Graphics, and S3 pushing the envelope of delivering more frames per second. The combination of faster processors, the Win32 standard, and CD-ROMs created a growth spurt in PC gaming, led by titles such as Doom, Myst, and Quake, which offered much more realistic graphics and game play than their predecessors.
The business world was slower to adopt multimedia. Still, things such as Microsoft PowerPoint presentations became commonplace, and audio speakers, long shunned by IT departments as not businesslike, became a standard part of the PC platform.
PC Magazine devoted several covers to the long-anticipated Windows 95.
1996-99: The Dot-Com Revolution
Personal computing moved into high gear in the late nineties, fueled by the unstoppable Internet and World Wide Web.
A few early companies anticipated the growth and came up with Web sites that grew at amazing rates. A New York hedge fund executive named Jeff Bezos had an idea for a site that would sell books while carrying no inventory; everything would ship directly from distributors. This, he reasoned, would give him a larger selection of books than any physical bookstore and would allow him to sell at a discount. The result was Amazon.com.
Two Stanford graduate students, David Filo and Jerry Yang, decided there were so many sites that the Web needed a directory to organize them. Their solution was Yahoo!, the first major portal.
Pierre Omidyar based his Web site idea on his girlfriend's hobby of collecting PEZ dispensers. People could buy and sell their collectibles and other goods at the site, which became eBay.
And the list goes on. Suddenly every company needed an electronic storefront. And for a while, it seemed as though any college student with an idea could get funding for a dot-com, go public, and make millions. The stock market moved up at a breathtaking pace. People began talking about a "new economy" based on startups and fast-moving companies.
America Online wanted a piece of the Internet action. It quickly changed its model from a proprietary service to an Internet service provider, while keeping many of its features, such as private chat rooms and an easier user interface.
Just about every hardware manufacturer adapted its products for the Web. Cisco Systems, a long-time maker of networking equipment, became the dominant supplier of routers, switches, and the other gear that made the Internet work.
Sun Microsystems, originally a maker of engineering workstations, grew to become one of the largest providers of server hardware for big Web sites. It also put its muscle behind Java, a variant of C++ that a team of Sun developers had designed for use in set-top boxes. The Java language was later used as the foundation for a virtual machine that allowed Web developers to write Java applets, which ran within browsers. For a long time, Java was promoted as a method for writing applications that would work on any operating system, and it eventually became particularly popular in J2EE, a version that worked on large servers.
Several companies, notably BEA Systems and IBM (through its WebSphere product), provided application servers that became the heart of many companies' e-business initiatives.
Big database providers, such as Oracle, also provided a key part of the Web infrastructure, as most big Web sites became dependent on large amounts of data.
Microsoft, which had been focusing on Windows instead of the Web, suddenly did an about-face. In an early 1996 issue, PC Magazine devoted its cover and entire First Looks section to the Web war between Microsoft Internet Explorer and Netscape Navigator. Microsoft's bundling of Internet Explorer with Windows 98 and later operating systems led to a long-running antitrust lawsuit brought by the US Department of Justice. Microsoft also began to enter other parts of the Internet market with its MSN Web sites and online service, which became competitors of AOL.
The biggest threat to Microsoft's dominance appeared to come from all the small companies that had materialized on the Internet. In the early 1990s, a college student in Finland named Linus Torvalds was working on a small, lightweight Unix-like operating system called Linux, which would run on Intel-platform machines. As the Internet took off, many Web sites chose to run on Linux or other open-source variations—instead of Windows NT—in part because they were less expensive or free, but also because developers had access to the code and could specify which portions they wanted to run. Many small Web sites used an open-source Web server called Apache.
It was an era with a lot of growth and experimentation. Everyone was talking about Internet appliances, broadband connections, and new Web sites. By the end of 1999, half of all Americans were connected to the Internet, and PC Magazine marked the occasion by changing its tag line once again: "The Independent Guide to Personal Computing and the Internet."
2000-2002: Back to Basics
As a new decade began, a new sense of realism entered the computer industry. The last few years of the nineties saw much work to solve the Y2K problem, the fear that many computers wouldn't work properly when the year rolled over from 1999 to 2000. But January 1, 2000, came and went without much of a glitch.
The glitch—and a colossal one at that—would come later. Between 2000 and 2001, Wall Street realized that companies needed business models that made profits; just being a Web site was not enough. The dot-com boom crumbled into a dot-com bust, as many high-flying Internet companies ran into trouble and closed down. Working for a dot-com had suddenly lost its cachet, and would-be millionaires were suddenly pounding the pavement in search of their next jobs.
The economy is still licking its wounds, yet the online world continues to encourage us. A number of large pure-play Internet companies have survived and prospered, and many traditional businesses turned into successful clicks-and-mortar businesses with both online and physical storefronts.
But every year, still more people go online, and their hours online are projected to increase as broadband connections become more commonplace. Internet technologies have created e-business processes, with tools such as customer relationship management (CRM) and supply chain management becoming more integrated with business, particularly for larger firms.
Of course, computer hardware continues to get faster, more powerful, and less expensive every year. On the chip front, for standard desktop applications, the AMD Athlon and Athlon XP chips have given the Intel Pentium 4 line a run for its money. Both companies continue to race to produce new chips, with Intel breaking the 2-GHz barrier in late 2001.
Though 32-bit computing has dominated the PC landscape for well over a decade, we're beginning to see signs that a 64-bit transition is coming. Intel released its first 64-bit PC chip, the Itanium, in 2001, and over the next few months, we expect to see new versions of it as well as the AMD Hammer chip family.
In computer software, Microsoft's release of Windows 2000 provided a more stable operating system. In October 2001, the company released Windows XP, which essentially brought the Windows 2000 core to the consumer space, finally unifying the Windows 95, 98, NT 4.0, and 2000 lines.
The PC business has continued to consolidate. Dell has come to dominate direct sales of PCs and in recent years has become by far the most profitable PC manufacturer, while Compaq, Hewlett-Packard, and Sony have come to dominate retail sales. IBM remains in the industry it created but has exited the retail business, selling instead mainly through its direct sales force and online. Last year, Compaq and HP announced plans to merge, further consolidating the market. But there are still many machines sold by smaller companies or put together by local resellers.
Apple continues to innovate, of course, remaining the one big personal computing company to shun the Windows standard and focus instead on its own operating system and designs. Apple has stood out recently for its unique industrial design, ranging from the colorful iMac line to the sleek Titanium notebooks.
Because office applications have become second nature, the PC industry has shifted its focus to Internet access, new entertainment options, Web-based services, and the integration of many new peripherals and applications.
Personal digital assistants (PDAs) are versatile and powerful, and they're beginning to integrate things such as wireless data. Digital cameras and camcorders have become a hot market, as have portable music devices. The technology market has broadened so far beyond PCs these past two years that in February of 2000, PC Magazine's new tagline served as a sign of the times: "The Independent Guide to Technology."
Indeed, over the past 20 years, the PC has fascinated us and constantly surprised us as it has evolved. It began as a standalone device, performing basic functions; now it connects to other computers via the Internet and has become the heart of both business applications and the digital transformation of everything from photography to video to music.
It has enriched our business lives and our personal lives. We just can't wait to see what will happen next.
Above material Copyright (c) 2001 Ziff Davis Media Inc. All Rights Reserved.