View From the Top
September 4, 2001
By Michael J. Miller

In the 20 years since IBM introduced the PC, many technology leaders have contributed greatly to the success of personal computing. But a few have risen above the pack and changed the course of history through vision, invention, and innovation.

Initially, the face of the IBM PC was maverick team leader Philip "Don" Estridge. He convinced his bosses to organize the small team that created the Acorn, the de minimis code name for the IBM PC (a machine that was expected to sell a mere 240,000 units over five years). But by the time Estridge died in a plane crash just a few years later, millions of IBM PCs had already been sold.

The machine's prominence created multi-billion-dollar markets for two companies, Microsoft and Intel (the duo known as "Wintel"), both of which continue to drive the development of PC software and hardware.

In a 1982 interview, Bill Gates told PC Magazine that when IBM first met with him in July 1980 to talk about Microsoft's tentative role in the "IBM project," IBM executives said, "Don't get too excited, and don't think anything big is going to happen." But they were obviously wrong. Microsoft became the first PC software company—one that eventually would dominate the operating-system market for desktop and laptop PCs and set the bar for how we interact with them. Its Office productivity line has taken control of the word processing, spreadsheet, and presentation markets; the company is also a central player in online services, programming, databases, and gaming.

Meanwhile, under the direction of CEO Craig Barrett, Intel provides the majority of processors for today's PCs, delivering power far beyond what the original chip offered. From its 4.77-MHz 8088 beginning to the Pentium 4, with clock speeds up to 1.7 GHz, Intel continues to stretch the limits of technology. Odds are, within another 18 months we'll see machines with clock speeds almost 1,000 times that of the original PC.

And if anyone embodies the anti-Wintel computing philosophy, it's Scott McNealy, the longtime chairman and CEO of Sun Microsystems. McNealy, along with Bill Joy, Andreas Bechtolsheim, and Vinod Khosla, founded Sun (which stands for Stanford University Network) in 1982. Their goal was to redefine the workstation, from a high-end, highly specialized machine suited to complex scientific and engineering applications to a less expensive box that could run mainstream applications. Along the way, Sun also developed its own OS (a version of Unix called Solaris), Java, and the Sun ONE software platform.

We used the 20th anniversary of the PC as an opportunity to catch up with Gates, McNealy, and Barrett, giving them a chance to look back at what's been accomplished—and what is yet to come.


Michael J. Miller and Bill Gates: Uncut
September 4, 2001
By Michael J. Miller

Online Extra

From the time he started programming at age 13 through the many years he's spent outwitting competitors, William H. Gates III, chairman and chief software architect of Microsoft Corp., has never wavered from his guiding belief: The personal computer improves our lives and belongs on every desktop and in every home.

With a strong desire to exploit the full potential of the Net and to simplify the computing experience, Gates foresees continued, unlimited possibilities for the PC. To that end, Microsoft invests more than $4 billion on research and development annually. "People are going to understand the idea of the individual machine and the empowerment that comes with it," says Gates. "We are not even close to finishing the basic dream of what the PC can be."

Michael J. Miller: In retrospect, what have been the most significant milestones since the introduction of the PC?

William H. Gates III: Until we get to the Internet in the nineties, there are three key milestones: the start of personal computing (with the IBM PC), the graphical user interface, and the move to the 16-bit processor. When the PC was launched, people knew it was important because it was [from] IBM. It's not easy to remember, but IBM was the computer industry when I was growing up. You loved 'em. You hated 'em. You knew what they were doing. They had set a standard for mainframes. They also set a standard for great sales focus and heavy product R & D.

Before the IBM PC came out, a lot of the small computer companies were saying, "Okay, this is the overthrow of IBM." In fact, there's a famous Ted Nelson speech from one of the West Coast computer fairs where he talked about this great struggle between large-organization, central computing—that's IBM—and all the individuals like us with the freedom to communicate and do new things. Well, actually I agree with that completely, but the fact is that IBM wasn't necessarily just associated with the big central computers. They proved this through a very unusual series of events, which the first issue [of PC Magazine] documented quite well. There was an interview with me where we talked about this incredible history of how the PC really grew out of an IBM project. The project was for IBM to prove it could do something quickly, rather than the product of some deep and enduring belief in the PC.

The project spec [for the PC] was supposed to be far lower-end than what we and the IBM team made it to be. We weren't supposed to put a disk on it; but later, of course, the project overlapped gigantic IBM divisions, including both the word-processing division and the small-business computer division.

People were stunned that the guy who was running the PC effort, Don Estridge, was in charge of those other two divisions. His product charter became so broad that his group, even though it was small, and its approach—openness, fast movement, using third parties like Intel and Microsoft, and the retail channel—took over the DisplayWriter [word processor] and all of the very low-end small-business things.

The IBM PC was a huge milestone, and it included a model where the hardware competes separately from the software. The fundamental advantage of that model—and what it's meant for the PC industry—is still not widely recognized; people don't realize how critical it was. We now have an industry that passes along price decreases, performance improvements, and software, which makes the PC even more valuable.

Another significant factor, though, was the idea that machines could be made truly compatible with each other. People may not remember, but prior to the IBM PC, you couldn't run the same software on different machines. Even in the first two or three years after the IBM PC was out, 100-percent compatibility didn't exist; there were so-called "degrees" of compatibility. Even the idea that the disk format should be the same was a breakthrough. There was one company, Lifeboat, whose sole function was to offer software for the 15 different disk formats that PCs had at the time. With the IBM PC, hardware companies could say, "Hey, make the same thing. Make it compatible." Out of this came a form of competition—in terms of performance and price—that really drove the industry.

Perhaps more important, we could go to software companies and say, "Look, the volume opportunity of targeting this one design is incredibly high." And so you could afford to write more software than you did before. People don't realize there was hardly a software industry before the IBM PC came around. There was a little bit of mainframe software and a little bit of personal-computer software, but these industries weren't a tenth of the size that the software industry is today.

The competition to create great software for the PC meant that in the mainstream categories, you could put unbelievable R & D into word processing, spreadsheets, and so on. And categories that wouldn't have been economical before—for instance, software for dentists—all of a sudden became economical, as did educational and gaming software. It really is a key point in the history of the industry.

Michael: And what about the Internet revolution?

Bill: I think there certainly was a milestone in the nineties with regards to the Internet achieving critical mass. There were several magical factors that came together: the creation of HTML by Tim Berners-Lee, the drop in the price of communications, and all the PCs out there that you could put this software into. We in the industry had been talking for decades about getting all these things connected. And we were actually kind of depressed that it didn't [immediately] flower into this big thing. It happened a little bit inside companies, with e-mail and file sharing, but it just wasn't happening between companies.

Even the online services were fairly small at the time. The Internet concept—which, except for HTML, had been there for over 20 years—essentially came out of the university environment. It was funded by DARPA, and became the standard. All of a sudden, you had one of those critical-mass events where everybody has to have a Web site. It created a new way of information sharing and a new way of thinking about applications. There's a fundamental milestone there, and people say it happened in 1994, 1995, or 1996. Somewhere in that time period, the magic happened, and that's enduring.

Every one of these revolutions builds on the one before, so we still have the benefit of the graphical user interface and the hardware compatibility model. Now we have the Internet.

I would claim that we're on the verge of another milestone: one that involves thinking of an online network as a programmable network. Instead of just talking to a browser, you have programs talking to programs, whether it's server-to-server or server-to-client. Part of this is peer-to-peer, but that's just one part of it. Another key part is XML, in which you can have data exchanged between systems that weren't designed by a single team. E-commerce is sort of the prototypical example, where you can buy from people without even seeing them. You can find a seller, negotiate complex issues, and do a transaction. So I'd say there's an Internet milestone in the mid-nineties and the XML/Web services milestone in 2001.

Michael: Over the past five years, people have been talking about devices that will replace the PC or serve as adjuncts to the PC. Certainly the growth in PC sales (particularly in the United States) has slowed considerably. What do you think of these other devices? And what kind of role will the PC—as a device—play in the next 20 years?

Bill: There's a perfectly healthy debate that probably will never end about the role of these different devices. The full-screen, general-purpose device—that's the PC in my view—will always play a central role. When you want to do your homework, fill out your tax return, or see all the choices for a trip you want to take, you need a full-size screen. For a variety of tasks—budgeting, new types of interactive music, videoconferencing, simulating things, playing games—the general-purpose nature of the device is extremely important. And so the fad of the last four years is to say these other devices are more important. To me, they're just complementary devices.

Sure, you want a device that fits in your pocket. And actually, Microsoft has invested more money in non-PC devices than any other company. Everybody who owns a PC will certainly own a mobile phone, and maybe they'll own more than one, but the tasks are quite different. You'd plan a trip on a PC, and then you might want to be notified of a flight change or other problem wherever you are; for that, you have a phone/PDA. You'd like to be notified, you'd like to be able to make a change, and you'd like your information to show up on both of those things. So there are some exciting new device categories.

However, we need commonality across all these devices. As an industry standard, XML should help with that. For Microsoft, we call it our .NET strategy, which embraces all these different devices using XML. But when you want to be creative, when you want to communicate, when you want to be entertained in the best way, you're going to want that full-screen PC device.

I think there is an irony that people are largely missing because of the dot-com bust, the economy not being that strong, and certain PC breakthroughs not yet having taken place this year. They're missing the fact that it's really within this decade that the dreams we've had will come true.

The one idea that I think is the most radical and most underestimated is the idea of the PC as a reading device. That takes a full screen. Yes, some people read off of their Palms and Pocket PCs, but the real immersive reading experience takes a full-screen device, and one with better resolution than we have today. You have to be able to hold it so that you can move your neck; reading from a fixed position, no matter how good the resolution is, there's just a limit to how much of that people are willing to do. I think that in this decade, the tablet-PC form factor will take off, which will depend on some great advances in the electronics, wireless, and software aspects that deal with handwriting—a lot of things that we and others have been working on for a long, long time.

Then people are going to understand this idea of the individual machine and the empowerment that comes with it. We are not even close to finishing the basic dream of what the PC can be.

Michael: When you look back, what applications did you think would take off by now, but really haven't? Voice recognition, handwriting recognition? What things haven't happened, and why? And do you think that's going to change?

Bill: People are a little skeptical about all the natural-interaction stuff, because there have been predictions that we were on the verge of speech and handwriting recognition many, many times. It's appropriate that people are skeptical. But business is so overwhelmingly nondigital, even businesses that are heralded as being Internet-centric. Believe me, there's a guy on a phone—somebody with a tablet and a pencil—saying, "Oh, okay, we've got a problem," and trying to deal with it. Videoconferencing is a great example. At the 1964 World's Fair, AT&T showed the wave of the future with the videophone.

The funny thing about me is that I actually believe that all of these things will happen. And I'd go so far as to say that it's really within this decade that we as an industry will finally get every one of these things to work—and work well. And I don't mean just for a few people; I mean on an extremely mainstream basis. Take the idea of recording a meeting digitally so that somebody can go back, search it, and play back snippets. The idea is that if you're in a meeting, you can take your notes there. I think the pieces will come together in this decade, and that's why we think of it as the digital decade.

Michael: How often do you use videoconferencing?

Bill: I do videoconferencing about three or four times a year.

Michael: Why so infrequently? The technology is here, to some extent.

Bill: Well, there're all these style changes that have to come about. You'd have it so that there were, say, three or four hours a week that people would reserve for doing quick videoconferencing. And it would have to be very easy to indicate whom you'd like to talk to, because if the overhead is super high, then you might as well just have a face-to-face meeting. And take electronic scheduling. We are really far away from the day when people will primarily use the computer to do their scheduling, because you've got your family schedule, work schedule, and the whole notion of who should be able to view what. It just hasn't come together and achieved critical mass.

A few things have. E-mail has achieved critical mass. Instant messaging—within a certain age group—has achieved critical mass. The digital camera attached to the PC is very cheap now. The microphone is actually more problematic. PCs don't come with good microphones, and it's not natural yet to use voice with a PC. So take simple things like voice annotation, which doesn't require brilliant recognition. When's the last time you got a document from somebody in which the annotations were done by voice? Because voice annotation is not mainstream, software does not support it as well as it should. Take e-mail: If you're responding with your PDA, wouldn't it be easier to respond with a voice note? But that's not mainstream yet. Part of the wonder of our business is that these things are ridiculous, stupid dreams, and then literally in a period of months, they become common; well, they're common at least among a certain avant-garde set of companies.

Voice messaging, digital scheduling—the software isn't good enough yet. And for us, it's nice to be involved in the element that's one of the key missing pieces. I'd say other than software advances, the only thing that I'm a little worried about in the years ahead is whether the cost of consumer broadband will come down enough so that we can really think of it as mainstream. Recently, the price of cable modems was increased from a level at which a lot of households could afford it, and now it costs somewhat more.

Michael: The speed of communications has been increasing, and the speed of processing has been improving. What else has to get better on the fundamental technology level?

Bill: Lots of things will get better. The size of the disk—those guys do their job very well; disks will get bigger. We actually need this new class of device called MEMS, or microelectromechanical systems. Those things are super cool. There's still some work to be done on them, but they could essentially provide zero boot time, get devices down to the size of the tablet PC, or even put full PC-like capabilities on a PDA. Anyway, we need something like that. Hopefully it will be MEMS, because they could be out in the next couple of years.

We need network speeds to keep going up. There isn't much doubt about that. We need screen resolutions to be much higher. We need the cost of flat-panel color screens to be dramatically less, no matter which technology is used. Every kid in school should have a color tablet PC connected to a wireless network, and the price of that device should be well under a thousand dollars. What's the biggest thing holding that back? The screen cost is a big part of it. That's the gap between a desktop PC and a portable PC, but it's certainly going to happen.

We need processor speeds to keep going up. We'll also have to go through a transition from 32-bit to 64-bit, which is complex in terms of getting all the pieces together. This transition will start on the server but eventually be relevant even on the desktop.

Michael: How long is that transition going to take?

Bill: On the desktop, 64-bit is not critical for most applications. It's going to be a long time before 64-bit is pervasive on the desktop. AMD has an approach in which they put 64-bit into their mainstream processor, whereas Intel has taken the opportunity to switch to a different instruction set. Nobody would dispute that a very high percentage of servers, say, four years from now, will be 64-bit. But the percentage on the desktop, even four years from now? I'm not certain about that.

We've certainly done all the work for the 64-bit version of Windows, and it's not that hard to recompile the software. In some ways it's easier than 16-bit to 32-bit was, because then the (memory) addressing was different. But that's a long story. There are actually three different ways of doing addressing: the 8088 way, which is a 4-bit shift and add; the 286 way, which is an indirect segment register; and the nice 386 way, which is a linear address space with page tables. Anyway, 64-bit is just an extension of the 386 way. I don't think users will get caught in that much pain on this one. It's not like the 16-bit thing, where people really were banging up against the address space, and they got into this whole thing about extended memory and all these different schemes for trying to deal with the limits.
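(Editors' note: For readers who never wrestled with segmented memory, the sketch below in C illustrates the real-mode arithmetic Gates describes; the 286 and 386 schemes are summarized only in comments, and the names are ours, for illustration.)

```c
#include <inttypes.h>
#include <stdio.h>

/* 8088 real mode: the 16-bit segment register is shifted left 4 bits
   and added to the 16-bit offset, giving a 20-bit physical address
   (the "4-bit shift and add").
   286 protected mode: the segment register instead indexes a
   descriptor table, which supplies the segment's base address.
   386 and later: a flat linear address translated through page
   tables; 64-bit mode extends this scheme rather than reviving
   segmentation. */
static uint32_t real_mode_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + (uint32_t)offset;
}

int main(void)
{
    /* Classic example: F000:FFF0 is physical address FFFF0,
       the 8086/8088 reset vector. */
    printf("%05" PRIX32 "\n", real_mode_address(0xF000, 0xFFF0));
    return 0;
}
```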

Here we'll get through the transition before a lot of people are sticking 4 gigabytes of physical memory on their PCs. I don't think we're going to be held back by anything except the lack of consumer broadband. It's true that the 3G wireless networks are going to be more expensive and slower to roll out than people think. But there is 802.11 for the places you really spend time: home, work, convention centers, hotels, and airports, among other places. There are still some pieces to put together before wireless is really pervasive, and there are even certain interference issues. But I think all that stuff will be solved. And in so-called "hotspots," you'll have very good and extremely inexpensive digital wireless.

I think a lot of the onus sits on Microsoft and other software companies to make it so you're not reluctant to install an additional software application. Your overall experience can be simpler, and if there's ever a problem, the confusion should be greatly reduced. We have to use software technology and the Internet to do those things. One great start is what we do in Windows XP with support, where you can have somebody else connect to your machine, take control, and help you out that way, as opposed to sitting on the phone. That's just one step, but it's a valuable step.

Michael: It's a valuable step, but isn't the real problem that when people install a new application, it causes something else to stop working?

Bill: Yes, but let's say we could do the following: Instead of millions of people running into the same problem, after the first few hundred ran into it, you could update the pieces involved so that no additional people had the problem. We need to use the Internet to complete the feedback loop. We need to see the problems and see them early. It's almost the way people think of virus detection, where the antivirus makers feed the bits back out to fix problems.

Also, you have to rebuild constantly in computing. We had to rebuild Windows essentially from scratch to get the reliable kernel. And over the next couple of years, we'll be going through a transition in which the older kernel (the 9x kernel of Windows 95, 98, and Me) will not be used on new machines. So we'll build on the more reliable kernel.

Web browsing grew up separately from file browsing, which in turn grew up separately from e-mail browsing, which is separate from database browsing. So there are all these different concepts and utilities that relate to concepts on the PC. Clearly, sometime in the next decade (earlier rather than later), we need to take the radical step of saying, "Hey, we have too many concepts for people to learn." We've created enough user-interface constructs that we need to simplify and unify.

The dynamic since Windows first came along has been mostly to add things—the toolbar, the Word task-pane concept, the assistant. Those are really good things, and it's easy to add things in. You do get some overlap, and you do get a large number of concepts. The screen gets full at some point, because there are only four sides to it. I think different concepts have used all four sides now: task bar here, menu bar there. Some Word pane stuff over here. Some stuff on the right side. Eventually, we have to declare a new style guide—a radically simplified style guide—and we have to make it sit on top of a piece of storage technology that unifies the different concepts. This is something Microsoft has been thinking and talking about for many years.

Michael: Right. You talked about it back when the first version of NT came out.

Bill: That's true. It was a project called Cairo. I'd say its importance is even clearer now, given database technologies. The way XML is being designed into the database is really letting us realize this integrated storage vision. So that's one nice thing about Microsoft: We can afford to invest in things way before they get to be mainstream. I mean, we've been investing in digital TV for more than a decade, and we have very little to show for it. But believe me, when the day comes, we'll have a lot of software ready. It's like the tablet and handwriting recognition. Unified storage—we're going to take a swing at that, too. We're getting the company lined up to make it a bet, just as Windows was a gamble.

Michael: In the last 20 years, there have been major changes in the way people compute, between CP/M or Apple and DOS, between DOS and Windows, between Windows 3x and 95. Lately the changes seem to be more incremental, at least from a user-interface point of view. And there are so many more users out there. Will we need another radical change?

Bill: Well, three things argue for radical change: the natural input technologies—speech and handwriting; the use of multimedia in the display of information—audio, video, and you could say 3-D fits in there; and a desire to simplify—to take things that grew up separately and pull them together into a common concept. Making a radical change is a big deal.

There's really been only one radical change, which was the change from the character-mode interface to the graphical interface. The user interface of DOS was a lot like big-machine time-sharing. You can look at the PDP-10 operating system and see that many of us—including Gary Kildall (who did CP/M), Paul Allen, and myself—clearly used the PDP-10 a lot and got that flavor into the whole command interface. So really, the graphics interface is the only time something radical happened.

Like any historical event, people forget how painful that was. We actually brought the GUI along at a point where it overtaxed both the memory and the processor speed. We tried to make Windows work on a 286 and even the early 386. All this incremental stuff came into play. For example, scalable fonts took a lot of work. People now take it for granted that fonts look good and scale to any size. There was even some controversy about the two font formats. Everything worked out in the end.

Some of those incremental things are a very big deal. One thing that's strange today is that we've got a browser where we just click on links, and we've got productivity software in which we get to create things and do things. And there's quite a boundary between them. Bringing together the neat things we like about the Internet and the things we like about creativity and authoring makes a very interesting synthesis. And at Microsoft, we have lots of people who get to work on incubation projects that relate to radical user-interface changes.

I honestly believe that in the next three or four years, we will propose a radical user-interface change that draws on those different elements I talked about. Within the company, I'm known as a proponent of this. Many people say, "What's wrong with evolutionary?" These debates are all very fruitful, because there are good arguments on both sides.

Remember all the different things that came and went? There were so many things, like when everybody said everything had to be object-oriented. So there was this thing called HP NewWave, and people were saying that Windows wasn't object-oriented enough and it would just go away. They said this because the NewWave object-oriented approach was really big. The dead ends are not remembered. When the NeXT computer came along, people were saying, "Oh my God, this NeXT thing, it's just optical disks and gray scale." Some people were drawn in, but not everyone.

Michael: Well, the basic technologies are things that we've taken for granted: optical disks, CD-R, and others; we all use CDs.

Bill: No, we don't use optical disks for read/write, because they're too slow. They were too slow when the NeXT machine came out, and they're too slow today. These magnetic things—hard disks—are unbelievable. What they've done in terms of capacity and data rate is unbelievable. But they have problems. For example, startup time, or the idea that you want to walk up to your PC and have the information be there. We've got to keep the disk spinning to do that, but disk spin requires power. That's the one complaint we have about those guys—it just isn't zero startup time.

Michael: Technology has advanced to a point where databases can keep records of whatever you're doing and what other people are doing on your site. You have talked about occasionally recording meetings in the future, and so forth. I speak with a lot of people who have concerns about privacy and security—the Big Brother aspect.

Bill: This whole privacy thing is very complex. The guy who processes your credit card knows a lot about your behavior. Your checking account also contains a lot about your behavior. Then you have your tax return, your medical records, and so forth. There's enough information out there that people certainly feel a need to keep control over it.

Today we have convenience versus privacy. If you go to a Web site and tell them all your preferences, it will be incredibly convenient the next time you visit. But they take that information, correlate it with 12 other Web sites, and at least for some consumers, create too much of an invasive picture of their activities. We need to have technology that gives you convenience without having the various Web sites you visit retain that information. And you know that can be done. What it means is that you have to have a few places where you store your information—on your PC, or on a card that you carry, or on an Internet service that you trust and takes this issue seriously enough. So there are some technology pieces to this and then there are some policy pieces.

Let's say that somebody is considering employing you. What should they be able to find out about you? If they're hiring you to be a school bus driver, should they be able to look at your driving record?

Society has always benefited from the fact that personal information was hard to gather. And now technology is eliminating that. Divorce trial records are public record, but that used to be a moot point, because they were sitting in a warehouse. Now they're searchable on the Web, so we really have to revisit this thing. The community I'm living in has a field day publishing people's electric and water bills, because those happen to be public utilities, depending on where you live.

This is a very legitimate issue. One of my competitors said, "Get over it, you have no privacy." That is absolutely the wrong approach. We'll never get this technology to be used for people's convenience if we don't design it with privacy in mind. We have an effort called Hailstorm that lets you share your information across your devices. That's our first initiative in which the privacy design has to be the key thing we do, or it just won't be successful.


Interview with Scott McNealy
September 4, 2001
By Michael J. Miller

Scott McNealy, chairman and CEO of Sun Microsystems, cofounded the company in 1982, not long after the introduction of the IBM PC. Over the last 19 years, McNealy has been on a mission to prove his company's slogan: "The network is the computer." According to McNealy, the full value of the computer can be realized only by linking PCs, workstations, servers, and every other form of hardware.

Not surprisingly, Sun's mission has put it on a collision course with Microsoft, a company that views the PC as the heart of the computing paradigm. This has also led McNealy to champion consumer choice, technology competition, and the open-source movement. "Through the community process," says McNealy, "technology matures faster than it would otherwise. But the real story is that the PC is no longer the center of the computing universe."

Michael J. Miller: In retrospect, what significant changes resulted from the advent of the PC?

Scott McNealy: The PC represents the beginning of popular distributed computing. At first, most people viewed PCs as standalone, personal productivity machines. It was our good fortune to recognize right from the start how much more could be accomplished by linking desktops together. That's the great significance of TCP/IP and other open standards. In fact, the standardization of interfaces led to the commoditization of parts, which dramatically lowered system costs. Standards such as TCP/IP, NFS, and HTML have all proven successful for the same reason.

Michael: In the face of the PC, how was Sun able to carve out such a unique and strong position?

Scott: We brought a novel approach to the design and functionality of the workstation. As its power and breadth grew, we were able to apply our unique architectural approach to servers, which in many ways "commoditized" server systems.

Also by combining the hardware and software into a cohesive unit, we're able to produce tightly integrated and highly optimized systems. And it's critical to own all of our key intellectual property. Without that, we would be at the mercy of someone else for the future innovation of our key components.

Michael: Yet Sun's approach to software blurs the line between traditional open-source and proprietary technology. How does this work in your favor? How do you strike the right balance?

Scott: Sun is the most open technology company I can think of. We've always published our programming interfaces so that anyone can make a compatible product. We've also taken that openness a step farther in recent years by making the source code for Solaris and our other software freely available. That way, developers know there are no hidden "gotchas," and customers know there are no proprietary lock-ins.

We are a big supporter of, and a big contributor to, the open-source movement, especially with products like the StarOffice productivity suite, the JXTA P2P protocols, and so on. The reason we've maintained a stewardship role with Java is to preserve its cross-platform compatibility.

Michael: What about going forward?

Scott: We plan to use that innovative approach to attack other opportunities, such as storage and peer-to-peer computing. We have an open-source, peer-to-peer effort called JXTA, which is being spearheaded by Bill Joy. Imagine turning network congestion into a good thing! That's what this does. eMikolo Networks demonstrated one of the first JXTA applications at this year's JavaOne conference. You saw it; performance increased as more peers linked to a video on the Net. That's pretty cool.

Michael: You've said, "The PC is dead." With so many people still using PCs, and so many who still don't have them, why do you think this is true?

Scott: Clearly, people aren't about to throw their PCs out the window, though I'm sure they often feel like it. On the other hand, a lot of people and organizations are starting to realize that maybe they don't need that kind of technological hairball on their desks; there are much simpler information appliances available for the home, the office, and the classroom. Plus, people will increasingly access Web services through mobile devices—wireless phones, pagers, and PDAs—more than they will through PCs.

Michael: Where, then, do you see the most important innovations occurring?

Scott: They will probably revolve around open standards, where network effects come into play. Just as you could say TCP/IP (the Internet protocol stack) was the most significant computing innovation of the past 20 years, new standards like XML will likely have far-reaching effects over the next 20.

I'd also keep an eye on what's happening in the peer-to-peer or distributed computing space. Think of it as a Napster for more than music.

Michael: What factors will fuel these future innovations?

Scott: Bandwidth. Since 1975, people have been amazed at the increases in microprocessor speed. In that same time frame, bandwidth doubled every 16 months—and even much faster than that in the past few years. Greater bandwidth is going to open up all kinds of new possibilities. Everything is going digital—music, movies, TV, photography—and it's going to converge on a single digital network: the Internet. It will transform broadcasting into unicasting. You'll have one channel to watch—the one you program yourself. You'll watch what you want, when you want, and you'll have the bandwidth to access it.
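(Editors' note: As a back-of-the-envelope check on those growth rates, a quantity that doubles every d months grows by a factor of 2^(m/d) over m months. The C sketch below is illustrative arithmetic only; the 24-month processor-doubling pace is our assumption for comparison, not McNealy's figure.)

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    double months = (2001 - 1975) * 12.0;  /* the 26-year span cited */

    /* Bandwidth: doubling every 16 months (McNealy's figure). */
    double bandwidth_growth = pow(2.0, months / 16.0);

    /* Processors: an assumed 24-month doubling, for comparison. */
    double cpu_growth = pow(2.0, months / 24.0);

    printf("bandwidth: ~%.0fx\n", bandwidth_growth); /* ~740,000x */
    printf("processor: ~%.0fx\n", cpu_growth);       /* ~8,000x  */
    return 0;
}
```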


Michael J. Miller and Craig Barrett: Uncut
September 4, 2001
By Michael J. Miller

Online Extra

Since joining Intel Corp. in 1974 as a technology development manager, Craig Barrett has risen through the ranks of the semiconductor maker, serving as a senior vice president and then executive vice president before becoming CEO in 1998. Barrett, a Stanford engineering graduate, is fearless and optimistic about his company and the economy. He is spending heavily on R & D and banking on a rejuvenation of the high-tech industry. "I'm investing in what I know is going to happen," says Barrett. "The Internet is going to build out, the communications infrastructure is going to build out, and the U.S. will recover."

If Barrett is right, the international adoption of the Internet, the concept of the "extended PC," and a new generation of natural-born techies will continue to propel the significance of the PC well into the future.

Michael J. Miller: If you were to cite the single most significant factor in the high-tech industry over the past 20 years, what would it be?

Craig Barrett: If you look at the impact that computing in general, and the PC in particular, has had on the semiconductor industry, it has been phenomenal. It has been the premier force driving the great bulk of our industry for the last 20 years, whether it's microprocessors, or memory or peripheral chips, or chips that go on devices peripheral to the PC.

Michael: Over that period, what was the most important change?

Craig: Probably the most important thing is the underlying foundation. You have Moore's Law, which basically says you get more and more functionality, which translates into more and more performance, and more and more performance continues to drive the end user to want more capability. That drives the industry.

Michael: But what about the potential limits to Moore's Law?

Craig: Right now, the only limit that I see is the Department of Defense, which is apparently standing in the way of some next-generation lithography. Other than that, there's no fundamental limitation for several generations.

The economics of the standard CMOS on silicon are phenomenal; it's extremely difficult for anything to replace those economics. We've already demonstrated working transistors down in the 30-nanometer range, and the estimates from the SIA (Semiconductor Industry Association) roadmap and from others are that we have another 15 to 20 years of scaling. I think basic, old silicon has a lot of life left in it. We'll move to something else when we reach the limitation of being able to scale the CMOS transistor.

Michael: That's why the next-generation lithography techniques are so important.

Craig: Whatever replaces silicon will probably replace only the fundamental switch. The processing equipment and the wiring requirements are going to be pretty much the same as they are today. If you go out ten years, you're still going to have to put multiple layers of metal down for the wiring. If you go to gallium arsenide or quantum dots, you still have the lithography requirements to define that very small switch. So we need to make advancements in lithography. Whether it's a silicon device or some other device, you'll still have thin films that you'll have to put down.

Michael: For years we've been hearing about radical departures from the standard, like using gallium arsenide in place of silicon. When do you think we'll see this?

Craig: Everyone knows that electrons move eight times faster in gallium arsenide than they do in silicon, which gives it much higher potential for high-frequency, high-signal-rate applications. But the cost of the substrate material and of processing doesn't match the economics of silicon. So we continue to improve the CMOS transistor and drive it forward. If we run into limitations, it may be at the very high performance end. Gallium arsenide does have a niche there.

Michael: No matter what substrate we use, what emerging technologies will impact the future of computing?

Craig: I think you'll see combinations of existing technologies that will improve performance, power consumption, and form factor. Today, we have separate technologies for memory and for logic and for other capabilities. I think we'll see more mixed capabilities on a single substrate: combining a processor and a DSP with flash memory for cell phones, for example.

Clearly, there will be a need for higher-frequency performance as we go from one- to 10- to 40-gigabit Ethernet connectivity, so you'll see permutations of the standard CMOS technology, whether it's silicon, germanium, or something else. I think there will also be a greater concentration on power consumption.

Michael: But improving power consumption has long been a major obstacle for the industry. Why?

Craig: We put more and more transistors down, and we flip them at ever-higher frequencies. They just dissipate more and more power. The challenge the industry is facing is finding techniques and technologies that allow us to create higher performance at lower power.

Michael: Some argue that we don't need more processing power for current applications. So what will drive people to upgrade and keep buying faster machines?

Craig: The concept of extending the PC to rich media will drive the need for more processing power. Look at any of the rich communications capabilities that we have today: digital video imaging, entertainment, even animation. Those benefit from more computer power. Take what we're doing here. You stick the tape recorder in front of me to record my voice, but you also pull out your piece of paper and a pen to write down my thoughts. Those are redundant activities. Why aren't you just taking my voice on your recorder and then using a voice recognition system to automatically transcribe it verbatim?

Michael: Good question. It would make sense.

Craig: And that's just English to English. Suppose you spoke Japanese. We could do English to Japanese, real time. Voice to text to text to voice, right? Computer intensive; it makes your job easier.

Michael: As much as I like that theory, why?

Craig: The question is, do you ever want to consolidate the information? Do you want to consolidate access points into the Internet? Do you ever want to consolidate? Where do you want to consolidate? Do you want to share files? If you want to "rip" CDs (I hate that phrase), how much is your time worth to you? If doubling the processing power halves the time it takes to rip a CD, it may be worthwhile to you. If you want to edit videos, or you want to compress videos to send them over the Internet, is that worth your time? It's all a matter of personal preference. How much is your time worth? If you just want to type and do text, you probably can't beat an old IBM Selectric.

Michael: How important does the standard PC remain in an environment filled with peripheral devices like PDAs and tablets, especially as they increasingly take on PC-like functionality? Have we reached the end of the PC era, as many people suggest?

Craig: Every year we hear the hype about the post-PC era, but the truth is that we just sell a bunch more PCs each year. The flexibility, capability, upgradability, and constant forward momentum of the PC just seem to win the day. I think people increasingly realize that the value of PDAs and mobile devices is in the role of peripherals or adjuncts to the PC. The PC remains the central digital brain in this whole system, and all these other things will work around it rather than replace it.

I don't think the PC is in danger of being replaced by anything. The excitement surrounding a lot of these new devices stems from their ability to be interactive with and act as a peripheral to the PC. They basically make the PC more valuable and more indispensable.

Michael: This year has not been a great year for PC sales. What will turn it all around?

Craig: The international build-out of the Internet. When people talk about the saturation of the PC, they're talking about 5 percent of the world's population: the 300 million or so in the U.S. They're not talking about the other 95 percent of the world. Europe has half the PC penetration that the U.S. has; the rest of the world has much less than half. So the build-out of the Internet will continue to drive the PC.

You know we're still only 20 years down the road of the PC. It's kind of like growing up with a bicycle. Everybody who grows up with a bicycle knows how to use it and can use it. The younger generation is growing up with PCs. They are the people who rip CDs, communicate with one another, and take full advantage of their computers. They are the people who never, ever, ever ask, "What do we need more processing power for?" They are the people who say, "Give me more memory, give me more processing power, give me a better user interface, give me better graphics."

They are not the people who write newspaper articles.

Michael: Still, when I go out on the road, I'm not hearing the enthusiasm for buying new machines.

Craig: Well, of course not. The U.S. economy got way overhyped for a variety of reasons. It got overhyped from the dot-com standpoint, from an e-business standpoint, and from a communication infrastructure standpoint. So the build-out probably got somewhat ahead of what the system could absorb, and meaningfully make use of.

Michael: So how long does it take to absorb that overcapacity?

Craig: Let me quote Alan Greenspan. Mr. Greenspan says, "I could tell that this correction was coming, but we've never had one associated with this technology investment before. Therefore I couldn't predict when it would start, nor could I predict what its shape will look like, nor could I predict how deep it will go, nor could I predict what the recovery will look like. But we will study the hell out of this one, so that next time we'll be able to make more intelligent comments." Nobody saw this one coming, so nobody established credentials up front. And anyone that's predicting when it's going to end should be required to go back and show that they predicted it would start.

Michael: Well, clearly, you believe that we're coming out of it. You're investing a lot of money and capital in R & D and manufacturing.

Craig: I'm investing for what I know is going to happen: The Internet is going to build out, the communications infrastructure is going to build out, and the U.S. will recover. Whether it's 3 months, 6 months, or 12 months, it will probably depend on what aspect of the high-tech economy you look at and whether you look at the networking infrastructure, the computing infrastructure, or some other parts of the infrastructure. They'll probably all recover at different rates.

We already look at the computing infrastructure and say, hey, inventories are reasonably balanced; it doesn't look like it's going to fall anymore; it's kind of flat. We expect a seasonal uptick in the second half of the year. What else is new? We can talk to [Cisco Systems CEO] John Chambers. John Chambers says, "Woe is me, I don't see any end to this thing." So, there are two different sectors of the high-tech economy that are giving you different viewpoints. So there's probably not a single answer.

Michael: Twenty years from now, what will be the thing that has changed the most about the way we use technology?

Craig: Well, first of all it's going to be much more commonplace. We still think of the computer, photonics, and the Internet as new and unique. Twenty years from now, they will have been here for 20, 30, or 40 years, so they will be totally ingrained in every aspect, every age group, of the social structure. We won't have the classes of the population who didn't grow up with the technology. We'll all have grown up with this technology, or a variation of this technology. So I think it'll be much more commonly accepted, much more commonly used, and much more commonly appreciated.

Just like today, how many people do you find that do not have a driver's license? What fraction do not have telephones? And then we make this great deal out of, you know, only 60 percent of the homes have PCs. I think 20 years from now it'll be like telephones, television, and driver's licenses. Everyone will have an understanding and utilization of the technology. And will it do new and wonderful things? Of course. And will all those things be extended phenomenally in the next 20 years? For sure.


IBM's PC: Then and Now
September 4, 2001

Online Extra

For some perspective on the changes the IBM PC has undergone in the last 20 years, Michael J. Miller spoke with Robert Moffat, general manager of IBM's Personal and Printing Systems Group.

Michael: It's been 20 years since the IBM PC was announced. How important do you think that announcement was to the industry and to IBM itself?

Bob: As an industry statement, I think it launched the era of modern computing. It dramatically changed the computing model, but more important, it changed the user experience.

From an IBM standpoint, we probably missed the significance of it at first, because our five-year forecast was to sell only 240,000 units. Yet I think that was essentially the first month's orders. We didn't look at it as a solution play or at what it did for users. I think we've learned something, though. Over the past 20 years, we learned that it really is the solution that you bring to the customer, not simply a piece of hardware or a piece of software, that makes the difference.

Michael: What would you say IBM did right and wrong as far as the PC market was concerned?

Bob: I think when IBM launched the PC, it had the absolute right concept. We were going to develop something that was totally open: an open hardware platform and an open software platform to enable the software providers to provide a wealth of applications. And I think we did that well at the beginning.

I think we made some missteps along the way in trying to make pieces of that proprietary; the most obvious one is Micro Channel. But we also did a lot right. We've brought a lot of technology to the PC, such as the capability to create thin-and-light notebooks. We also continued to press the envelope with the introduction of TrackPoint, which came out of our research division, as well as the introduction of cooling technologies in notebooks, which quite honestly make our notebooks highly reliable.

Michael: What has spurred the biggest change since then?

Bob: Well, everybody says network computing or the Internet, right? GUIs got better. Ease of use improved. But these are all things that, quite honestly, focus on lowering the cost of computing to the end user and speeding the deployment of things, whether that's getting boxes out quicker to people or allowing people to come up to speed on applications.

Michael: So, where do we go from here? We're clearly in a world where I hear a lot about network computing.

Bob: We strongly believe this is a network world. We also believe that new technology leaders emerge. They have every time there's been a new change in the computing model. In addition, we believe it becomes much more of a services-led business, as opposed to a hardware- or software-led business.

Underneath all that, the network world means one that is filled with open hardware and software standards. Not de facto standards, but truly open standards. Why? Because what you're talking about in a network world is devices of all kinds being able to link into this infrastructure. It's going to dwarf the number of PCs that are out there.

Michael: At this year's Networld+Interop, there were lots of appliance devices: server appliances, firewalls, Web servers, caching servers, LANs, you name it. How's the balance going to shake out between those kinds of appliances on the server side and traditional multipurpose servers?

Bob: The biggest multipurpose server that I think you're talking about is the 390. It's a specialized server for transaction computing. It's the place where a lot of the heavy lifting goes on in an enterprise.

It isn't a specialized device in that it isn't there just to do caching. (By the way, I think you are going to see a large number of those specialized devices being deployed. There is a requirement to have specialized types of servers out there doing things because you can obviously optimize to that.) But when you step back, the balance will be determined by how easily some of these things integrate into the middleware. The gating factor will be the skills to do that.

Michael: What's the future for the different kinds of devices, like the traditional PC or the traditional notebook?

Bob: You've asked a very difficult question because I don't think there is a clear, stated course that it's going to head down. The PC is in a mature market. The days of astronomical growth rates associated with it are over. But it will still play an important role in the computer paradigm. Guys like me will continue to carry notebooks around. However, it won't be our only access device.

It is not just the notebooks that you and I carry around; it's an integration of many things. I can go to my PDA to see what the heck is going on in my house. I'm excited that these devices will simplify life. Here's an example: the ability to call your Palm Pilot and have it translate English into multiple languages. When I was in France, this would have helped my ability to communicate.

Michael: What applications do you think are going to be the big technology drivers?

Bob: Nobody in my group likes to hear me say it, but the PC by itself is uninteresting. It's a device. Broadband and access to information: That's going to be the application. It's terrible to call that an application, but it's going to be the next huge, enabling technology, because it will facilitate a lot of applications.

We live in an incredibly data-intensive, information-intensive environment in which a business cannot sustain itself if it has to wait a long time to get information. Information needs to be assimilated quickly and acted on quickly, no matter where you are.

I'm amazed at what some of the applications are and what the Internet has opened up as far as capabilities for the next generation to learn in incredibly different ways. Watching the Internet break down a number of cultural divides will be really interesting. The world is clearly getting smaller.


From the Archives: Bill Gates
March 25, 1997

(Editors' Note: This article originally appeared in the March 25, 1997 issue of PC Magazine.)

Remembering the Beginning

PC MAGAZINE: 15 years is a surprising amount of time in some ways. Obviously the PC industry goes back further, but the IBM PC changed it a lot. It changed the way people thought about computers.

BILL GATES: It absolutely did. Before the IBM PC, the key market was a hobbyist market. We'd gotten floppy disks, so we'd started to get into some more serious applications. But the power of the machine (it was the first 16-bit machine), the endorsement by IBM of using a personal computer, and the critical mass of software and distribution and peripherals that came along with that make it the biggest milestone in the history of personal computing. It was a sea change.

Of course, one of the big developments around it was that before the IBM PC came along, machines were essentially incompatible with each other. In those first few years, [we saw] the whole notion of how compatible was compatible; there were many different levels there. And a lot of hardware manufacturers still felt they had more freedom to deviate a little bit and do things that couldn't be virtualized by a common software interface.

So, in those first two years a lot of key things were established: the idea of compatibility; MS-DOS as the primary system as opposed to CP/M-86 or UCSD P-System which were the prime competitors; and the whole thing about a broad software industry and using indirect distribution channels. Anyway, it was a monumental change.

PC MAGAZINE: Let's go back -- go back to that first 1982 PC Magazine interview. You know, "The Man Behind the Machine." Microsoft was involved with IBM to what at the time was an unprecedented degree. You had written software for lots of other machines at that point, not operating systems but BASICs and things like that. How different was it?

BILL GATES: Well, for IBM it was extremely different because this was a project where they let a supplier -- a partner, whatever you call us -- shape the definition of the machine and provide fundamental elements of the machine. When they first came to us, their concept was to do an 8-bit computer. And the project was more notable because they were going to do it so quickly and use an outside company. It wouldn't be a high-volume product.

Their charge was to do a home and very low-end business machine. They had the Data Master, which was 8085-based at the time, that they felt was covering part of the business market. Then they had the 5100, the machine that had both an APL and a BASIC interpreter, which was covering another part of the business market. So it was sort of a home machine reaching up into low-end business. And it was a very small team in Boca that wanted to prove that IBM could do a product, any kind of product, in about an 18-month cycle. And that was going to be the novel thing: could you work with outsiders, which in this case was mostly ourselves but also Intel, and do it quickly?

And the key engineer on the project, Lou Eggebrecht, was fast-moving. Once we convinced IBM to go 16-bit (we looked at the 68000, which unfortunately wasn't debugged at the time, so we decided to go with the 8086), he cranked out that motherboard in about 40 days. It's one of the most phenomenal projects, because there were very small resources involved and we had to put the BASIC in ROM, which meant it was essentially cast in concrete, so you couldn't make much in the way of mistakes. We actually did this clever thing where, for disk versions of the system, we put enough hooks in the ROM that you could replace reasonably modular parts of it. That was very lucky when, about a year after the machine shipped, a floating-point bug turned up and The New York Times ran it: we could just issue a disk that patched out that part of the floating-point package, because virtually all the machines that had shipped were disk-based. What I had done was make the dispatch table hookable. It was a very tricky project, because the machine had to boot running only BASIC, or, if it detected a disk, it had to boot with the operating system. And if only the BASIC came up, then we had to do file management against the audiocassette. And they insisted that it run in a 48K configuration, which was pretty tricky; we were hoping they'd insist on 64K. Now, it turned out most people bought 128K versions of the machine.
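
(Editors' note: A minimal sketch in modern C of the hookable dispatch table Gates describes. The names and structure here are illustrative, not Microsoft's actual ROM BASIC source.)

    /* ROM code never calls the floating-point routines directly; it
     * dispatches through a table kept in RAM, so a disk-based patch can
     * swap in a corrected routine without touching the ROM itself. */
    #include <stdio.h>

    /* Stand-in for the ROM routine that turned out to have the bug. */
    static double fp_add_rom(double a, double b) { return a + b; }

    /* Stand-in for the corrected routine shipped on the patch disk
     * (identical here; imagine the corrected math). */
    static double fp_add_patched(double a, double b) { return a + b; }

    /* The hookable dispatch table, initialized to the ROM defaults. */
    static double (*fp_dispatch[])(double, double) = { fp_add_rom };

    /* Everything in "ROM" goes through the table... */
    static double fp_add(double a, double b) { return fp_dispatch[0](a, b); }

    int main(void) {
        printf("%f\n", fp_add(1.5, 2.25));  /* dispatches to ROM routine   */
        fp_dispatch[0] = fp_add_patched;    /* the patch rewrites the hook */
        printf("%f\n", fp_add(1.5, 2.25));  /* now uses the fixed routine  */
        return 0;
    }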

PC MAGAZINE: Didn't the original machine come out with 16k?

BILL GATES: The cheapest machine you could buy -- yeah, you're right, you're right: if you were cassette-only, it was 16. And then if you got the disk, you could be 48. And it was actually fairly complicated, because the ROM is up in very high memory; that is, it's at a different segment address. There was the whole question of where we used long addresses and where we used short addresses. The 8086 architecture -- now that we have a flat memory model, people forget that a segmented memory model is a fairly complex thing to work with. It actually got worse before it got better. We went to the 286, which was very segment-oriented, before we finally got a straight, linear address space in 1986.

PC MAGAZINE: And not in the operating system until after that.

BILL GATES: That's right.

PC MAGAZINE: That's significant.

BILL GATES: Well, people had written applications to use real addresses -- DOS applications depended on the fact that there was not a level of indirection; that is, you simply shifted the segment value and added it in.
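
(Editors' note: A short sketch in C of the real-mode address arithmetic Gates is describing; the example address is the familiar CGA text buffer, included here for illustration.)

    /* Real-mode 8086 addressing: the 16-bit segment is shifted left four
     * bits and added to the 16-bit offset, giving a 20-bit physical
     * address with no indirection in between. */
    #include <stdio.h>
    #include <stdint.h>

    static uint32_t real_mode_addr(uint16_t segment, uint16_t offset) {
        return ((uint32_t)segment << 4) + offset;
    }

    int main(void) {
        /* 0xB800:0x0000 is the classic CGA text-mode frame buffer. */
        printf("0x%05X\n", (unsigned)real_mode_addr(0xB800, 0x0000));
        return 0;   /* prints 0xB8000 */
    }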

So we had to keep running real-mode applications -- I mean, today you can buy an IBM PC and run a real-mode application. It'd actually be interesting to take the applications that shipped with the original PC and plug them in. I'm pretty sure they'd work; I admit I haven't tried it. The first applications were an adventure game from us, a typing tutor from us, and then VisiCalc.

PC MAGAZINE: Yes and Easy Writer.

BILL GATES: Was Easy Writer there at the very beginning? I think it was. So none of the early applications had successful follow-ons. Now, the early language tools included our BASIC compiler and our FORTRAN, and at the time we offered Pascal and COBOL compilers as well.

PC MAGAZINE: Obviously, back then you were talking about DOS as the standard. I mean, you were telling people to write to MS-DOS, not to go around it to the hardware and all that. Of course, the programs that became most successful in those days did go around it. The applications became dependent on the IBM hardware, which of course was then copied a lot. Was that a surprise to you?

BILL GATES: No, not at all. I mean, remember, we were the big promoters of bit-mapped graphics. We were really against [the character-only approach]; there were two video cards. There was the CGA, which we pushed for; unfortunately it only got 640-by-200 graphics and the palette was limited. Just at the last minute they gave us a tiny bit of a palette, where color 0 could be mapped to any of 16 colors, and colors 1, 2, and 3 had a flip where they could be one set or another. Anyway, the character cells on the CGA were 8 by 8, so they didn't look as good as on the character-mode-only card. That card, although it didn't have bit-mapped graphics -- what IBM calls All Points Addressable (APA) graphics -- was actually 640 by 350, so the character cells were much larger and looked a lot better. It was the Display Writer character cell size. So IBM thought a lot of people would buy that character-mode card, which meant you couldn't do graphs and things.

But the only way to do graphics was to [write code directly to the hardware.] So all our applications that did graphics did that as well. It was only later that we decided we could provide some high-level services and get enough efficiency that graphics applications would go through the operating system.

The definition of what it means to be a graphics application changed. In the early days it meant you had a mode where a chart would come up. When you ran Lotus 1-2-3, most of the time you were in character mode, where the scrolling was very fast. Nowadays we have processors that are fast enough to scroll the bit-mapped display and you don't even think about it. Back then, scrolling the display in graphics mode was noticeably, significantly, painfully slower than scrolling in character mode, because you had eight times as many bits to push around and these were quite slow processors.
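
(Editors' note: A back-of-the-envelope check, in C, of the "eight times as many bits" remark -- comparing the character codes moved in an 80-by-25 text scroll against the pixels of a 640-by-200, one-bit-per-pixel screen. Attribute bytes are left aside for simplicity.)

    /* Scrolling text mode moves one character byte per 8x8 cell, while
     * scrolling 1-bpp graphics moves the full 8x8 bitmap for each cell. */
    #include <stdio.h>

    int main(void) {
        int text_bytes  = 80 * 25;          /* one code byte per cell */
        int pixel_bytes = (640 * 200) / 8;  /* 640x200 at 1 bit/pixel */
        printf("text: %d bytes, graphics: %d bytes, ratio: %d\n",
               text_bytes, pixel_bytes, pixel_bytes / text_bytes);
        return 0;   /* prints text: 2000, graphics: 16000, ratio: 8 */
    }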

Working with IBM

PC MAGAZINE: No question that a lot of things have changed. In the beginning you had a really tight relationship with IBM. And that changed actually fairly quickly. In '83 when you announced Windows, IBM was the notable exception among those supporting it. And then of course you got back together after that with a joint development agreement. What was it like working with IBM at those points?

BILL GATES: Well, in the early days we had more people working on the project than IBM did, and we were just working with them to get the thing done. They weren't really IBM; they were a renegade group inside IBM.

Then right after the product got launched, there was a struggle to see whether the group in Austin, which had these various business computers, would take over the group in Boca. And much to everybody's surprise, IBM management decided that Boca would take over Austin. So Don Estridge was put in charge of the whole thing. Over a period of two years, he went from having a group of 50 people to having a group of 5,000 people. Even for the people inside IBM, the whole nature of the project and the ease of doing new things changed; getting things done got pretty complicated.

Right after the machine came out, we started on the version of DOS that had the hierarchical file system -- version 2. And that was for the hard disk. They came out with a 10MB hard disk and then a 20MB hard disk, the so-called XT machine. By the time we got to the AT, IBM had a very large group, and they had started up various projects to displace Microsoft. People were doing their own operating systems. And there are some good stories about that, because Mike Maples, whom we later hired, was in charge of one of those new operating-system efforts. So it was a little trickier working with IBM. We still had quite a close relationship.

The fact that they chose to do Top View -- there was some work in their lab on a character-mode windowing system. And it had some nice things. It had a sort of indirect dispatch -- now people call it object-oriented; actually, they called it object-oriented at the time -- so it had a certain extensibility. But because it was character-oriented, we thought it was a dead end. IBM was nice enough to let us really come in and talk to them about that.

They listened hard about Windows. And the fact that we were doing Windows and they were doing Top View wasn't a big problem for our relationship. They did get into a period with the AT, though, where they thought they were going to do their own operating system. In fact, there was a special key on the AT, SysReq, which was to call their hypervisor in. So it was going to be like VM on top of multiple DOS machines.

Unfortunately, there was a technical problem: it didn't work. They called us in to help them solve it -- it was a speed problem, and we were able to improve it a lot. But it was never really possible to do a hypervisor on the 286. Anyway, it was still fun to work with IBM. It was a challenge to stay up with all the politics inside the various groups and understand what we needed to do to be the top supplier.

But as it got larger and larger, it was just a very different organization. And there was a real change in their mentality in 1986, because they'd had 70 or 80 percent share of the market, and people like Compaq had sort of been on the fringes -- but by the time you get to 1986, IBM's share was dropping quite a bit. I think that's the year they dropped below 50 percent. So their whole view that they had to do something totally unique that would put the other guys out of business -- that changed. And there were big debates within IBM.

But to the degree they wanted to do super-proprietary things, working with us didn't make sense, because we had chosen to be a supplier to all the different hardware manufacturers. And so there was some tension having to do with that, including whether they would work with us on the big multitasking system or not.

During its development it wasn't called OS/2. It was introduced as part of the PS/2, and the week before the introduction we found out that that was the brilliant name they'd come up with.

PC MAGAZINE: What did you hope it was going to be called?

BILL GATES: Oh, we didn't care that much about the name. IBM did the contract so that we didn't have very good rights to the name. So later, when we went our own ways -- when they divorced us, much to our disappointment -- we'd been working on the 32-bit version, which was called OS/2 3.0. All the advance work was done here, as OS/2 3.0. But we had to rename OS/2 3.0 Windows NT.

Now, by then there was a certain logic to that, because Windows was gaining marketplace acceptance and OS/2 was not. They kept the name because of the way they structured the contract; we had very limited rights to it. We even debated whether the other licensees should use that name, and eventually decided they should. Anyway, it was a tricky, tricky problem.

Acknowledging the Internet

PC MAGAZINE: You had a bit of a scare, a bit of a change of direction, when other companies saw the Internet and started doing things like browsers before you did. In some respects, you changed a lot of strategies around the last few years. What did it feel like to see other companies doing things ahead of you?

BILL GATES: Well, I think the Internet is probably the most dramatic example of where we've had to change strategy. But I think our ability to stay ahead throughout the years has been due to having a lot of flexibility. Take graphical interfaces. Xerox created, and deserves the credit for, the basic concept of the graphical interface. Now, it turns out they didn't exploit it very well, and it was Apple and Microsoft who were able to do that.

In the case of the Internet, it wasn't any existing companies who were doing good things. It was the phenomenon inside the universities and Netscape, which was started by people who had been part of that university phenomenon. So as that wave got going, that's where they were.

The Internet was fascinating for us, because as early as late '94 we actually shipped an add-on to Word to do HTML support. And we had started building TCP/IP into the system and decided that was going to be the standard protocol. Every time we'd go out and do an Internet retreat, we'd increase our view of what Internet things we wanted to do. But they were never the top priority: getting Windows 95 done was the top priority; getting Windows NT to critical mass in the market was the top priority.

And it wasn't until really late '95 that we looked up and said, "Wow, the Internet is the vision of PCs as a communication tool coming true." It's coming true with protocols that are 20 years old -- we wouldn't have guessed that would be the case -- and some nice things have been added on top of them. But fundamentally, the cost of communication coming down had led, at least in pockets, to this critical mass being achieved.

And so we said, "Let's make it the top priority." And "What would we change if it's the top priority?" You know. How would MSN change? How would our browser strategy change? How would our server strategy change? What new groups, like Merchant Server, would be put together? So that was a pretty big shift. Now our employees, because they'd been out surfing the net, were pretty in tune with what needed to be done. So a lot of good ideas came along and it only took a month or so before we had an internal plan. And then December '95 was when we articulated to the world all the steps we were going to take during '96 which we were able to execute better than I would have expected. The Internet is still full of surprises. It's not like you say "OK, we're done with the Internet."

The Internet is the place where all the neat new things are happening -- whether it's multicast, better video compression, security technologies, 3-D browsing, personalization servers, server development, or site tracking. Anyway, there's more to be done on the Internet than has been done to date. The success of Windows NT, the success of Office, the success of Windows 95 -- those give us a good basis to build on, because our strategy is, whenever possible, to integrate Internet functionality into those products, and that's a primary thrust. So that when somebody buys a PC and wants to use the Internet -- whether for collaboration or voice telephony or document sharing, whatever they want to do -- we want that built in, so they don't have to go out and try to integrate another piece of software.

PC MAGAZINE: Over the years you've talked about a number of products and a number of projects and things like that and a lot of them of course don't show up exactly when you think they're going to. They come up a little later. Is predicting when this stuff is going to happen getting easier or getting harder?

BILL GATES: Well, as Microsoft has gotten larger, we've gotten the ability to fund a pure R&D group. So we can have very risky products that you can't predict a date for.

Take voice recognition. If you go over to the lab you'll meet some guys over there who think "Hey, in two years, no problem." Now, because it's their job, they're allowed to be optimistic. But our business plans aren't -- you won't find that dependency written in. And we're willing to fund that work for however long it takes to get it done.

We've got people working on vision; we still have people working on handwriting, which most people got excited about and then decided it was a hopeless thing. We still believe that that will get conquered. So we have more luxury in being able to pursue a longer time horizon and therefore a more uncertain time horizon, and even to have research projects that don't pan out.

The industry continues to move very rapidly, so there are going to be surprises. In terms of hardware breakthroughs, nowadays it's real easy for us to go to Japan and tour all the big company labs and see what's going on there. And people are very anxious to come and talk to us about what they are doing -- as people come up with better digital cameras or flat-panel displays, or as DVD has been coming along -- because they see an opportunity to work with us to get great support for those things in Windows. There aren't too many surprises on the hardware side.

Take Intel. We're collaborating with Intel on the P7 -- the 64-bit Merced, the IA-64 thing with HP -- many, many years before it comes out. We've got a team doing that larger than the team we had working on the IBM PC. But the numbers are all different nowadays.

PC MAGAZINE: Windows was later than you thought, so was Windows 95. For years you have talked about getting software on a more predictable schedule, like every year there'd be a new version of Office, or something like that. Is that getting easier to do? Is it getting harder to do? Obviously now you can do things like deliver pieces over the Internet.

BILL GATES: I think there are groups, like the Windows NT group, that have been delivering on a very predictable schedule. It always depends on whether you set the schedule as the key thing or pick a very ambitious set of features. In the latter case, you're going to need to take the time to do the beta test, and during that beta test you might hear about some additions or improvements that you think are worth making. Nowadays it's pretty nice, because we have two ways of getting products out. One is to get them out on the Internet in a component-by-component fashion. The other is to take all those advances, really test them as a full system, get the user interface nicely integrated, go out to developers and explain how that's going to be out there in very large numbers, and then make a very large release. NT 4 was very much like that, and upcoming versions of Windows will be like that as well.

So you really get the best of both worlds. You get the neat new things out there on the Internet, and the things where the foundation is really changing -- where the way the pieces relate to each other is changing -- come, depending on the product, something like every couple of years. We'll still have products that are ambitious enough that we can't predict the date exactly two years in advance. The Office group has done very, very well at predicting when they are going to do things.

The Coming of Cairo

PC MAGAZINE: A while back, you talked about an object-oriented system that you called Cairo, and at one point it was going to be a product -- at least it certainly sounded that way. Now it's like a set of technologies, some of which are here, some of which are coming very soon, and some of which aren't here. The overall combination seems like it's a long way off. Is it still a goal?

BILL GATES: Almost all of it is here today. And like all good things in technology, people take it for granted.

Object orientation is in the system. You can put an object of any type in a Word document or an Excel spreadsheet. The standards for how you do that are very straightforward. Eight years ago we were saying that was going to be a very tough thing, and now it's just the way software is developed. It's components that fit into containers, and that makes things very, very extensible.

And we've come up with things like the Microsoft Foundation Classes that make it quite easy for programmers to do those things. Visual Basic programmers just take it for granted that if they want to do a rich form, they can go to a third party, get a control, and embed that control in the form. The reason that works is object orientation: there's a class protocol for how that control relates to its container, and it gets richer and richer all the time.
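
(Editors' note: A toy sketch in C -- not actual COM or OLE code -- of the container/control "class protocol" idea: the control exposes an agreed-on table of functions, so any vendor's control can be embedded in any container that speaks the protocol. All names here are hypothetical.)

    #include <stdio.h>

    /* The agreed-on protocol between container and control. */
    typedef struct Control {
        const char *name;
        void (*draw)(struct Control *self);
        void (*on_click)(struct Control *self);
    } Control;

    /* A third-party control implementing the protocol. */
    static void grid_draw(Control *c)     { printf("[%s] drawing\n", c->name); }
    static void grid_on_click(Control *c) { printf("[%s] clicked\n", c->name); }

    int main(void) {
        Control grid = { "GridControl", grid_draw, grid_on_click };
        /* The container embeds the control knowing only the protocol. */
        grid.draw(&grid);
        grid.on_click(&grid);
        return 0;
    }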

The only piece of all that vision that's not in the marketplace is the file system and directory -- the rich file system/directory combination, which is now part of the NT 5 product. We actually put a developers' release of that in people's hands in November at a professional developers conference we had... And so later this year that'll go into beta testing.

Having the rich storage system with the directory -- that was part of the Cairo vision. And although a lot of the Cairo things have been done, that's the one we're still working on. Today when you think about storage, you think about storing messages as one thing, addresses as another, user objects and machine objects as another. There are just too many ways people are storing things, each with its own utilities to learn and its own security, replication, enumeration, and query. Right now there are two grand unifications taking place: all the presentation is being unified around a sort of super browser that takes over the shell, and all the storage is being unified around a sort of super file system that takes over a lot of those functions. The storage unification is the harder of the two, but they're both very important and will make the system more powerful and easier to work with.

Windows APIs

PC MAGAZINE: You look at Windows today, and Windows is now available in various flavors, from CE up to NT Server, on various kinds of devices. Are there limits to how far you can adapt Windows? At some point do you need to transition to a completely new set of APIs? I mean, CE has a limited subset; NT has the superset at this point. At some point does it have to transition completely?

BILL GATES: NT is the only from-scratch commercial operating system that's been done in the last decade, and it was built on the latest ideas about how you make things multiprocessor and very extensible. And we put more money into advancing NT than is put into any other operating system -- whether it's MVS or all the different flavors of UNIX put together. It's a very fresh piece of technology, and this year it'll get things like clustering and 64-bit support.

And the question of whether PCs can take on the toughest computing tasks will be answered once and for all. People still see UNIX or mainframes as being higher-end, but this is the year when the pieces fall into place, in both hardware and software, and that will change. NT will be the foundation for the next decade. It's a very flexible, extensible base to be building on.

Over time, a lot of the operating system will become visual recognition and speech recognition, and the logic representing those very advanced input systems will be more complex than even what you think of as the operating system today. You'll still have that operating system there -- in fact, it will be even better than what we have now -- but this whole new element will come in, focused on the kind of adaptation that's necessary for rich input systems.

Setting Hardware Standards

PC MAGAZINE: In the early days, you pretty much set a lot of the software standards. But IBM was clearly defining the hardware standards at that point -- the AT bus, EGA, VGA, 3 1/2-inch floppies. No one hardware vendor does that these days. Why do you think that happened?

BILL GATES: Well there's Intel.

PC MAGAZINE: There's Intel, but. . .

BILL GATES: No, seriously, when it comes to PCI, Intel deserves the credit. Now they're doing AGP, which is a good advance to get graphics performance up to a whole new level. Yes, some standards are on the boundary between software and hardware, like ACPI -- the power management work, which is us working with Intel -- or Plug and Play. That's one that Compaq actually got involved with and helped out on quite a bit.

The two companies that really want to grow the market the most are Intel and Microsoft, so it's in our interest to create these new frameworks. And we have a whole conference, the Windows Hardware Engineering Conference, that we do around the world, which calls people together to talk about what we're missing.

What should be done for the PC? The attendees there have the magic of semiconductors -- there are literally hundreds of very smart companies coming in -- and we get great ideas about where audio should go or where 1394 should go.

USB is another one where Intel was the primary driver. 1394 is a broader thing, because that's one where we're bringing in the consumer electronics industry as well as the PC industry. So companies like Sony will be very important there.

I think there's as much or more leadership at the center of the industry as there's ever been. It's always pretty informal, whoever has great ideas is welcome to come along with it. But where there's a clear vacuum Intel and Microsoft are going to step in.

PC MAGAZINE: You've talked about some things lately in the graphics area, like Talisman. And there are occasions when Intel is saying one thing and you're saying another in some of those areas. Is that OK? Is that just the way people toss ideas back and forth to figure out which ones are best for the market?

BILL GATES: We're pretty much in line with Intel on most things. We're working together on USB, 1394, AGP, and MMX. Those are all very important extensions.

There have been times when Intel will go off in a slightly different direction than we do, and it's not the end of the world. In the worst case, you end up with two choices for people. And very often, after a period when people see the merits of the two approaches, you come up with a blended approach that gets the best of both. I think it's fine that we're not always in lockstep. We do a lot to keep the companies communicating.

Intel would say they really got into this ProShare [videoconferencing] thing early on, and that they were kind of pushing us on that and now with the Internet we've really picked that up and taken it to a new level. So we benefit from each other's independence.

In the area of performance, we are always asking Intel whether they are using the most optimistic benchmarks and what the real-world mix of applications looks like -- making sure the product groups there have a clear view of what the performance equation looks like. That's a case where our independence helps them get a good view of what they're doing.

Product Transition

PC MAGAZINE: Most of what we're seeing now builds step by step on top of the architecture we've had for the past 15 years. Apple at one point had to transition from the Apple II to the Mac, and from the 68000 Mac to the Power Mac. Will we need to see similar changes in PCs?

BILL GATES: Well, the 32-bit address space will let you go up to 4 gigabytes of physical memory, and not many people have that today. Say a typical machine is 32 megabytes; then you've got a factor of 128 of expansion, and if you double every two years, that's 14 years before you run out.

Now, people who are really pushing the limits, like servers, will run out before the 14 years, but that's the only thing where we have to do more than just keep scaling the performance. Of course, Intel's already very hard at work on P7. There's a tricky way to use Pentium Pro where you can actually get more than 4 gigabytes of physical memory. I don't know if that'll become important enough because by the time people are pushing that limit, the marketplace may be moved over to the P7-type approach.

But often, when you get to the end of an architecture, there are clever ways to push it, and the Pentium Pro does allow for more than 32 bits of physical address. I don't know if you've looked at that. Unless you have lots and lots of memory, it's not that important.
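
(Editors' note: The headroom arithmetic Gates runs through, sketched in C: 4GB over a 32MB machine is a factor of 128, or 2^7, and seven doublings at two years apiece is 14 years.)

    #include <stdio.h>

    int main(void) {
        long ceiling_mb = 4L * 1024;   /* the 4GB 32-bit ceiling, in MB */
        long typical_mb = 32;          /* a typical 1997 machine        */
        int doublings = 0;
        while (typical_mb < ceiling_mb) {
            typical_mb *= 2;           /* memory doubles every two years */
            doublings++;
        }
        printf("%d doublings -> %d years of headroom\n",
               doublings, doublings * 2);   /* prints 7 -> 14 */
        return 0;
    }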

The Need for Speed

PC MAGAZINE: Machines are getting faster and faster. You have competitors pushing the idea that with the Internet you won't need all that new processing power or all the operating-system complexity. Yet you look at the processor road map and we're going to have a fair amount of processing power; there are no signs of that slowing any time soon. Do we need the processing power that's going to be there? Have you got things to do with it?

BILL GATES: I think the biggest use of processing power will be speech recognition, speech synthesis, visual recognition, handwriting recognition, and pattern recognition that is noticing what you're interested in and learning from that.

Those are all very deep problems where more RAM and more MIPS make a lot of difference. In the near term, just moving up to MPEG-2, 3-D graphics, and better video will help a lot.

Just one point about complexity: there's a pretty important change coming this year where we let the PC logically keep software up on the server, while from a performance point of view it actually gets pulled down onto your disk. It's part of this Zero Admin[istration] Windows thing. So basically the PC is stateless. You can get a new PC, or walk over to a new PC, and because your user profile says what applications you use and what you like on your desktop, you're immediately up and running (although if it's a machine your applications have never been used on, it's a little slow at first, because they're coming down from the server to that local disk).

Anyway, that's just one of many simplifications that will be coming through. Although underneath, the PC is going to have a lot of code working on your behalf, we can hide the complexity. There's a lot of stuff you see in the file system that you shouldn't have to worry about. And uninstalling an application is just too complicated today: did you really clean up your system? If you want to go back to a previous version, it's very complicated. So there's exciting innovation in those areas for even just the next twelve months.

But, processing cycles, they're going to be there. The question is how creative [we can be] about taking advantage of them and that's a question for the whole industry.

It's partly why we've increased our R&D as much as we have. We've asked Nathan Myhrvold to be very ambitious in assuming computing power and show us what can be done with it.

Excitement from Software

PC MAGAZINE: Let me ask one thing. In that very first PC Magazine interview, you talked about how personally excited you get when you finish a project, saying it's a combination of artistry and engineering, and when you finally get it done, it's like a part of you that you've finally put together. Do you still get that excited about software, or have you done so many programs now that it's not quite the same?

BILL GATES: I think my job is more interesting now than ever because of the incredible impact PCs are having. Whether it's empowering people or letting them communicate in a new way, the big change element in business, entertainment, and education is the PC connected up to the Internet, and it's thriving on that. So it's very exciting.

In terms of actual product creation, we pick exactly the people who love creating these products, listening to the customers, and building them. I'm at a different level of management, where I just get to feed in some ideas, give the groups some feedback, and help make sure that what they are doing is very complementary. So I get a little bit of pleasure out of the broad range of products we're doing. I do envy, sometimes, the people who get to come in every day and just work on a product and craft that product and make it great. I can see that kind of excitement. And that's the excitement that leads to the very best work. It's where people push their limits and come up with new ideas.

So, my job is to create a place where it's really fun and really easy to come in and do that. I get a lot of pleasure out of sitting down with these groups. That's why I make sure the majority of my time is spent with product groups. That's what I like and that's where I think I can make a contribution. So the fun is still there, and software creation is still as much a mix of artistry and science as it's ever been.

Milestones

PC MAGAZINE: There are some critical points in the history of the industry where, if something had happened differently, the whole industry might have been a little different. Is there one crucial point, say, in the relationship with IBM, when if IBM had done something differently, or you had done something differently, Microsoft wouldn't be as dominant a company as it is today?

BILL GATES: We're certainly a strong company. I mean, we don't like the word dominant. The first milestone was when they came out to work with us. It is important to remember that the PC project inside IBM was not an important project -- not a major project, not considered something that really would affect the company. It was a little foray by a small group, and they were more interested in proving they could do something quickly than in exactly what happened with the product.

So it was rather stunning when, later, this PC took over the word processing group there -- the Display Writer -- and all the other small-business activities they had. After that, the group at IBM we were working with was so large that there were many projects to replace Microsoft, where IBM wanted to do those things internally. There was a major milestone where they adopted a strategy called SAA, Systems Application Architecture, and decided everything on all their computers should be more consistent. Unfortunately, that meant the mainframe graphics group, which was in England and had a thing called GDDM, got to say the new PC operating system should use their standards instead of what we'd been doing with Windows.

So we went to IBM and said, "We've got to be able to work with you. You want to do GDDM. We think that's a mistake, because it's not consistent with Windows, but we'll be glad to work with you on it." If OS/2 had been built to be a superset of Windows then, OS/2 could have caught on earlier. And there were many points where we tried to get things moving in that direction.

Actually, at first with OS/2 they wanted Top View compatibility; we even bought a company to get Top View compatibility, and then they finally realized none of their customers really wanted it. It was more a matter of pride on their part. Fortunately, the company we bought was headed by Nathan Myhrvold, and just having him and his sharp people at Microsoft was well worth what we paid. That was actually a favor IBM did for us; they gave us a reason to bring that group on board.

A big milestone was that IBM didn't trust the 386. They didn't think it would get done. So we encouraged Compaq to go ahead and just do a 386 machine. But that was pretty scary because people thought "Well the standards are all set by IBM, so how could somebody go ahead when IBM hasn't shown how they're going to use the 386." So that was a big milestone.

That was the first time people started to get a sense that it wasn't just IBM setting the standards -- that this industry sort of had a life of its own, and companies like Compaq and Intel were in there doing new things that people should pay attention to. It took a while to get past the doubt about whether all these non-IBM machines really were compatible. That 1986 milestone was a big part of that.

Then the next milestone was when IBM got really confident they could do the software themselves. So they kept the OS/2 name, and they didn't end up using all the work we'd been doing to make OS/2 really strong -- which became Windows NT.

The whole time we were working with IBM, we expected them to set networking standards. Part of the reason we worked with IBM was that they were clearly in a position to set standards. All the things you think of today as Internet standards -- most of those are things you would have expected them to come up with, and to retain some degree of control and leadership over.

Take office productivity software; everybody thought IBM would be the primary supplier there. They had a product called OfficeVision, which was going to be the centerpiece of making OS/2 popular. But OfficeVision eventually just got shut down, because it just didn't work. There's a certain irony that, many years later, IBM came back and bought Lotus to fill that hole in their strategy.

Choosing the 8086

PC MAGAZINE: It's interesting. Let's go back a little bit earlier. You had talked about how you helped convince IBM to use the 8088/8086. Some people at IBM have said that they had already made the decision before they came to visit you.

BILL GATES: Oh, no chance. No one says that. Go ask Jack Sams, the guy who would know. It was all about doing an 8-bit machine. It was Bill Lowe's project. Bill Sydnes was his engineering manager, and Lou Eggebrecht was doing all the real engineering work. And they wanted to do a machine better than the Apple II. When they first came up on the preliminary visit, Intel hadn't made the commitment that they would do a low-priced 8088. In terms of the schedule they had, which was hard-core, an 8-bit machine seemed like the right choice. By doing a 16-bit machine, they took some schedule risk.

Now, Lou really wanted to do it. In fact, Lou went down to Motorola -- because we were excited about the linear address space of the 68000 -- but it just wasn't debugged enough to make the schedule. Intel did come in with the good price, and so it became a 16-bit machine. Actually, a rival division had gone to Matsushita to get a machine, a Z80-based machine. This machine had many code names; one of them was Chess. The Far East procurement project, through IBM Japan, was code-named Go. We delivered software to Matsushita for that -- then they decided the work in Boca would prevail.

Selling DOS to IBM

PC MAGAZINE: When IBM came to visit you that first time there are a couple of different stories about what happened with DRI [Digital Research] and all that. Did you send them to DRI at one point or did you know when they came there that you could get this operating system? Obviously, you bought QDOS later.

BILL GATES: No, I called Gary [Kildall] and said that I had Jack Sams from IBM with me -- Jack was in the office at the time -- and that Jack wanted to come down and visit him. We had been talking a lot with Digital Research about their 16-bit work, because we'd done a stand-alone BASIC that had its own file system, and we were running it on various 8086 machines. The previous June, we had gone with Seattle Computer Products and shown the stand-alone 8086 BASIC at the National Computer Conference.

So Gary said OK, he'd meet with Jack Sams. On a lot of those CP/M deals, we had done the adaptation for the machine... because Digital Research just didn't do that. So we talked about whether they were serious about the 8086. That got hung up. The IBM guys flew down there, and they couldn't get the nondisclosure signed -- IBM nondisclosures are pretty unreasonable; they're very one-sided. We had just gone ahead and signed the thing. But they didn't.

That's when we said, "Boy, it'd actually be a real mistake for us to do 80 percent of the software but not do this DOS piece, so we should find a way to do the DOS piece." And that's when Paul Allen and Steve Ballmer closed the deal to get the work that Tim Patterson had done, which at the time was called QDOS, and we hired Tim Patterson. So Tim is the creator of MS-DOS; working with some people at Microsoft, he created MS-DOS version 1.

Subsequently, Digital Research woke up to the fact that this was a pretty important project and convinced IBM to offer their product as well. But they priced it very high, and because they came in really late, some of the applications IBM had gotten were on MS-DOS.

So then there was about a two-year competition between us. I was trying to get more applications done on MS-DOS than on CP/M-86. I would chart every issue of PC Magazine; I'd look at all the ads to see how many programs would say just CP/M-86, how many would say just MS-DOS, and how many would say both.

This evangelization was born as part of that competition. A major coup was that I got to know Mitch Kapor [of Lotus], and they decided not to put Lotus 1-2-3 on CP/M-86. That's also how we convinced DEC: they had been about to do a CP/M-only machine, and we convinced them they needed to offer MS-DOS. So there were many twists and turns, and it took about two or three years before people could see clearly that MS-DOS was the primary operating system.

There was even the UCSD P-System, which was also the operating system IBM licensed to put on the Display Writer, which had a pretty big installed base. They paid those SofTech guys a huge amount of money -- like 20 million, which back in those days was an unbelievable amount -- and we didn't get paid nearly that. But we retained the rights to do the licensing to people doing compatible machines. That was the key issue in our contract: that IBM wouldn't be able to license it to other people, that only we could do that.

Succeeding at Software

PC MAGAZINE: So many of the major companies of the early 80's aren't around any more. Was it the law of averages? Or was it some failure of vision? Or was it just the presence that you and some of the other larger companies had?

BILL GATES: At any milestone in our industry, you look at the companies still around, and the turnover rate is pretty unbelievable. Look at the early PC hardware companies; most are gone. Take the guys who did 8-bit machines -- IMSAI, MITS, Processor Technology, NorthStar -- they disappeared. Then there was a wave of PC guys: Columbia, Dyna, Mindset. The same in the software field: look at the first issue of PC Magazine, and 90 percent of those software companies aren't there.

But it's not unique to the PC; look at the first issues of Macworld magazine, and 90 percent of those companies aren't around. Take the Internet now: among the early pioneers there's already quite a bit of turnover, and it's still early days. When you get these highly uncertain markets and everybody's sort of rushing in, you get very few big successes and lots of people who don't end up with enough customers to make it work.

In our case, we had a focus on doing software and a pretty long-term approach to how we wanted to build up our software expertise. The vision that software would be a key element in unlocking the power of the machine turned out to be right. So I'd say there are a lot of things: the focus we had, the approach we had, certainly an element of luck in terms of who we were able to hire. We had a clear sense of key design wins and how important they were. Whether it was IBM or Japan or large corporate users, you really could test yourself by saying, "Look, if this guy doesn't buy, it must be that your product's not good enough, so just keep making it better."

Microsoft's Image

PC MAGAZINE: There are a fair number of people in the industry who don't particularly like Microsoft. There's an image in the industry about Microsoft; I'm sure you hear it at least as much as, if not more than, I do. Why do you think that's there? Do you think there's an element of truth to the idea that Microsoft is sometimes a little pushy?

BILL GATES: If you survey the software industry, which we do quite a bit, most of the companies build on our platform. And because of the technical support we give through our Developer Network, and the fact that we've created this standard platform for them to target, most software developers love Microsoft. There are a few large guys who are very competitive with us, or intimidated by our success or the breadth of our product line, and who would love to say anything that could slow us down.

But the reason there are so many successful companies is the framework and standards that we created and the support we put behind them, and that's why, across the breadth of software companies, we see incredibly positive attitudes toward Microsoft.

This is an industry where people come in to do things -- you know, how do you internationalize software, how do you build an international distribution network, how do you keep piracy from being a problem? A lot of the investments we've made are for the success of the industry. Name a country: we were the first people to bring a lawsuit about software copying there, and to make sure the distribution structure went into those countries, and we helped other software developers get into those countries, to grow the market for us and for the industry as a whole. It's a very positive story.

In computing in the last 20 years, think of who's been successful: it's Microsoft's partners who have been successful. And there are no exceptions. You know, Intel and Microsoft working together -- there's no other story of two companies who've been able to share ideas, help grow the market, and each do so well.

And I'd say the same thing about Compaq, and about our partners in distribution -- tens of thousands of small companies that come in, put these systems together, and build solutions. We have competitors, and they're there.

Competition

PC MAGAZINE: Lots of software companies are building on your platform but still competitive with some part or another of your strategy.

OS/2 or Windows

PC MAGAZINE: One of the big issues is the OS/2/Windows era. You were working on Windows and OS/2, but you were really beating the drum for OS/2 Presentation Manager. Then OS/2 didn't take off and Windows did. Do you think there was a way, in retrospect, of better communicating your thinking about OS/2 to developers when there was doubt? Because so many feel -- or felt at the time, at least -- that they had been misled.

BILL GATES: It's nonsense. I want to meet the guy who can say that to me. We were out pushing them to do Windows applications, and we were clear as day about exactly what Microsoft was doing. We were the only developer who explained exactly when we were going to release Windows versions of our applications and when we were going to do OS/2 versions.

Now, it turns out we did the first OS/2 spreadsheet, which was OS/2 Excel. We did the first OS/2 word processor, which was OS/2 Word. And we were the ones who got in there and supported OS/2. But we were clear every step of the way that those versions would come out after the Windows versions. The big issue here is that the other developers in productivity tools didn't believe in a graphical user interface. And so what happened on the Mac, on Windows, and on OS/2 was that Microsoft dominated all three. Of the OS/2 productivity applications ever sold, we sold over 90 percent. Now, compared to the Windows number, that's a tiny number. But that was our success. Nothing unusual there.

People thought there was one operating system that just Microsoft was backing and one that Microsoft and IBM were backing. In those days they thought that meant the one IBM was involved in had a better chance of succeeding -- as they saw it, that all but guaranteed the other one would not be successful. And they just weren't serious about the graphical interface. When they finally did get their OS/2 applications out, which was very, very late, they just weren't competitive -- just like their Macintosh applications were not competitive. We have a higher market share on the Mac than we have on the Windows platform because of the early days, when they just didn't care about it.

Apple's New Platform

PC MAGAZINE: Let's talk about Apple for a minute, and about where it's going. What do you think about Apple's future in handling the NextStep adoption? Will that create new opportunities for Microsoft on that platform? How will it affect how Microsoft develops for that platform?

BILL GATES: I was just meeting with Gil Amelio and Steve Jobs yesterday. Steve was saying, "You took a chance on us once and it worked well." And I said, "You're absolutely right, Steve. And then the next time you asked me to do it, I didn't. And that worked out pretty well, too!" And so we were just joking around about what the history had been. Apple did a lot for the industry in taking the Xerox graphics work and creating a volume product that was a great implementation of the graphical interface. And Microsoft was the software company there from the beginning.

We continue to have a lot of resources focused on the Macintosh and on doing new things for our customers. We did a lot of neat new browser work, and we're hard at work moving Office 97 over to the Macintosh. So right now we're focused on System 7 and new deliveries there. In terms of this Rhapsody, where they take the NextStep, or OpenStep, product and put it on their hardware, we're going to take a hard look at that. That's why we're meeting with Apple, and Apple's doing a good job reaching out to us.

You'll see in the press that we've been very supportive of Apple. Our applications made it possible to have a mixed environment, because the interfaces of Excel on the Mac and Excel on Windows were very similar and you could exchange data files. That allowed for a lot of coexistence, so the Mac could take its strengths in things like publishing and move into corporate environments.

Top Ten Technologies

PC MAGAZINE: As part of our anniversary issue, one of the things we're doing is picking ten technologies we think will have the biggest impact on the industry in the next 15 years. What would you say are the technologies you expect to have the most impact over the next 15 years?

BILL GATES: Speech recognition, natural-language understanding, automatic learning, flat-screen displays, optical fiber. I'd say those are the key technologies.

PC MAGAZINE: Fifteen years from now will there still be a concept of the PC? Or will PC technology be in so many different devices that people won't even think about it?

BILL GATES: We'll certainly have a device with a large enough screen that you sit down and read information and create information on that device.

Then, anywhere you go, if you find a screen, you can just log in and your personal environment will be there. If you're in a waiting room in an office and they've got a little flat screen there, you can just pick it up, log on, and use it. You won't necessarily always have to take hardware with you, because the network will retain the things that you care about.

Computing will be ubiquitous so that you'll have the pocket device, the wall-size device, and depending on how you set things up, you'll be able to talk to the computer when you're not really paying attention to it. So the term PC will call up a different image than it does right now.

But what you thought of 15 years ago when you said the term PC and what you think of now are vastly different. These sleek little notebooks that people carry around with so much power are just so radically different from the Compaq sewing machine that's less than 15 years old.


From the Archives: Gordon Moore
March 25, 1997

(Editors' Note: This article originally appeared in the March 25, 1997 issue of PC Magazine.)

Developing the Integrated Circuit

PC MAGAZINE: What was it like being at Shockley Semiconductor in 1956 and 1957? Did you guys know what you were building?

GORDON MOORE: Well, we knew what Shockley's original goal was, which was to make a silicon transistor, and he changed his mind in the middle of things and decided to make a much more obscure device -- a four-layer diode. But we were mainly just developing the technology that was necessary to make a device, and we never really developed a product, or even defined one, in the time I was at Shockley. We got at least an early grounding in silicon and in the kinds of problems that had to be solved in order to make anything useful, and that gave us a sense of direction when we set up Fairchild.

PC MAGAZINE: And so you set up Fairchild specifically to do the integrated circuit?

GORDON MOORE: No, no, the integrated circuit came later. It was still to do a double-diffused silicon transistor, something that Bell Labs had built in the laboratory but no one was making in production. And Fairchild took on the job of trying to make these into a manufacturable device. There was a lot of technology that had to be developed to get to that point. And after we got our first few transistors out... our first transistors were the "mesa" structure. We actually had a little mesa-shaped area on the top of the silicon that contained the active parts of the transistor. But it had all of its sensitive surfaces exposed to the ambient. One of the fellows at Fairchild came up with the idea for the planar transistor, where you took advantage of the fact that the junctions actually diffused in under the silicon oxide. He proposed leaving the oxide over these junctions as protection from the crud that got on the sides of the mesas.

PC MAGAZINE: That was Jean Hoerni?

GORDON MOORE: That was Jean Hoerni. And we couldn't even try it for a while, because it required four masking steps. We were developing the first lithography to use in manufacturing, and the approach we took only let us make three masks that were indexed, so it had to wait until we got our first couple of products into manufacturing before we could even try it. And it turned out to work far better than anybody imagined; it really did take care of one of the major problems we'd had before. But more than that, it was really the path that let us make a practical integrated circuit.

So when the idea of the integrated circuit was being kicked around -- you know, Jack Kilby [of Texas Instruments] made, again, a laboratory sample of an integrated circuit -- Bob Noyce saw how he could take the planar technology and extend it: isolate the transistors from one another electrically, and connect them to make the circuit by putting metal over the top of the silicon oxide that was covering the junction. So, adding several more steps to the planar transistor, we actually made the first practical integrated circuits and got them to market in the early '60s; I think it was either 1961 or 1962 that we shipped the first commercial integrated circuits.

PC MAGAZINE: So you all knew about what Kilby was doing at that point?

GORDON MOORE: Yeah, we did, yeah. And Kilby certainly had the idea and built the demonstration model of an integrated circuit, but his still had wire bonds in it and etched regions to make resistors; it had none of the elegance of Noyce's invention. Now, Noyce and Kilby are often described as co-inventors of the integrated circuit; actually, in my view, their contributions are completely different. Kilby showed that by hook or by crook you could build something that had some useful properties. Noyce took the planar idea and extended it to show how to do it practically.

PC MAGAZINE: When he did it, was it clear how momentous this was?

GORDON MOORE: Of course not!

PC MAGAZINE: Did you have any idea that this would build a billion-dollar company?

GORDON MOORE: No, no, no. You've got to recognize that at the time, the industry was small compared with a billion dollars. And we had no idea how important it was. In fact, it wasn't something the world beat a path to our door to buy. Our customers were typically the circuit designers in the large system companies. When you walked in to a circuit designer and said, "Hey, I've got something that's going to put you out of business," it wasn't very readily accepted. And they had all kinds of arguments for why this was a bad idea. You couldn't test the components individually, so you could never figure out if it was going to be reliable. The resistors we used were plus or minus 30 percent and had terrible temperature coefficients. "We would never use a resistor like that in a circuit" -- all kinds of arguments.

And then Noyce made one other major contribution. He said, "OK, we'll sell it to you for less than you can buy the components to build it yourself." All of a sudden, integrated circuits started to be something useful. And you know, that's really been the big advantage integrated circuits have brought. They do a lot of other things, but what they really do is lower the cost of electronics. That put us on a whole major new path, but it was still several years before it got a lot of traction and really became a big movement.

Moore's Law

PC MAGAZINE: While you were still at Fairchild, you wrote the paper in which "Moore's Law" first appeared. Of course, you didn't call it Moore's Law at that point.

GORDON MOORE: No, certainly not.

PC MAGAZINE: But you were talking about the doubling of transistors at a predictable rate, at that point I think it was once every. . .

GORDON MOORE: . . .every year, at that point.

PC MAGAZINE: Later it became once every 18 months.

GORDON MOORE: Well, I really said every two years. That was the fairly early days of integrated circuits. The most complicated chip around was still a laboratory model at Fairchild at that time; it had about 60 components--that was transistors plus resistors. And I was writing an article for the 35th anniversary edition of another magazine--Electronics--and they wanted me to predict the future of semiconductor components for the next 10 years. And I was convinced the integrated circuit was going to be important, because you could begin to see this real impact on lowering the cost, so I looked back. We'd been about doubling every year since the first planar transistor--I call that Year Zero, in 1959, with one transistor. We'd gotten up to 64 in six years, in '65, so I said aha, it's been doubling every year. I just said, okay, it's going to continue to do that for 10 years. So I extrapolated a factor of a thousand increase in the complexity of circuits, you know, not expecting any real accuracy, but wanting to get across this idea of the way the components were going to be used... For that ten years we followed that doubling every year really quite precisely.

Somebody else dubbed it Moore's Law--I think it was Carver Mead from Caltech. He tends to do things like that.
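
[Editor's note: a minimal Python sketch of the extrapolation Moore describes--assuming Year Zero is 1959, one component, and a doubling every year:]

    # Moore's 1965 extrapolation: components per chip double every year.
    # Year Zero is 1959, the first planar transistor -- one component.
    def components(year, base_year=1959):
        return 2 ** (year - base_year)

    print(components(1965))                      # 64, the Fairchild lab model
    print(components(1975))                      # 65,536 -- the 10-year projection
    print(components(1975) // components(1965))  # 1024: the "factor of a thousand"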

Founding Intel

PC MAGAZINE: You left Fairchild in '68 to found Intel, you and Noyce. Can you tell us why you left, and what you wanted Intel to be back then?

GORDON MOORE: Well, these are two different questions. Why we left: Fairchild's Semiconductor Division at that time was a West Coast operation; it was kind of this tail that was wagging the dog, where the dog was Fairchild Camera and Instrument, an East Coast-based company. Fairchild Camera and Instrument actually went through two CEOs within six months and was being run by a three-man committee of the board of directors while they were looking on the outside for somebody else to run it. Noyce was the logical internal candidate and he was clearly being passed over, which made him relatively unhappy, so he decided he was going to leave. I was running the laboratory, frustrated with the increasing difficulty of moving things from the laboratory into production, and I could see that if they were going outside to hire somebody the company was going to change rather significantly. So I thought I'd better leave before it happened rather than after, so they could do whatever they wanted with the structure. So we both left and then decided that we would set up a company and look at a different way of getting leverage in the semiconductor industry.

At that time it was easy to build something more complex than you could define as a useful product. As you made a circuit more complex it tended to become unique--used only once, in one computer system or something like that--and the design costs just ate you up on that kind of a product. So while you could build them, there were no products that it made economic sense to do. We thought we saw in semiconductor memory an opportunity to build a general-purpose function of essentially arbitrary complexity that could shoot at a fairly big established market, replacing magnetic cores in particular. So we thought we could start a new company, pursue semiconductor memory as a first example of a product that used a very complex circuit made in large volume, and do that without directly competing with the established semiconductor companies.

And we chose new technology that we thought would be specifically appropriate for memory--actually I called this our "Goldilocks" strategy in retrospect. We chose three technologies. One was a Schottky bipolar technology, and our first product was actually a 64-bit Schottky bipolar memory. We chose silicon gate MOS. Up to that time silicon gate had been demonstrated in individual devices, but it had never been put into production, and it had a lot of attractive features. And then we were going to make a multi-chip assembly of these things, so we could pack a lot of memory into a small area. It's the Goldilocks strategy because it turns out that the bipolar was too easy--it worked so well that it was easy for the established companies to copy it. There weren't any tough technological problems, so we didn't have a long-term advantage there. The multi-chip thing was too hard--we still don't do it, at least cost-effectively--but the silicon gate was just right. While we were focused on it, we put all of our energy into solving the few really tough technical problems to make it a production technology. We got by them pretty easily, but it was hard enough that companies that wanted to get into the business didn't put the focus on it--you know the big companies never put their 10 best people on one project like that--so it was something like seven years before the established companies got around to figuring out how to do silicon gate MOS.

So we had a long time there where we had an opportunity to expand in what turns out to be the mainstream of the technology, without much direct competition. So it worked out just right. It's the kind of thing, you know, you can't plan something like that, we just hit a technology that was just at the right degree of difficulty to give a startup a really big kick. And of course also at that time, after we got the first few memory products out, we did this calculator project where we changed the design from a bunch of custom circuits--which we couldn't begin to handle with our limited engineering resources--to the idea of a microprocessor, Ted Hoff's suggestion. And we saw we could do a lot of other things, too. So that was kind of our next example, beyond memory, of the complex semiconductor devices you could build in large volume, because by programming you could do a whole bunch of different functions with it.

Memory

PC MAGAZINE: Let's go back to memory just a little bit, and then we'll go back to microprocessors. You, or Intel, invented SRAM and later DRAM, which was, you know, your major product for a long period of time, right?

GORDON MOORE: Well, we brought the first ones to market. The idea that you could use a flip-flop as a memory cell I guess had been around for a while, and IBM had actually used it some in their mainframes--high-speed bipolar scratchpads. We were the first ones to bring it out as a commercial product. The DRAM is a little fuzzier in its origin; there again we were the first ones to bring it to market, but I think the idea of the DRAM actually came from someplace else. The actual implementation of it was an Intel invention. There were other ideas around, of course. And then the EPROM, which was the third of this series of early memories, was clearly an Intel invention. So we really had the three main branches of the memory business in our first few years.

PC MAGAZINE: For a long time you were a memory company, and then comes the mid-'80s and suddenly all memory is done elsewhere--I mean, for a while it was almost all done overseas; there's still a little done in the U.S.

GORDON MOORE: Well, we still do a moderate amount of memory--flash these days principally, some static RAM that we have to use for caches for our products--but it's a smaller part of the business. We were pursuing our memory businesses and also trying to develop the microprocessor business all through the '70s and '80s. Like the integrated circuit, the microprocessor took a while to get traction and become a large-volume product. For many years we did more revenue in development systems for microprocessors than we did in the chips themselves. You know, we used to sell what we called "blue boxes," really special-purpose computers for designing and debugging the hardware and the software using microprocessors--that was a pretty good business for us. We always hoped that each one of these development systems we sold would eventually be a whole bunch of component business. And I guess it finally turned out to be the case. Of course, in those days the sales were essentially all to what we now call embedded control applications, and that was really the kind of application that Ted Hoff saw for the microprocessor when he first came up with the idea.

Microprocessors

PC MAGAZINE: When the microprocessor started to get used in personal computers, '75 and then on out, did you have any idea of how prevalent these things would be? Did you think these things were going to be everywhere, sort of the way they are?

GORDON MOORE: Certainly I didn't, and if anybody else around here did they kept it to themselves. You know, the first MITS machine--I guess you'd call it a PC now--the Altair, was just a hobby device where the inputs were toggle switches and the outputs were LEDs. You could demonstrate the way a computer worked, but it was a tough way to do any practical computing. And I even turned down the idea of a home computer sometime in that period. One of our engineers came in with the idea that you could build a computer and put it in the home, and I kind of asked him what it was good for, and the only application I got back was that the housewife could put her recipes on it. I could imagine my wife sitting there with a computer by the stove... it didn't really look very practical.

In fact, even when Steve Jobs came over and showed us what was going on at Apple, you know, I viewed it as just one more of the hundreds of applications that existed for microprocessors, and didn't appreciate that it was a significant new direction.

The IBM PC

PC MAGAZINE: What about when, you know, you go out another three years or so, to 1980? IBM is around and they're looking for a microprocessor for what would become the PC, and obviously you guys sold them the 8088. What was going through your head at that point--had it become important by then, was this an important sell?

GORDON MOORE: IBM was always an important customer, or potential customer, just because of their sheer size. But I'd put that in a somewhat different perspective. That was a period of time when we were competing very strongly with Motorola for design wins. They had a good part out on the market, and we were concerned that they were going to run away with a whole generation of designs, so we put together a very aggressive marketing program where we wanted to get 2,000 design wins--I think it was in the '79-'80 time period, over something like a year span--and went aggressively out to pitch every design we could. Well, one of those designs happened to be the IBM PC. In fact the salesman that was calling on the account wasn't even told what the project was--it was, you know, cover the conference room with a blanket in between and ask questions. You would answer the questions but never see what the product being worked on was, and of course eventually we found out it was the PC. But even IBM expected only to sell a few hundred thousand of them over the lifetime of the product. And while we thought it was a significant design win, it was one of something like 3,000 that we had in that time. And I certainly didn't realize that it was going to be the future of Intel, even when the IBM PC got announced.

PC MAGAZINE: Now even the 8086 design was done fairly quickly, wasn't it? I mean, you were working on another project at the time.

GORDON MOORE: Well, we were--we had a very aggressive project to make the ultimate microprocessor. Not quite, but after we got the 8080 to market we embarked on a program saying, okay, we've done enough, we've got a pretty good idea how these things are, we've got one more chance to do it right, so throw away everything you've got, start with a clean sheet of paper, and develop the right microprocessor. And we put essentially every idea known to computer science into it. It implemented objects in hardware, but it had so much in it that with the technology of the day, by the time we built it, we had to make so many sacrifices to keep the functionality that it was as slow as a dog. So actually, while it became useful for teaching computer science in universities, it didn't really get into any systems. But one of our guys really beat on us that we needed a 16-bit extension, and we set up another project which became the 8086 and the 8088. So we did do those relatively rapidly and were able to get the products to market. Fortunately we had them at the right time, at the right place.

Becoming a Microprocessor Company

PC MAGAZINE: In the mid '80s, just as the microprocessor business started to happen and there were really PCs, Intel went through some pretty tough times.

GORDON MOORE: Well, the real tough time was when the world fell apart in the middle of '84, down into '85 and '86. It was kind of euphoria followed by disaster. In the '83, first-part-of-'84 time period it didn't look like we'd ever catch up with demand. That's when we had the 80286, we'd just introduced the 8051 microcontroller, and we had an 80186 also, which you may or may not remember. And this was a whole generation of products, and it looked like the world needed three times as many as we could possibly produce. So we set up multiple sources for all of these, partially because our customers at that time insisted on it, partially because we just couldn't meet the demand. And then in the middle of '84, demand collapsed; in fact our customers needed about a third as many as they said they were going to need. The industry had just built capacity as fast as it could--Intel included--and we went into one of these periods when prices collapsed. In fact, in that time I think our most profitable product was one of the EPROMs.

Anyhow, in nine months it went from something over 30 dollars to something less than 3, so a 90 percent price drop in nine months. You know, the unit volume keeps growing, but the revenue just drops off a cliff under those circumstances, so that was what hit the industry. It was mostly in DRAMs. The U.S. industry lost an estimated billion dollars--the Japanese lost over two--mostly on DRAMs in that period. That's when we bailed out, and just decided that DRAMs were something where we couldn't see getting a return. That was a very fortunate decision. Before that we had actually three development sites: Oregon doing DRAMs, Livermore doing SRAMs, and here doing EPROMs, with the logic technology kind of using the SRAM technology, thinking they were pretty much the same. After we bailed out of the DRAM and essentially the SRAM, we focused virtually all of our technology development on stuff for making microprocessors, and that was when we really got into multi-layer metals--our first two-layer-metal process; the 80386 was the first product on that. That was a major change in direction, and I think it really served us well. If we hadn't gotten out of DRAMs, we'd have had a lot of trouble ever putting the technology effort the microprocessors needed behind them.
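
[Editor's note: the arithmetic of that collapse, as a small Python sketch--the 5-percent-a-month unit growth is a hypothetical figure for illustration, not from the interview:]

    # A ~$30 part falling to ~$3 in nine months, as Moore describes.
    start_price, end_price, months = 30.0, 3.0, 9
    monthly = (end_price / start_price) ** (1 / months)
    print(f"price factor per month: {monthly:.3f}")          # ~0.774: ~23% erosion/month
    print(f"total drop: {1 - end_price / start_price:.0%}")  # 90%

    # Even with unit volume growing, revenue drops off a cliff:
    unit_growth = 1.05                                       # hypothetical 5%/month
    print(f"revenue after 9 months: {(monthly * unit_growth) ** months:.0%}")  # ~16%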

PC MAGAZINE: Obviously the 386 and 486 were wildly successful in their day--the 486 almost until now. I mean, at that point, really this became a microprocessor company.

GORDON MOORE: We really did. We'd built a pretty good business at the 286 level, and by that time we appreciated that the PC was a very important segment of the business. The 386 was different from previous generations in that we didn't have to give the design to a second source in order to get it used. Up until that time, semiconductor companies had a reputation of losing the process occasionally--you know, they wouldn't be able to make anything for a couple of months or something. Customers really insisted upon multiple sources in order to design in your product, so with the 286, for example, we set up AMD, we set up Siemens, we set up Fujitsu. We transferred them the masks essentially for nothing in order to give our customers the multiple sources they needed to use them. With the 386--we'd just been through this business of losing the profits of a whole generation of products because of the overcapacity that came in the '84-'85 time period--we kind of decided, heck, if we can't make everything our customers need this time around, they'll be short some product. We thought there was enough software continuity that a software-compatible processor would be used even if it was sole-sourced. We blasted ahead as a sole-source supplier of the 386. You know, we didn't officially set up any second sources; other people obviously came along later and took a portion of the market, but that was, I think, another major change in the way the industry developed.

RISC Processors

PC MAGAZINE: In the late '80s, early '90s, there was a lot of talk about RISC processors, and obviously a lot came to market and things like that. You obviously developed a few RISC processors too--the 860, the 960, and things like that. Were you worried that, you know, basically the x86 processor was sort of older technology and this was going to somehow surpass it as everybody was sort of talking about at that point? Or were you confident all along that you'd be able to keep up?

GORDON MOORE: You know, we considered it principally marketing hype, but that was a tough sell, trying to convince people that RISC was marketing, not technology. There were some things you could recognize. If you threw everything away and started with a clean sheet of paper again, which was what the RISC people got to do, you could avoid making some of the mistakes that were made in some of the early architectures: put in more registers, make all the instructions the same length, things like that. But we had this tremendous advantage of all the software people had bought that ran on our instruction set, and we were convinced that was a very important advantage.

Everything that came along, short of throwing away some of the warts on the existing architecture, we could take advantage of. We could make superscalars, we could pipeline, we could do all of these things that were getting the performance. It might cost us some more transistors, but as long as we had a lot higher unit volume than any of the RISC processors, we could afford to put more transistors in to maintain the compatibility. For a long time we had more transistors but typically in about the same area, because our technology--our design technology in particular--was designed for very high packing density. Now it's gotten to a point where some of the RISC processors have more transistors than we have on ours, because they've decided they want to add a lot of these other features. We were always concerned--you know Andy [Grove]'s book, "Only the Paranoid Survive"; we're survivors by that definition. We look at all of these things as very serious threats and respond to them. And I'm sure the rate at which the x86 family evolved was influenced considerably by the pressure we were getting from the other approaches that were being pursued. That was the main thing. You know, we knew enough about RISC processors--obviously we designed them--but we really thought that the advantage of software continuity in the market we were in was much more important. Now, if we'd been focusing on engineering workstations we might've taken a completely different tack, because there, there wasn't nearly the continuity from one generation of software to another. That's why that was the easiest target for the RISC processors.

Future Architectures

PC MAGAZINE: In recent years, Intel has gotten into many things beyond the processor itself. It's got its architecture labs, it's trying to define the PC architecture, you've done a lot of networking stuff, you've done a lot of chip set and motherboard design. Where do you see this all going long term?

GORDON MOORE: Well, first of all, we see a need for some kind of central architectural control of the way the whole platform evolves. Hopefully something that will leave a lot of room for the individual players to innovate, but something that will allow us to really keep the compatibility--really one of our principal advantages--together, while also removing performance limitations as we see them come along.

We added things like the PCI bus--you know, the bus was starting to be a limit--and it was a giant leap in bus performance. Now we've got the Universal Serial Bus, which will make it very much easier to add things onto the PC. Ease of use is one of the very important things we have to pursue. We think Intel is in the best position to give guidance for those kinds of things. You may have seen that--I guess it was today [January 7, 1997]--Microsoft, Toshiba, and Intel announced an architecture for power management [ACPI]. Again, a place where I think we took a leadership role, brought in the major players, and together we came up with a kind of open standard. But we think that we play a very important role there in trying to herd this bunch of cats that are really making the final systems. So I see us, you know, expanding our role there and trying to get a uniform kernel that will maintain the compatibility while still trying to leave enough room for our customers to innovate.

Speed and Power

PC MAGAZINE: Let's talk a little bit about some future designs, RISC and things like that. I mean, there have been a lot of questions about microprocessor design, and I'd just like to go through some of the techniques people are talking about and get your take on them. Obviously everyone wants faster clock speeds; there's a bit of a tradeoff on power usage...

GORDON MOORE: A bit!! A lot. . .

PC MAGAZINE: A lot on power usage. Okay, of course. . .

GORDON MOORE: Well, that's one of the easiest ways to get higher performance. We have really tapped most of the ideas that have come along in straight architecture to try to get performance. So I guess there's still more that could be done with some kind of multithreading, but that requires that the software be put together to take maximum advantage of it. And you know, people will make bigger caches and deeper pipelines and all that kind of thing, as much as we can architecturally, but the brute-force way of speeding it up is going to higher clock frequencies. And I think we'll see that be a very important part of improving the performance over the next several years... I expect to see gigahertz clocks in the not-very-distant future.

Cooling them is an interesting question--as the clock speed goes up the heat goes up, which means the voltage probably has to go down. Fortunately the power goes down as the square of the voltage, but if we get below about 1 volt, I get uncomfortable that you've really got something you can design and build--although there's no theoretical reason you can't design quite a bit lower than that. But you know you take a 50-watt device operating at 1 volt, you've got to get 50 amps around there with millivolt voltage drops--you'd have big copper bus bars coming into the chip somehow to distribute the power. It gets to be a very challenging deal, so there's a lot of good engineering that has to be done to handle the power, clock frequency, and increasing complexity simultaneously.
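
[Editor's note: the power-delivery arithmetic behind Moore's 50-amp example, as a short Python sketch--the wattage and voltages are illustrative figures, not Intel specifications:]

    # P = V * I: at a fixed power budget, lower supply voltage means more current.
    power_w = 50.0                       # hypothetical 50-watt processor
    for vdd in (3.3, 1.0, 0.5):
        print(f"Vdd = {vdd} V  ->  {power_w / vdd:5.1f} A into the package")

    # Dynamic (switching) power scales as the square of the voltage at a given
    # clock, which is why lowering Vdd is attractive despite the current problem:
    print(f"{power_w * (0.5 / 1.0) ** 2:.1f} W at half the voltage, same clock")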

Instruction Sets

PC MAGAZINE: Basically since the 386 the architecture of the basic chip hasn't changed that much--I mean, the instruction set, until MMX, didn't change at all.

GORDON MOORE: The instruction set has stayed the same, but we've added a lot of other things. The 486 put the floating point on there, and all the pipelining and superscalar execution and that kind of stuff has come along. We call that part of the architecture, even though the instruction set stays the same. MMX of course is a change in that.

PC MAGAZINE: Obviously now you're talking about VLIW and 64-bit stuff.

GORDON MOORE: We're not talking about that at all. I mean, we're doing a 64-bit program but we're not describing what it is.

PC MAGAZINE: Does the architecture, I mean does the instruction set have to change when all this stuff happens?

GORDON MOORE: Oh yeah, sure, 64 bits means new instructions. But it will still run the older software compatibly. You know, that's one thing we have, is the idea of carrying a compatible family along--even if we have to put two processors on the chip, one 32-bit and one 64-bit, it's going to run that old software effectively.

PC MAGAZINE: Speaking about putting two processors on a chip, one of the things people are talking about is whether you go with a single processor on a chip or long-term you end up putting more parallelism within the chip itself. What do you think?

GORDON MOORE: Well, as you put more and more transistors on, the easiest way to use them is with more parallelism. The problem is that the software has to be written to take advantage of it. While we seem to be able to keep two of them pretty well fed, my understanding--and this gets out of my area of competence--is that the software that's out there now won't take real advantage of, say, four or eight processors on a chip, and you'd have to do something different. So I don't know quite how that's going to evolve.
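
[Editor's note: one standard way to quantify Moore's point--not a framing he uses here--is Amdahl's law, where p is the fraction of the work the software actually parallelizes. A minimal Python sketch:]

    # Amdahl's law: speedup on n processors = 1 / ((1 - p) + p / n).
    def speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    for n in (2, 4, 8):
        print(f"{n} processors: {speedup(0.90, n):.2f}x")
    # 2 -> 1.82x, 4 -> 3.08x, 8 -> 4.71x: even software that is 90 percent
    # parallel gets less than 5x out of eight processors.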

Manufacturing Limits

PC MAGAZINE: Let's talk about manufacturing. When you do the Moore's Law charts, you have all of this wonderful doubling every 18 months or whatever. How long do you see that going on?

GORDON MOORE: Until I retire!

PC MAGAZINE: When are you retiring?

GORDON MOORE: Well, I was 68 last week [the week of January 1, 1997], so I don't know which will drive what. [The following week Intel announced that Andy Grove will be replacing Gordon Moore as chairman, and Moore will become chairman emeritus.] Actually, there's still quite a bit we can squeeze out of the technology, and I'm amazed at how effectively people have been able to continue spinning out the next versions. There's been kind of a generation of technology every three years, where essentially we double the density. The minimum dimension multiplies by about .7; .7 squared is .49; you get the same thing in half the area. Now, for Intel, our leading production technology is .35-micron, and .25 is moving into production as soon as we can get it cleaned up. That'll be the workhorse in a couple of years. And that's still straight optical lithography. .18, which is kind of the next step, looks like it can be done optically without any dramatic changes in what we're doing.

The step after that looks a little tougher to me, but the guys that have to do it don't seem to be intimidated by it at the moment. The reason is that we'll probably do it with the so-called 193-nanometer light source, the excimer laser, and to do .18 with that, the wavelength and the minimum dimension are about the same; optically that's not too bad. To do .13 with the 193 is a much tougher deal, because you're operating quite a bit below the wavelength. So you probably have to use all of the tricks available--you know, phase-shift masks and this kind of stuff, multi-layer resists--the ideas that exist in the industry but are not a lot of fun. They're not that well developed yet, so there may be another light source that lets us operate at a shorter wavelength. The trouble is, if you go to a shorter wavelength, you essentially run out of transparent materials, so all the optics have to be reflecting. At 193 you've still got fluorides and a few silicas that are transparent, but if you get to a wavelength around that .13, .12, I don't think anything is transparent, so you really get into a different regime. Things like masks: we do masks now on fused silica, and we always shine the light through the mask. We can do that at .18 with the 193-nanometer light, but if we go to .12 we've got to reflect the light off the mask instead. So changing wavelength is not easy there, and I don't know which is going to be more difficult--to use the 193 at two-thirds of the wavelength or to, you know, come up with an all-reflective system and a new light source. There's a lot of work to do, and fortunately we've got several years to do it, as long as we've got all the research in place now to get it done.

Beyond that, I guess once we get to an all-reflective system we can go quite a bit further. But we'll be making a change from the kind of lithography we've been doing for the last dozen years when we abandon transparent optical materials. That's a change, and it's going to be an expensive change at least. Then you eventually get to some kind of a physical limit, and the industry has argued where that is for some time. I think the consensus is it's someplace between .05 and .1 micron minimum dimension, and like most of these limits it keeps pushing further away as we get closer. We're getting down to the point, essentially, where the atomic nature of matter starts to be a real limit. That's a fairly fundamental limit.
So that carries us well into the next century, and at that time we'll be able to put, I don't know, several hundred million or a billion transistors on a logic chip. That leaves phenomenal room for the designers to innovate in how they're going to use those, so I don't see this as stopping innovation in the industry or anything; I just see it focusing more innovation in other directions, so things will advance for a long time.
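
[Editor's note: the scaling arithmetic Moore walks through, as a short Python sketch:]

    # Each ~3-year generation shrinks the minimum dimension by ~0.7x,
    # so area per function halves and density doubles.
    node_um = 0.35
    for gen in range(5):
        print(f"{node_um:.3f} um   (density x{2 ** gen})")
        node_um *= 0.7
    # 0.35 -> 0.245 -> ~0.17 -> ~0.12 -> ~0.08 um: roughly the .25/.18/.13
    # steps he describes, heading toward the 0.05-0.1 um zone where limits loom.

    # The lithography pinch against the 193-nanometer light source:
    for feature_nm in (180, 130):
        print(f"{feature_nm} nm is {feature_nm / 193:.2f}x the wavelength")
    # .18 um is about the wavelength; .13 um is the "two-thirds" he mentions.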

PC MAGAZINE: Are physical limits the constraint there or is it really just the cost of building all these fabs? I mean obviously fabs are very expensive these days, and getting more so.

GORDON MOORE: Well, the atomic nature of matter is really a physical limit. The devices start behaving differently. The leakage currents go up, and when the leakage currents get comparable to your signal currents, you're really in trouble. And that happens someplace in that range. And I have a feeling that it may bite you a bit before that, statistically. You know, we depend on being able to dope semiconductors, for example, by putting impurity atoms in. As you make everything smaller, the number of impurity atoms in the active part of the device is dropping and dropping and dropping. And if you assume they're randomly distributed--and that's the model people usually use--you're going to get fairly significant fluctuations in those. And if you expect a circuit to have a billion transistors that all behave properly, then that's the 8 or 10 sigma [standard deviations from the mean] in the distribution. You might get to the point where, just statistically, you have a few transistors that don't work--and that's all it takes, of course, to wipe out one of these things. So we may actually get bitten statistically a ways away from this limit people are looking at. I think research is going on there; I'm not up to date exactly on what's been done. But it could be an intriguing problem. We certainly haven't seen any evidence of it yet. That's been an amazing thing about this technology--usually something comes up and bites you when you weren't expecting it to. The only place that's happened so far is the soft-error problem in DRAMs. Everything else is just working beautifully.
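
[Editor's note: a rough Python sketch of the statistics Moore is describing--the dopant counts are illustrative, and the sigma target is computed for about one marginal transistor per two billion-transistor chips:]

    import math

    # Random (Poisson) dopant placement: with a mean of N atoms in the active
    # region, the relative fluctuation in the count is about 1/sqrt(N).
    for mean_atoms in (10_000, 1_000, 100):
        print(f"{mean_atoms:>6} atoms: ~{1 / math.sqrt(mean_atoms):.1%} spread")

    # How many sigma must every transistor stay within so that a
    # billion-transistor chip usually has none out of spec?
    def tail(z):                        # one-sided normal tail probability
        return 0.5 * math.erfc(z / math.sqrt(2))

    z = 0.0
    while 1e9 * tail(z) > 0.5:
        z += 0.01
    print(f"~{z:.1f} sigma per transistor")  # ~6 sigma; add engineering margin
                                             # and you approach "8 or 10 sigma"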

The Pentium Bug

PC MAGAZINE: You had a problem with the Pentium processor a few years back. I mean. . . that was a different kind of problem.

GORDON MOORE: Okay, that was just a straight design error; somebody forgot to put in a [lookup table entry].

PC MAGAZINE: But as processors get more complex--they get a billion transistors on them--doesn't checking all that stuff get harder and harder to do?

GORDON MOORE: Well, over the last few generations, in spite of that one problem, our ability to check has improved faster than the complexity. You know -- in some ways, the 286, it took us forever to get it right. Now we come out with a thing like MMX in the new processor, and we've had a lot of those out to a lot of people, and no problem at all. We've learned a lot along the way, we have phenomenal computer aids to do this stuff. It's a lot of work: you know, the compatibility and verification effort is about the same size as the design effort, and they go on at the same time. So it's a big deal, but it's something that we're capable of doing.

Gallium Arsenide

PC MAGAZINE: On the design end, you know, there are a lot of people that talk about very different ways of making semiconductors, whether it be gallium arsenide or multi-state logic, or all of those things. Which, if any, of that stuff do you think is really important?

GORDON MOORE: Well, somebody once said, "Gallium arsenide is the material of the future and always will be." I spent a lot of money on gallium arsenide in the '60s and got convinced it was a fairly intractable material. It's neat for the front end of cellular phones, when you want a little bit of stuff that's high-performance, but it's not going to compete in the mainstream. You know, we now do everything on 8-inch wafers--nobody's seen an 8-inch wafer of gallium arsenide. We'll be at 12-inch wafers probably by the end of the century. People can't grow crystals that big in any of these other materials! So I think the mainstream is likely to be silicon-based forever.

But I don't see that as the critical thing anymore. I view the technology as a general-purpose method of making complex structures and materials layer by layer. And it doesn't make too much difference what those materials are--and they can even be used for a lot of things other than electronics. You've got all of these micromechanical machines that people are demonstrating now that are really interesting. They use the same technology, building gears and wheels and motors and sensors. You've got people building little chemical laboratories: I know a company that makes a blood-analysis chip--you put one drop of blood on the chip, plug it into this machine, and in 90 seconds it'll give you an analysis of six of the major constituents of your blood, with the precision you can get sending it to a laboratory. And it has all the reagents in there and everything, all the sensors. A lot of things are being made out of this technology now, and to me it's just a general-purpose technology. I call it as fundamental to the information age as metalworking was to the Industrial Revolution. So I don't care: if gallium arsenide on sapphire turns out to be the material we ought to make our fast computers out of--except for the problems of growing big sapphire and the like; they give you some trouble--the basic technology will be kind of 90 percent the same. You've got to change the 10 percent that relates to the particular material, but the rest of it--building up six or seven layers of interconnections, with all the insulators and barrier metals and everything you need--that's getting to be a lot more of the process than the handling of the silicon itself now.

Future Processor Techniques

PC MAGAZINE: Other people are talking about different ideas, again built on the same fundamental technology--things like multi-state logic, fuzzy logic... things like neural networks. Is any of this stuff of interest?

GORDON MOORE: You know, fuzzy logic evidently has proved to be useful in some control systems. I don't understand much about that. Multi-state stuff, with the possible exception of memory, usually turns out to cost so much in time that it's not worth it. People are solving a non-problem, in my view, when they're looking at [it]. Neural nets are a different deal. Neural networks are a completely different way of doing computing, and do some things very well--like pattern recognition. If you'd asked me five years ago, I would've thought that they would've carved out a nice niche for themselves. The funny thing is, all the neural network stuff being used that I know of is simulating neural networks on digital computers. Software simulation turns out to be the way they're being made, instead of getting hardware out there. There are still companies--Synaptics here locally, I guess--that are pursuing hardware versions of neural nets.
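
[Editor's note: "simulating neural networks on digital computers" can be as small as this Python sketch--a two-layer threshold network with hand-picked (not trained) weights computing XOR, the classic function a single neuron can't:]

    def step(x):                      # threshold "neuron": fires if input > 0
        return 1 if x > 0 else 0

    def xor_net(a, b):
        h1 = step(a + b - 0.5)        # hidden unit: at least one input on
        h2 = step(a + b - 1.5)        # hidden unit: both inputs on
        return step(h1 - h2 - 0.5)    # output: "at least one, but not both"

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor_net(a, b))   # outputs 0, 1, 1, 0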

PC MAGAZINE: I'll just toss out some things that people talk about. There's a lot of talk about optoelectronic chips, using light more, holographic memory.

GORDON MOORE: I am not a real believer in optoelectronic chips to do computing.

PC MAGAZINE: Holographic memory?

GORDON MOORE: Holographic memory? I'm not close enough to that. The densities people talk about being able to achieve are phenomenal: a lot of information storage. It's presumably capable of giving one of these real qualitative leaps in the amount of stuff that's available. Something may come of that; I'm not close enough. On the other hand, the ability of the magnetic disk people to continue to increase the density is flabbergasting--that has moved at least as fast as the semiconductor complexity. You can get a 2.5-gig drive for 200-and-some bucks now down at Fry's [the Bay Area electronics superstore chain]! It's absurd!

PC MAGAZINE: What about some of the quantum computing ideas?

GORDON MOORE: Quantum computing, that's a hot new one. Again, you know, this is getting further and further away from my area of expertise. I think quantum computing is a very interesting concept for understanding quantum mechanics. I think it has no practical application whatsoever.

Chips in 15 Years

PC MAGAZINE: Okay. . . let's go back more to current things. A few years ago you guys were talking about the Micro 2000 project. You know, I don't hear about that any more--what happened?

GORDON MOORE: We were a little behind in complexity, and are probably going to fall a little behind, mainly for economic reasons. You've got to make bigger chips if you want to get to 50 million transistors; we may only be someplace between 20 and 50 instead. We were right on with respect to the level of technology, minimum feature size--but that's a step function, and we just happened to cross it exactly at the right step. But it looks like that one we'll follow pretty well. The thing that was amazing is that we were significantly ahead in clock frequency and in performance. The people that had done the Micro 2000 were design people, so they put all the pressure on the technologists; they didn't leave anything to themselves, and it turns out the designs have been much more effective than they imagined at the time.

PC MAGAZINE: You talked about the "2011" chip. . .

GORDON MOORE: Oh, yeah, that was Andy [Grove]. He gave a view of that at Comdex and I wouldn't want to take credit for any of that. I think somebody got carried away with his semi-log paper. They were very aggressive extrapolations.

PC MAGAZINE: So you think a 10-gigahertz chip with a billion transistors on a .07-micron process is. . .

GORDON MOORE: By 2011? That's asking for everything to fall in line perfectly. A 10-gigahertz chip of that complexity--keeping the power anything reasonable and tractable is really tough. The power thing really becomes a problem, and I consider that one of our biggest challenges, particularly for laptop systems and for cheap desktop systems. The power wants to go up to hundreds of watts as you go in this direction, probably thousands of watts if we go that far, and there aren't simple systems for taking that kind of power out. I'm starting to understand why the mainframe people used to circulate Freon and use water cooling and the like. You really are pushing the power as hard as you can, and I think that's going to limit these combinations. The .07 is right at what we talked about with lithography a little earlier; that's down in the range where we've got to work hard to get there.
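
[Editor's note: a back-of-the-envelope Python sketch of the power wall Moore is pointing at. Dynamic power scales roughly as C x V^2 x f; the baseline figures here are hypothetical, not Intel numbers:]

    # Hypothetical baseline: a 30 W chip at 300 MHz and 2.0 V.
    base_power_w, base_freq_hz, base_v = 30.0, 300e6, 2.0
    target_freq_hz, target_v = 10e9, 1.0       # the "2011 chip" scenario

    # Hold switched capacitance fixed (pessimistic -- shrinks reduce it,
    # but a billion transistors add a lot of it back):
    power = base_power_w * (target_freq_hz / base_freq_hz) * (target_v / base_v) ** 2
    print(f"~{power:.0f} W")   # ~250 W even with the voltage halved; let C grow
                               # with complexity and you head toward kilowatts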

Voice Recognition

PC MAGAZINE: So let's say, one way or another, 15 years from now we're going to have really powerful chips compared to what we've got now. I mean, you believe that.

GORDON MOORE: Yeah, I'll admit that, yeah.

PC MAGAZINE: What do you think we're going to be doing?

GORDON MOORE: That's harder to answer. I think there are some things that are very attractive; you know, the one I always come back to is good voice recognition. I really think a computer you can talk to--one that can understand your speech, not only the words but also the meaning--is going to change the way computing is done, and I think that is a goal well worth shooting for. That's the kind of thing that is going to open up computing to the 85 percent of the people who are non-participants today, and you know that requires a lot of processing and a lot of memory, but I really think it's going to be an attractive deal. You could ask your computer to go out on the Net and get you some information--like I would ask my technical assistant to go out and get me the data on such-and-such--and have the computer come back with it. I think that's fantastic, and I think it's doable.

PC MAGAZINE: You think it's doable in this time frame? Sooner?

GORDON MOORE: A lot of it requires investment to be made in software, which is obviously nontrivial. A lot of people have worked on voice recognition for a long time, and there are pretty good systems out there now that require a tremendous amount of computing power. I'm not sure which requires more power--the voice recognition for continuous speech, or the intelligence to understand what the speech means. But these are real challenges, and you know, if they're not done by 2010, they'll be done by 2050 or something. They clearly are doable problems. And the more power and the more memory you have, the easier it is to tackle them. You know, the other side of it is the computer in other applications--the computer is really proving to be a powerful communications tool. In fact, I think more of them are likely to be used because of their communications ability than because of their discrete computational ability. I don't know what the impact of that is going to be; I suspect we're all going to be able to communicate from almost wherever we are, whenever we want.

Retirement

PC MAGAZINE: Let me just ask a question about you. You had said, you know, you joked about retiring. Are you going to retire, are you here forever, or are you going to do like Bob Noyce did with Sematech after he left?

GORDON MOORE: The last one? Heavens, no! You know, I guess I'm semi-retired now; I'm a three-day-a-weeker. But that's enough that I can keep up with a lot of the stuff that's going on, maintain my touch with the excitement in the industry, and boy, if the alternative is staying at home and taking out the garbage, I'm going to stick around here for a long time.

Copyright (c) 2002 Ziff Davis Media Inc. All Rights Reserved.