A History of Modern Computing, 2nd Edition, Part 8

He remained close to, but always outside of the academic and research community, and his ideas inspired work at Brown University, led by Andries van Dam.45 Independently of these researchers, Apple introduced a program called HyperCard for the Macintosh in 1987. HyperCard implemented only a fraction of the concepts of hypertext as van Dam or Nelson understood the concept, but it was simple, easy to use, and even easy for a novice to program. For all its limits, HyperCard brought the notion of nonlinear text and graphics out of the laboratory setting. In the midst of all that sprouted the Internet, with a sudden and unexpected need for a way to navigate through its rich and ever-increasing resources.46

It is still too early to write the history of what happened next. Tim Berners-Lee, who wrote the original Web prototype in late 1990, has written a brief memoir of that time, but the full story has yet to be told.47 Berners-Lee developed the Web while at CERN, the European particle physics laboratory. He stated that "[t]he Web's major goal was to be a shared information space through which people and machines could communicate. This space was to be inclusive, rather than exclusive."48 He was especially concerned with allowing communication across computers and software of different types. He also wanted to avoid the structure of most databases, which forced people to put information into categories before they knew if such classifications were appropriate or not. To these ends he devised a Universal Resource Identifier (later called the Uniform Resource Locator or URL) that could "point to any document (or any other type of resource) in the universe of information."49 In place of the File Transfer Protocol then in use, he created a more sophisticated Hypertext Transfer Protocol (HTTP), which was faster and had more features. Finally, he defined a Hypertext Markup Language (HTML) for the movement of hypertext across the network. Within a few years, these abbreviations, along with WWW for the World Wide Web itself, would be as common as RAM, K, or any other jargon in the computer field.

The World Wide Web got off to a slow start. Its distinctive feature, the ability to jump to different resources through hyperlinks, was of little use until there were at least a few other places besides CERN that supported it. Until editing software was written, users had to construct the links in a document by hand, a very tedious process. To view Web materials one used a program called a "browser" (the term may have originated with Apple's HyperCard). Early Web browsers (including two called Lynx and Viola) presented screens that were similar to Gopher's, with a list of menu selections.
Around the fall of 1992 Marc Andreessen and Eric Bina began discussing ways of making it easier to navigate the Web. While still a student at the University of Illinois, Andreessen took a job programming for the National Center for Supercomputing Applications, a center set up with NSF money on the campus to make supercomputing more accessible (cf. the impetus for the original ARPANET). By January 1993 Andreessen and Bina had written an early version of a browser they would later call Mosaic, and they released a version of it over the Internet.50 Mosaic married the ease of use of HyperCard with the full hypertext capabilities of the World Wide Web. To select items one used a mouse (thus circling back to Doug Engelbart, who invented it for that purpose). One knew an item had a hyperlink by its different color. A second feature of Mosaic, the one that most impressed the people who first used it, was its seamless integration of text and images. With the help of others at NCSA, Mosaic was rewritten to run on Windows-based machines and Macintoshes as well as workstations. As a product of a government-funded laboratory, Mosaic was made available free or for a nominal charge. As with UNIX, history was repeating itself.

But not entirely: unlike the developers of UNIX, Andreessen managed to commercialize his invention quickly. In early 1994 he was approached by Jim Clark, the founder of Silicon Graphics, who suggested that they commercialize the invention. Andreessen agreed, but apparently the University of Illinois objected to this idea. Like the University of Pennsylvania a half-century before it, Illinois saw the value of the work done on its campus, but it failed to see the much greater value of the people who did that work. Clark left Silicon Graphics, and with Andreessen founded Mosaic Communications that spring. The University of Illinois asserted its claim to the name Mosaic, so the company changed its name to Netscape Communications Corporation. Clark and Andreessen visited Champaign-Urbana and quickly hired many of the programmers who had worked on the software. Netscape introduced its version of the browser in September 1994. The University of Illinois continued to offer Mosaic, in a licensing agreement with another company, but Netscape's software quickly supplanted Mosaic as the most popular version of the program.51

On August 8, 1995, Netscape offered shares to the public. Investors bid the stock from its initial offering price of $28 a share to $58 the first day; that evening the network news broadcast stories of people who had managed to get shares at the initial price. The public now learned of a crop of new "instant billionaires," adding that knowledge to their awareness of "dot.com," "HTTP," and "HTML." Within a few months Netscape shares were trading at over $150 a share, before falling back. Reading the newspaper accounts and watching the television news, one had the feeling that the day Netscape went public marked the real beginning of the history of computing, and that everything else had been a prologue. For this narrative, that event will mark the end.

Conclusion

Since 1945 computing has never remained stagnant, and the 1990s were no exception. The emergence of the Internet was the biggest story of these years, although it was also a time of consolidation of the desktop computer in the office. Desktop computing reached a plateau based on the Intel, DOS, Macintosh, and UNIX standards that had been invented earlier. Most offices used personal computers for word processing, spreadsheets, and databases; the only new addition was communications made possible by local-area networking. A new class of computer emerged, called the laptop (later, as it lost more weight, the notebook), but these were functionally similar to PCs. Indeed, they were advertised as being software-compatible with what was on the desk. The basic architectural decisions made in the late 1970s, including the choice of a microprocessor and the structure of a disk operating system, remained (with RISC a small but significant exception).
Networking promised for some a potential conceptual shift in computing, but as of 1995 it had not replaced the concept of an autonomous, general-purpose computer on individual desks. As the World Wide Web matures, some argue that all the consumer will need is a simple Internet appliance—a reincarnation of the dumb terminal—not a general-purpose PC. But the numerous examples cited in this study—the IBM 650, the 1401, the PDP-8, the Apple II—all support the argument that the market will choose a good, cheap, general-purpose computer every time.

The biggest story of the 1990s was how the Altair, a $400 kit of parts advertised on the cover of Popular Electronics, managed to bring down the mighty houses of IBM, Wang, UNIVAC, Digital, and Control Data Corporation. IBM almost made the transition with its personal computer, but its inability to follow through on the beachhead it established led to multi-billion-dollar losses between 1991 and 1993.52 Personal computer profits went increasingly to new companies like Dell, Compaq, and above all Microsoft. IBM recovered, but only after abandoning its no-layoff policy (which it had held to even through the 1930s), and when it emerged from that crisis it found Microsoft occupying center stage. Even the American Federation of Information Processing Societies (AFIPS), the umbrella trade organization of computer societies, perished on December 31, 1990.53

Of course it was not simply the $400 Altair that changed computing. DEC and Data General had a lot to do with that as well, but neither DEC nor Data General were able to build on the foundations they had laid. One could understand IBM's failings, with its tradition of punched-card batch processing, and its constant courtroom battles against plaintiffs charging that it was too big. It is not as easy to understand how the Route 128 minicomputer companies failed to make the transition. These were the companies that pioneered in processor and bus architectures, compact packaging, interactive operation, and low unit costs. Led by General Doriot of the Harvard Business School, they also were the first to do what later became a defining characteristic of Silicon Valley: to start up a technology-based company with venture capital. Netscape generated so much public interest because it showed that this tradition was still alive. There was even a possibility that this company, founded to exploit a model of computing centered on the Internet, might be able to do to Microsoft what Microsoft had just done to DEC, IBM, and the others who were founded on earlier, now-outdated models of computing.

As of 1995 Digital and Data General were still in business, although both were struggling and much-reduced in size. Data General's decline began in the early 1980s, just when Tracy Kidder's The Soul of a New Machine became one of the first books about the computer industry to get on the best-seller list. That book chronicled Data General's attempt to chase after the VAX and regain its leadership in minicomputers. It captured the youth, energy, and drive of the computer business, and it remains an accurate description of the computer business today. Lacking the 20-20 hindsight that we now all have, Kidder did not, however, mention how Data General's Nova, the "clean machine," had inspired the designers of personal computers, including Ed Roberts and Steve Wozniak.
Someone at Data General may have recommended an alternate course: that it ignore the VAX and concentrate instead on the small systems it had helped bring into being. If so, Kidder's book does not record it.

In 1992, Ken Olsen resigned as head of Digital, as the company he founded was heading toward bankruptcy. A typical news story contrasted Olsen and a tired DEC with the young Bill Gates and his vibrant Microsoft. Few saw the irony of that comparison. Gates learned how to program on a PDP-10, and we have seen DEC's influence on Microsoft's software. More than that: Digital Equipment Corporation set in motion the forces that made companies like Microsoft possible. One person was quoted stating that were it not for Olsen we would still be programming with punched cards. That sounded like a generous overstatement made out of sympathy; in fact, one could credit him with doing that and much more.

Modern computing is a story of how a vision of "man-machine symbiosis," in J. C. R. Licklider's term, came to fruition. That happened through the efforts of people like Licklider himself, as well as Doug Engelbart, Ted Hoff, Ed Roberts, Steve Jobs, Steve Wozniak, Bill Gates, Gary Kildall, Tim Berners-Lee, and many others. To that list, perhaps near the top, should be added the name Ken Olsen. The "creative destruction" of the capitalist system had worked wonders, but the process was neither rational nor fair.

10 "Internet Time," 1995–2001

The narrative in the first edition ended on August 8, 1995, the day that Netscape offered shares on the stock market. The commercialization of the Internet, and the role that Netscape played in it, ushered in a new era in computing. It is too early to write a history of this era. There is no clear theoretical framework on which the historian can build a narrative. Still, so much has happened in the past few years that one cannot put off an attempt to write some kind of historical narrative about the "dot.com" phenomenon. A section of this chapter will do that, but this chronicle of the inflation and bursting of the dot.com bubble is very much a work in progress.

This chapter also addresses two other developments of the past few years. Like the dot.com phenomenon, these are ongoing developments whose direction seems to change daily if one reads the newspaper headlines. Fortunately, these developments have nice connections to events of computing's "ancient history" (i.e., before 1995). Thus they allow the historian to gain a glimmer of perspective. The antitrust trial against Microsoft, discussed first, is the culmination of a sequence of legal actions taken against the company, and it reflects issues that were present at Microsoft as early as 1975, when the company was founded. Not only that, the Microsoft trial echoes many of the arguments made against IBM during its legal troubles with the U.S. Justice Department in the 1970s. The discussion of the GNU/Linux operating system and the "open source" software movement, discussed last, likewise has deep roots. Chapter 3 discussed the founding of SHARE, as well as the controversy over who was allowed to use and modify the TRAC programming language. GNU/Linux is a variant of UNIX, a system developed in the late 1960s and discussed at length in several earlier chapters of this book. UNIX was an open system almost from the start, although not quite in the ways that "open" is defined now.
As with the antitrust trial against Microsoft, the open source software movement has a strong tie to the beginnings of the personal computer's invention. Early actions by Microsoft and its founders played an important role here as well. We begin with the antitrust trial.

Microsoft

A commercial, aired during the third quarter of the game, was the most memorable part of the broadcast of the January 1984 Super Bowl (see chapter 8). The Macintosh, Apple assured us, would usher in a new era of personal computing, and therefore the year 1984 would not be one of dreary conformity and oppression as prophesied by George Orwell's novel 1984. A revolution in personal computing was indeed in the works, and the Macintosh was leading the way. But Microsoft, not Apple, helped bring the revolution to a mass market. That happened not in 1984, the year the Mac appeared, but in 1992, when Microsoft began shipping version 3.1 of its Windows program. In 1984, Apple hoped that the Mac would bring the innovative ideas from the Xerox Palo Alto Research Center, ideas already present in a few personal computer systems, to the consumer. A dozen years later, Microsoft, not Apple, would dominate personal computer software.1 And that domination, in turn, would lead to its entanglement in a bitter antitrust trial.

Just as IBM spent a significant fraction of its resources during the 1970s facing a challenge by the U.S. Justice Department, so too is Microsoft in the same situation, following a similar filing against it in 1997. In November 2001 the federal government announced a settlement, but several states, and the European Union, refused to go along. Their arguments were also rejected by a ruling on November 1, 2002. Almost daily, the business press reports whenever a judge or lawyer makes a statement. Until the case is settled, one can only make provisional comments about its significance. The lesson of the IBM trial, however, applies to the present case against Microsoft: namely, that the Justice Department is not a place that recognizes how advancing technology will render much of the lawsuit irrelevant. What is the Microsoft-equivalent of the personal computer, whose appearance in the midst of the IBM trial was ignored as the litigants fought over mainframe dominance? It is too early to tell, although I will discuss some candidates later in this chapter. What is certain is that advances in computing already threaten, and will continue to threaten, Microsoft's ability to dominate personal computing, based on its Windows and Office software.

The licensing policies of Microsoft and Intel gave rise to clone manufacturers, like Dell, Compaq, and Gateway, who provided choices unavailable to Apple customers. (Apple, for most of its history, has refused to license its Macintosh software to third-party computer makers.) That policy yielded a greater variety of products and, above all, lower prices for computers based on Intel microprocessors and running Microsoft's DOS and then Windows. Windows version 3.1, Intel's introduction of the Pentium processor, and Microsoft's combination of applications software into a suite called Microsoft Office together gave consumers, let's say, 80 percent of what the Macintosh was offering, at a lower price for the total package. To Apple's surprise (and to the chagrin of Mac fans), that percentage was good enough to tip the balance, perhaps forever, away from Apple.
By 1995 the advantage of Apple's more elegant design no longer mattered, as the Microsoft/Intel combination became a standard, like COBOL in the 1960s. As with COBOL, what mattered was the very existence of a standard, not the intrinsic value or lack thereof of the software.

The Macintosh Connection

One could begin this story of Microsoft's triumph and troubles at any number of places, but the introduction of the Mac conveniently allows us to identify several critical factors. The first was that when the Mac appeared in 1984, it had a magnificent user interface but almost no applications software—the programs that people actually bought personal computers for. The most interesting application that it did have was MacPaint, a drawing program descended from the pioneering work at Xerox, and something that no software for IBM compatibles could approach. But for word processing, an application that any serious new computer had to have, Apple offered only MacWrite, which took advantage of its graphical interface, but which otherwise was extremely limited in capability.2 Both MacPaint and MacWrite were developed in-house. Besides those programs, early Mac customers could also get a spreadsheet: Multiplan, developed by Microsoft for other computers but ported to the Mac.

Although some popular accounts enjoy setting up Bill Gates and Steve Jobs as mortal enemies, for much of this period the two men had a cordial and mutually beneficial business relationship. At the onset of the Mac's development, in June 1981, Jobs and Jef Raskin (who had the initial idea for the Macintosh) met with Gates, and in January of the following year Microsoft agreed to develop software for the new machine.3 Gates needed little convincing of where personal computing was going. Even as Microsoft was negotiating to supply DOS for the IBM PC in 1980, Gates hired a programmer who would take the company in the opposite direction. That was Charles Simonyi, a native of Hungary who learned how to program first on a Soviet-built vacuum-tube computer called the Ural-II, then on a Danish transistorized computer that had an advanced ALGOL compiler installed on it. In the 1970s Simonyi worked at Xerox-PARC, where he developed a word processor called "Bravo" for the Alto workstation. Bravo is often credited with having the first true WYSIWYG ("What-You-See-Is-What-You-Get") display, a concept that other Xerox employees brought with them to Apple.4

In 1985 Microsoft produced another spreadsheet, Excel, for the Macintosh, which took advantage of all that the Macintosh interface had to offer. Excel was a success and helped Apple get through a difficult period when Mac sales were in danger of completely drying up. Mac users finally had a spreadsheet that was comparable to Lotus 1-2-3 on the IBM PCs. For its efforts, Microsoft gained something too: besides winning a commercial success, Microsoft programmers learned how to develop software for a Windows-based interface—something that Lotus and Word Perfect would have a hard time learning.

The ultimate impact of hiring Simonyi, and of these interactions between Microsoft and Apple, was that Bill Gates decided to recreate the Macintosh experience on the Intel 80x86 platform. Consider the context of that decision. In the mid-1980s, "Windows" was but one of many graphical systems (e.g., VisiOn, GEM, et al.) proposed for IBM compatibles.
And Microsoft's applications programs, like Multiplan, were not as well regarded by industry critics as programs like Lotus 1-2-3 or Word Perfect. The Windows model was also being challenged by a competing idea, mainly from Lotus: that of a single program, running under DOS, that combined spreadsheets, databases, and word processing. Lotus offered such a program called Symphony for the IBM PC and was working on one for the Mac called Jazz. At Ashton-Tate, the leading supplier of database software for the PC, a Xerox-PARC alumnus named Robert Carr was developing a similar program called Framework.5 It turned out that the practice of keeping the applications separate, while requiring that each adhere to a common graphical interface, would prevail.6 That was what Jobs insisted on for all Macintosh developers, and Gates made it the focus (slightly less sharp) at Microsoft.

Simonyi developed a system of programming that allowed Microsoft to manage increasingly larger and more complex programming jobs as the company grew. The style involved a way of naming variables, and was called "Hungarian," an inside joke referring to its incomprehensibility to anyone not familiar with Microsoft's programming, like Simonyi's native Hungarian language supposedly is to speakers of other European languages.7
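To give a concrete sense of the naming style described above, the short C sketch below prefixes each identifier with an abbreviation of its type or role, in the spirit of Simonyi's "Hungarian" convention (sz for a zero-terminated string, c for a count, l for a long, p for a pointer). The structure, function, and variable names here are invented for illustration only; they are not drawn from Microsoft's code or from the book.

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only: invented names showing the flavor of Simonyi-style
   "Hungarian" prefixes.  sz = zero-terminated string, c = count,
   l = long integer, p = pointer, cch = count of characters. */

#define CCH_NAME_MAX 32                 /* cch: count of characters */

struct customer {
    char szName[CCH_NAME_MAX];          /* sz: zero-terminated string */
    int  cOrders;                       /* c:  a count                */
    long lCentsBalance;                 /* l:  a long integer         */
};

/* pcust: pointer to a customer; pchName: pointer to characters */
static void SetCustomerName(struct customer *pcust, const char *pchName)
{
    strncpy(pcust->szName, pchName, CCH_NAME_MAX - 1);
    pcust->szName[CCH_NAME_MAX - 1] = '\0';   /* keep the string terminated */
}

int main(void)
{
    struct customer cust = { "", 0, 0L };

    SetCustomerName(&cust, "Ada");
    cust.cOrders = 3;
    cust.lCentsBalance = 1995L;

    printf("%s: %d orders, %ld cents\n",
           cust.szName, cust.cOrders, cust.lCentsBalance);
    return 0;
}
```

The point of such prefixes, presumably, was that a programmer reading one file of a very large project could tell at a glance how a variable was meant to be used without hunting down its declaration, which helped ever-larger teams work on the same body of code.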
"Hungarian" may not have been the crucial factor, but somehow Microsoft's managers learned to manage the development and introduction of complex software written by ever-larger teams of programmers. One other technique was especially innovative. Although it had been developed elsewhere, Microsoft embraced this technique and applied it on a large scale not seen elsewhere, breaking radically from the way large projects were managed at mainframe software houses. At Microsoft, programmers working on a section of a new product were required to submit their work to a central file at the end of each day, where overnight it would be compiled, along with everyone else's, into a daily "build."8 If your contribution caused the central file to crash, you were responsible for fixing it. That build then became the basis for the next day's work.9 What was more, as soon as the build became marginally functional, members of the programming team were required to use it, regardless of how inefficient that might be. This requirement made life difficult, especially when the software was in an early stage and little of it worked well, but it kept the programmers focused on shipping a finished product of high quality. This process, too, had an evocative name: "eating your own dog food."10

The public has since become aware of the large fortunes amassed by Microsoft programmers who worked there long enough to have their stock options vest. Less well known is the dog's life of no sleep, eating out of vending machines, endless hours spent staring into a computer screen, no social or family life, and other tribulations for a programmer caught in the "death march" of fixing a "broken" build while getting a program finished on time.11

The cumulative effect of these efforts was a steady stream of ever-improved versions of Windows and an ever-expanding suite of applications. Word and Excel were the two pillars of applications software, soon joined by the database program Access, the presentation program PowerPoint, the project management program Project, and a host of others (table 10.1). Microsoft purchased many of these programs from [...]

not involve Microsoft but did introduce a phrase that would figure in later trials. This was a suit filed in 1987 by Lotus against a company called Paperback Software, established by Adam Osborne of portable computer fame. Paperback was selling a spreadsheet that functioned identically to 1-2-3, but at a fraction of the price.14 Lotus charged that Paperback, even if it did not copy or steal Lotus's code, ...

browser, Netscape's home page was the first thing they saw, and many never bothered to change that. As Netscape's management was railing against Microsoft, the company did not realize, until it was too late, that it had also invented a "portal" but did not know it. Netscape was eventually acquired by AOL, another company adept at easing ordinary people into the complexities of cyberspace. Computer-savvy Internet ...

creation.)69 Eric Raymond was among the programmers who saw the merits of this model of software development, and, in an influential essay called "The Cathedral and the Bazaar," he analyzed it with regard to writings by Brooks and others about large software projects.70 Raymond argued that by letting people look at, and modify, the source code, bugs are found and fixed. As they do that, the general ...

1995 that caused a minor ripple in the trade press, but in hindsight it appears to have further enraged Microsoft's competitors and people in the Justice Department. That year, Microsoft announced that it would buy Intuit, the maker of the financial program Quicken and one of the few independent suppliers of an application that had a commanding market share. After Microsoft initiated the purchase (and, ...

users did not need a portal. They preferred brute-force search engines and were not afraid to construct complex searches using Boolean algebra to find what they wanted. An early leader was Altavista, founded by the Silicon Valley lab of Digital Equipment Corporation. Altavista's success was not enough to rescue its parent company, and by late 1998 its search engine was surpassed by a rival, Google, founded ...

Minix was. A brief note posted to a newsgroup in July 1991 gave a hint to the world that he was thinking not just of an expanded terminal emulator but of a 386 implementation of UNIX that would conform to a standard set out by a subcommittee of the International Standards Organization.64 One person responded to his query not with information on the standard but with an offer of space on a computer at the ...

creator and user that Torvalds developed. In spite of all the various flavors of UNIX then available, Linux was filling a need. The fragmentation of UNIX, mentioned in chapter 9, turned to Torvalds's advantage. His version allowed users to get the operating system with none of the baggage associated with other versions then available. AT&T had hoped to profit from its creation once the company was free to market ...

not just for its Bay-area countercultural flavor but also for the quality of the postings. It attracted a large number of writers and remains focused on writing, not graphics or multimedia. It was the home for an especially active group of fans of the Grateful Dead, who traded information about the band. According to chronicles of the era, the traffic from Deadheads kept the WELL financially solvent through ...
rapidly gaining popularity, but with more of the low-level power of C or assembly language. He called on programmers to write a language that was, he said, "C-plus-plus-minus."41 Beginning in 1991 James Gosling, along with a small team of other programmers at SUN, came up with a language, called Oak, that filled Joy's needs. With Joy's support, it was reworked, renamed "Java" (for legal reasons), and publicly ...

this time. As Java caught on, it garnered media interest all out of proportion to what the language actually did, and that was unfortunate. Java's write-once, run-anywhere feature was heralded in the trade press not as a way to do something interesting on the Web, but to break Microsoft's hold on personal computing software. If people could write programs on large servers, and have those programs sent to ...

it agreed to make available the parts of Windows code that interacted with applications programs: the so-called Applications Program Interface, or API. Thus, for example, developers of a database ...

Contents

  • A History of Modern Computing

    • 9 Workstations, UNIX, and the Net, 1981–1995

      • Conclusion

    • 10 ‘‘Internet Time,’’ 1995–2001

      • Microsoft

      • The Macintosh Connection

      • Internet Explorer

      • Hotmail, UNIX

      • Dot.com

      • The Acceptable Use Policy

      • Java

      • Search Engines, Portals

      • Tragedy of the Commons

      • GNU/Linux

      • GNU

      • IBM

      • Conclusion

    • Conclusion: The Digitization of the World Picture

      • The Digitization of the World Picture
