-------------------------------------------------------------------------
                        THE AMATEUR COMPUTERIST
-------------------------------------------------------------------------
 Winter 1998-1999   Battle Over the Future of the Internet   Volume 9 No 1
-------------------------------------------------------------------------

                           Table of Contents

[1]  Editorial: 25 Years of TCP/IP . . . . . . . . . .  9147 bytes
[2]  Role of Govt in Internet Evolution. . . . . . . . 28430 bytes
[3]  Report from INET98 and IFWP . . . . . . . . . . . 10331 bytes
[4]  The Internet: Public or Private?. . . . . . . . . 10808 bytes
[5]  Report from the Front . . . . . . . . . . . . . . 13465 bytes
[6]  The Internet a Public Treasure. . . . . . . . . . 15700 bytes
[7]  Testimony Submitted to Congress . . . . . . . . . 21218 bytes
[8]  Letter to Congressman Bliley. . . . . . . . . . . 15845 bytes
[9]  E-mail Message from Becky Burr. . . . . . . . . .  2464 bytes
[10] Letter to Wm. Daley Sec of Commerce . . . . . . . 10288 bytes
[11] Letter: Tom Bliley to Ira Magaziner . . . . . . .  2327 bytes
[12] Letter to the NTIA. . . . . . . . . . . . . . . .   566 bytes
[13] Herding Cats and Sacred Cows. . . . . . . . . . . 11693 bytes
[14] DNS: Short History and Short Future . . . . . . . 24623 bytes
[15] MsgGroup Mailing List . . . . . . . . . . . . . . 25541 bytes

---------------------------------------------------------------------

[1] 25 Year Anniversary of TCP/IP
by Ronda Hauben
ronda@panix.com

The following post recently appeared on Usenet:

     "A phenomenon that has resulted from IT development has been
     that of the Internet. Why has the impact of Internet been so
     very great on society? What was the fundamental needs of
     society, which had remained dormant till now, which are
     spurring on these developments at such a rapid pace? That
     Internet an innovative medium, is made possible by several
     technologies and techniques. One is TCP/IP. Make an
     independent evaluation of the TCP/IP dimensions of (the)
     Internet and impact of TCP/IP on the Internet."

This issue of the Amateur Computerist is being published at the time of a milestone that should make anyone who cares about the Internet pause and reflect. In 1973, the Internet protocol TCP/IP (then called TCP) was designed by Robert E. Kahn and Vinton G. Cerf.(1) Their paper "A Protocol for Packet Network Intercommunication," describing the architecture of the TCP protocol, was published in May 1974 in the IEEE Transactions on Communications.

As another Internet pioneer, Dave Clark, understood, TCP was the glue that brought together several important network technologies. This new protocol made it possible for dissimilar packet switching networks to talk with each other, much as an earlier protocol, NCP, had made it possible for diverse computers using different operating systems to communicate via the ARPANET. What was so important about the creation of this new protocol, TCP, as it was called in 1973, was that it made possible the logical connection of multiple packet switching networks around the world. This has created a communications system that has grown and spread broadly and widely.
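The heart of that design can be suggested with a small sketch. The following Python fragment is only an editorial illustration, not the TCP of the 1974 paper, and every name in it is invented. It models two unlike networks that both agree to carry one common datagram format, so that a gateway between them needs to understand only that common format, not the internal details of either network:

    # A toy model of internetworking. Two unlike networks carry the
    # same common "datagram," and a gateway forwards it between them
    # by looking only at the common destination field. All names here
    # are invented for illustration.

    class Datagram:
        """The common format every member network agrees to carry."""
        def __init__(self, src, dst, payload):
            self.src, self.dst, self.payload = src, dst, payload

    class Network:
        """A packet network with its own hosts; its internal framing,
        speed, and medium may differ from every other network's."""
        def __init__(self, name):
            self.name = name
            self.hosts = {}               # address -> delivery function

        def attach(self, addr, deliver):
            self.hosts[addr] = deliver

        def send(self, dgram):
            if dgram.dst in self.hosts:
                self.hosts[dgram.dst](dgram)

    class Gateway:
        """Joins networks, forwarding datagrams by destination."""
        def __init__(self, routes):
            self.routes = routes          # address -> Network

        def forward(self, dgram):
            self.routes[dgram.dst].send(dgram)

    ground_net = Network("terrestrial net")
    radio_net = Network("packet radio net")
    gateway = Gateway({"host-b": radio_net})

    radio_net.attach("host-b", lambda d: print("host-b got:", d.payload))
    gateway.forward(Datagram("host-a", "host-b", "hello across networks"))

However different each network is internally, the gateway's job stays the same. That is the property that let new kinds of packet networks join the growing Internet without requiring changes to the hosts already on it.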
More importantly, the internetworking of networks made possible by TCP/IP is the basis of a system that makes it possible for people around the world to communicate via their computers in a way that is unprecedented.

Thus this issue of the Amateur Computerist is dedicated to raising a rousing cheer for the networking pioneers whose dedication, hard work, and pioneering vision conceived of and created this important means of facilitating network interconnection and communication, and thus human to human communication. Some of these pioneers then took on the difficult task of implementing the protocol in a variety of packet switching networks, eventually making it possible for TCP/IP and the Internet to spread around the U.S. and around the world.

The article in this issue by Robert E. Kahn, one of the most important of these pioneers, describes both the development of internetworking technology and some of the other problems that had to be solved to develop the Internet to what it is today. Though written in 1994, the article also describes some of the outstanding problems that he understood the Internet would face as it continued to grow and spread. The article provides an important description of the changing role that the U.S. government has played in the creation and development of the Internet. And it raises the question of what role government, both the U.S. government and other governments around the world, will need to play in the further development of the Internet as these networking developments continue to grow and spread more broadly and widely.

Commenting on the importance of the need to determine the role for government in the present and the future development of the Internet, Kahn writes:

     This raises the question of the proper long-term role for
     government in the continued evolution of the Internet. Is the
     Internet now in a form where government involvement should
     cease entirely, leaving private-sector interests to determine
     its future? Or, does government still have an important role
     to play? This paper concludes that government can still make
     a series of important contributions.

This question continues to be alive today, as the decision making processes that will help the Internet to scale are under reconsideration, and the role of government with regard to these processes has not yet been determined.

This issue also starts the serialization of a paper about one of the earliest mailing lists created during the early days of the ARPANET. The MsgGroup mailing list was started in 1975, shortly after the creation of TCP/IP. It was created to explore how e-mail facilitated communication and collaborative activity. One of the papers included in the archives of this mailing list recognized that how decisions are made regarding the developing network would become a problem, since adequate consideration was not being paid to this challenge. This prediction has proven true. Most recently, the problem of how decisions are made with respect to domain names on the Internet has shown how insightful this early paper was: the question of decision making, along with the issue of what continuing role governments need to play in overseeing such a decision making process, has become an urgent problem to be solved for the ongoing development of the Internet.

The article by Robert Shaw, of the International Telecommunication Union (ITU) in Geneva, describes the problem that has developed with regard to the plan by the U.S.
government to transfer not only decisions regarding domain naming, but also the domain name system itself, to the private sector. This issue of the Amateur Computerist also contains an article by Ted Byfield discussing some of the various considerations that the domain name controversy raises.

Other articles in this issue include testimony submitted to the U.S. Congress, and via e-mail as well, regarding the problem of the U.S. government's decision to make a significant change not only in the decision making process regarding essential Internet functions, but also in the ownership and control of these essential functions of the Internet. Also included are a proposal submitted via e-mail to government policy advisors and then posted at the NTIA online web site, a report from the Internet Society meeting in Geneva this past July, and a letter to Congress, as well as one from Congress to the Department of Commerce, about the problems of transferring decision making and Internet assets from U.S. government oversight to a private entity.

On November 25, 1998, a Memorandum of Understanding (MoU) was posted online by the NTIA indicating a cooperative agreement with ICANN, the private corporation that had been created, to design and test a private sector corporate entity. However, for now, the U.S. government has claimed that it has not yet transferred these functions and instead will be working with ICANN to design a structure. The MoU is online at the NTIA web site, and we welcome views about the nature of this agreement.(2) We hope to have an analysis of it in our next issue.

Finally, 1998 marked another important Internet milestone. In 1988 the NSFNET backbone was put into operation. 1988 was also the year that I first got onto the Internet, via the MERIT connection to the NSFNET backbone. When I begin to think how different my life would be today without the Internet, I realize the remarkable changes that are possible with the ability to communicate as broadly and widely as the Internet makes possible. More profoundly, the communication made possible via the Internet makes it possible to solve problems that otherwise would be intractable. This capability carries with it a profound hope for the future. So I want to express my personal thanks to those determined pioneers who have brought the world these important new means of global communication. Now it is up to the rest of us to help take up the problems that develop along the way, so that this new communications medium will spread ever more broadly and widely and the pioneers' vision that all gain access will be achieved. That is what this issue is about.

-----
Notes:

(1) See also John Adam, "Architects of the net of nets," IEEE Spectrum, September 1996, pp. 57-63.

(2) http://www.ntia.doc.gov/

(3) In early January, NIST (the National Institute of Standards and Technology of the U.S. government) announced that it will give ICANN the IANA contract in place of DARPA. This move is contrary both to the Memorandum of Understanding that the NTIA signed with ICANN on November 25, 1998, which provided only that ICANN design and test a structure, not that it actually administer IANA, and to the report by the Office of Inspector General of the NSF issued in February 1997, which stated that the U.S. government is not allowed to contract out policy setting functions, but only administrative functions. The U.S.
government is creating ICANN to function as a policy setting body for it, which is contrary to what it is allowed to do with a private sector organization.

---------------------------------------------------------------------

[2] The Role of Government in the Evolution of the Internet*
by Robert E. Kahn

*[Communications of the ACM, Vol. 37, No. 8, Aug. 1994, (c)1994 ACM, Inc. Reprinted by permission.]

This paper discusses the role of government in the continuing evolution of the Internet. From its origins as a U.S. government research project, the Internet has grown to become a major component of a network infrastructure, linking millions of machines and tens of millions of users around the world. Although many nations are now involved with the Internet in one way or another, this paper focuses on the primary role the U.S. government has played in the Internet's evolution and discusses the role that governments around the world may have to play as it continues to develop.

Very little of the current Internet is owned, operated, or even controlled by governmental bodies. The Internet indirectly receives government support through federally funded academic facilities that provide some network-related services. Increasingly, however, the provision of Internet communication services, regardless of use, is being handled by commercial firms on a profit-making basis.

This situation raises the question of the proper long-term role for government in the continued evolution of the Internet. Is the Internet now in a form where government involvement should cease entirely, leaving private-sector interests to determine its future? Or, does government still have an important role to play? This paper concludes that government can still make a series of important contributions. Indeed, there are a few areas in which government involvement will be vital to the long-term well-being of the Internet.

ORIGINS OF THE INTERNET

The Internet originated in the early 1970s as part of an Advanced Research Projects Agency (ARPA) research project on "internetworking." At that time, ARPA demonstrated the viability of packet switching for computer-to-computer communication in its flagship network, the ARPANET, which linked several dozen sites and perhaps twice that number of computers into a national network for computer science research. Extensions of the packet-switching concept to satellite networks and to ground-based mobile radio networks were also under development by ARPA, and segments of industry (notably not the traditional telecommunications sector) were showing great interest in providing commercial packet network services. It seemed likely that at least three or four distinct computer networks would exist by the mid-1970s and that the ability to communicate among these networks would be highly desirable if not essential.

In a well-known joint effort that took place around 1973, Robert Kahn, then at ARPA, and Vinton Cerf, then at Stanford, collaborated on the design of an internetwork architecture that would allow packet networks of different kinds to interconnect and machines to communicate across the set of interconnected networks. The internetwork architecture was based on a protocol that came to be known as TCP/IP. The period from 1974 to 1978 saw four successively refined versions of the protocol implemented and tested by ARPA research contractors in academia and industry, with version number four eventually becoming standardized.
The TCP/IP protocol was used initially to connect the ARPANET, based on 50 kilobits per second (kbps) terrestrial lines; the Packet Radio Net (PRNET), based on dual rate 400/100 kbps spread spectrum radios; and the Packet Satellite Net (SATNET), based on a 64 kbps shared channel on Intelsat IV. The initial satellite Earth stations were in the United States and the United Kingdom, but subsequently additional Earth stations were activated in Norway, Germany, and Italy. Several experimental PRNETs were connected, including one in the San Francisco Bay area. At the time, no personal computers, workstations, or local area networks were available commercially, and the machines involved were mainly large-scale scientific time-sharing systems. Remote access to time-sharing systems was made available by terminal access servers.

The technical tasks involved in constructing this initial ARPA Internet revolved mainly around the configuration of "gateways," now known as routers, to connect different networks, as well as the development of TCP/IP software in the computers. These were both engineering-intensive tasks that took considerable expertise to accomplish. By the mid-1980s, industry began offering commercial gateways and routers and started to make available TCP/IP software for some workstations, minicomputers, and mainframes. Before this, these capabilities were unavailable; they had to be handcrafted by the engineers at each site.

In 1979, ARPA established a small Internet Configuration Control Board (ICCB), most of whose members belonged to the research community, to help with this process and to work with ARPA in evolving the Internet design. The establishment of the ICCB was important because it brought a wider segment of the research community into the Internet decision-making process, which until then had been the almost-exclusive bailiwick of ARPA. Initially, the ICCB was chaired by a representative of ARPA and met several times a year. As interest in the ARPA Internet grew, so did interest in the work of the ICCB.

During this early period, the U.S. government, mainly ARPA, funded research and development work on networks and supported the various networks in the ARPA Internet by leasing and buying components and contracting out the system's day-to-day operational management. The government also maintained responsibility for overall policy. In the mid to late 1970s, experimental local area networks and experimental workstations, which had been developed in the research community, were connected to the Internet according to the level of engineering expertise at each site. In the early 1980s, Internet-compatible commercial workstations and local area networks became available, significantly easing the task of getting connected to the Internet.

The U.S. government also awarded contracts for the support of various aspects of Internet infrastructure, including the maintenance of lists of hosts and their addresses on the network. Other government-funded groups monitored and maintained the key gateways between the Internet networks in addition to supporting the networks themselves. In 1980, the U.S. Department of Defense (DOD) adopted the TCP/IP protocol as a standard and began to use it. By the early 1980s, it was clear that the internetwork architecture that ARPA had created was a viable technology for wider use in defense.
EMERGENCE OF THE OPERATIONAL INTERNET

The DOD had become convinced that if its use of networking were to grow, it needed to split the ARPA Internet (called ARPANET) in two. One of the resulting networks, to be known as MILNET, would be used for military purposes and mainly link military sites in the United States. The remaining portion of the network would continue to bear the name ARPANET and still be used for research purposes. Since both would use the TCP/IP protocol, computers on the MILNET would still be able to talk to computers on the new ARPANET, but the MILNET network nodes would be located at protected sites. If problems developed on the ARPANET, the MILNET could be disconnected quickly from it by unplugging the small number of gateways that connected them. In fact, these gateways were designed to limit the interactions between the two networks to the exchange of electronic mail, a further safety feature.

By the early 1980s, the ARPA Internet was known simply as the Internet, and the number of connections to it continued to grow. Recognizing the importance of networking to the larger computer science community, the National Science Foundation (NSF) began supporting CSNET, which connected a select group of computer science researchers to the emerging Internet. This allowed new research sites to be placed on the ARPANET at NSF's expense, and it allowed other new research sites to be connected via a commercial network, TELENET, which would be gatewayed to the ARPANET. CSNET also provided the capacity to support dial-up e-mail connections. In addition, access to the ARPANET was informally extended to researchers at numerous sites, thus helping to further spread the networking technology within the scientific community. Also during this period, other federal agencies with computer-oriented research programs, notably the Department of Energy (DoE) and the National Aeronautics and Space Administration (NASA), created their own "community networks."

The TCP/IP protocol adopted by DOD a few years earlier was only one of many such standards. Although it was the only one that dealt explicitly with internetworking of packet networks, its use was not yet mandated on the ARPANET. However, on January 1, 1983, TCP/IP became the standard for the ARPANET, replacing the older host protocol known as NCP. This step was in preparation for the ARPANET-MILNET split, which was to occur about a year later. Mandating the use of TCP/IP on the ARPANET encouraged the addition of local area networks and also accelerated the growth in numbers of users and networks. At the same time, it led to a rethinking of the process that ARPA was using to manage the evolution of the network.

In 1983, ARPA replaced the ICCB with the Internet Activities Board (IAB). The IAB was constituted similarly to the old ICCB, but the many issues of network evolution were delegated to 10 task forces chartered by and reporting to the IAB. The IAB was charged with assisting ARPA to meet its Internet-related R&D objectives; the chair of the IAB was selected from the research community supported by ARPA. ARPA also began to delegate to the IAB the responsibility for conducting the standards-setting process. Following the CSNET effort, NSF and ARPA worked together to expand the number of users on the ARPANET, but they were constrained by the limitations that DOD placed on the use of the network.
By the mid-1980s, however, network connectivity had become sufficiently central to the workings of the computer science community that NSF became interested in broadening the use of networking to other scientific disciplines. The NSF supercomputer centers program represented a major stimulus to broader use of networks by providing limited access to the centers via the ARPANET. At about the same time, ARPA decided to phase out its network research program, only to reconsider this decision about a year later when the seeds for the subsequent high performance computing initiative were planted by the Reagan administration and then Senator Albert Gore (D-Tenn.).

In this period, NSF formulated a strategy to assume responsibility for the areas of leadership that ARPA had formerly held and planned to field an advanced network called NSFNET. NSFNET was to join the NSF supercomputer centers with very high speed links, then 1.5 megabits per second (mbps), and to provide members of the U.S. academic community access to the NSF supercomputer centers and to one another. Under a cooperative agreement between NSF and MERIT, Inc., the NSFNET backbone was put into operation in 1988 and, because of its higher speed, soon replaced the ARPANET as the backbone of choice. In 1990, ARPA decommissioned the last node of the ARPANET. It was replaced by the NSFNET backbone and a series of regional networks, most of which were funded by, or at least started with funds from, the U.S. government and were expected to become self-supporting soon thereafter.

The NSF effort greatly expanded the involvement of many other groups in providing as well as using network services. This expansion followed as a direct result of the planning for the High Performance Computing Initiative (HPCI), which was being formed at the highest levels of government. DOD still retained the responsibility for control of the Internet name and address space, although it continued to contract out the operational aspects of the system.

The DoE and NASA both rely heavily on networking capability to support their missions. In the early 1980s, they built the High Energy Physics Net (HEPNET) and the Space Physics Analysis Net (SPAN), both based on Digital Equipment Corporation's DECNET protocols. Later, DoE and NASA developed the Energy Sciences Net (ESNET) and the NASA Science Internet (NSI), respectively; these networks supported both TCP/IP and DECNET services. These initiatives were early influences on the development of the multiprotocol networking technology that was subsequently adopted in the Internet.

International networking activity was also expanding in the early and mid 1980s. Starting with a number of networks based on the X.25 standard as well as international links to ARPANET, DECNET, and SPAN, the networks began to incorporate open internetworking protocols. Initially, Open Systems Interconnection (OSI) protocols were used most frequently. Later, the same forces that drove the United States to use TCP/IP (availability in commercial workstations and local area networks) caused the use of TCP/IP to grow internationally.

The number of task forces under the IAB continued to grow, and in 1989, the IAB consolidated them into two groups: the Internet Engineering Task Force (IETF) and the Internet Research Task Force (IRTF). The IETF, which had been formed as one of the original 10 IAB task forces, was given responsibility for near-term Internet developments and for generating options for the IAB to consider as Internet standards.
The IRTF remained much smaller than the IETF and focused more on longer-range research issues. The IAB structure, with its task-force mechanism, opened up the possibility of getting broader involvement from the private sector without the need for government to pay directly for their participation. The federal role continued to be limited to oversight control of the Internet name and address space, the support of IETF meetings, and sponsorship of many of the research participants. By the end of the 1980s, the IETF began charging a nominal attendance fee to cover the costs of its meetings.

The opening of the Internet to commercial usage was a significant development in the late 1980s. As a first step, commercial e-mail providers were allowed to use the NSFNET backbone to communicate with authorized users of the NSFNET and other federal research networks. Regional networks, initially established to serve the academic community, had, in their efforts to become self-sufficient, taken on nonacademic customers as an additional revenue source. NSF's Acceptable Use Policy, which restricted backbone usage to traffic within and for the support of the academic community, together with the growing number of nonacademic Internet users, led to the formation of two privately funded and competing Internet carriers, both spin-offs of U.S. government programs. They were UUNET Technologies, a product of a DOD-funded seismic research facility, and Performance Systems International (PSI), which was formed by a subset of the officers and directors of NYSERNET, the NSF-sponsored regional network in New York and the lower New England states.

Beginning in 1990, Internet use was growing by more than 10 percent a month. This expansion was fueled significantly by the enormous growth on the NSFNET and included a major commercial and international component. NSF helped to stimulate this growth by funding both incremental and fundamental improvements in Internet routing technology as well as by encouraging the widespread distribution of network software from its supercomputer centers. Interconnections between commercial and other networks are arranged in a variety of ways, including through the use of the Commercial Internet Exchange (CIX), which was established, in part, to facilitate packet exchanges among commercial service providers.

Recently, the NSF decided that additional funding for the NSFNET backbone was no longer required. The agency embarked on a plan to make the NSF regional networks self-supporting over a period of several years. To assure the scientific research community of continued network access, NSF made competitively chosen awards to several parties to provide network access points (NAPs) in four cities. NSF also selected MCI to provide a very high speed backbone service, initially at 155 mbps, linking the NAPs and several other sites, and a routing arbiter to oversee certain aspects of traffic allocation in this new architecture.

The Internet Society was formed in 1992 by the private sector to help promote the evolution of the Internet, including maintenance of the Internet standards process. In 1992, the IAB was reconstituted as the Internet Architecture Board, which became part of the Internet Society. It delegated its decision-making responsibility on Internet standards to the leadership of the IETF, known as the Internet Engineering Steering Group (IESG). While not a part of the Internet Society, the IETF produces technical specifications as possible candidates for future protocols.
The Internet Society now maintains the Internet Standards Process, and the work of the IETF is carried out under its auspices.

ISSUES FOR CONSIDERATION

As the Internet continues to grow, the role of the research community in developing and evolving standards needs to be addressed. When the financial implications of decisions about Internet standards were relatively small, the current standards process proved entirely satisfactory. As the financial impact of such decisions becomes increasingly significant, the nature of the standards-setting process will continue to change to allow more direct industrial involvement. How this will ultimately play out is unclear. However, the vitality of the current process derives from the broad involvement of the many communities that have a stake in the Internet. Unlike typical top-down standards-setting operations that implement decisions formed by consensus, the Internet process works essentially in reverse through a kind of grass-roots mechanism. Candidates for Internet standards ordinarily result from actual implementation and widespread experimentation within the IETF. The most promising of these candidates is selected for placement on the Internet standards track. No better process has yet emerged that is as dynamic and allows as much direct involvement by industry.

Further, with the widespread internationalization of the Internet, scores of countries now have fundamental interests in its evolution. Within the United States, the Internet is seen in many quarters as the starting point for the National Information Infrastructure (NII). Around the world, there is growing recognition that the set of NIIs (assuming each country commits to developing one) should be compatible with each other along some still-unknown dimensions. Who should take the lead in ensuring this compatibility? Is this a role for the private sector, for governments acting together, or for some combination of the two? There is clearly a role for government, at least to provide oversight, support, and guidance, if not to participate actively.

Apart from these issues is concern about the viability of any approach that has no individual or organization with overall responsibility for its evolution. It seems fair to say that many of the traditional Internet carriers would prefer that new capabilities be provided by them as a turnkey service. Industry surely has the capacity to provide many of the necessary capabilities, but history has shown the importance of government involvement. What guarantees that the same degree of vitality will be part of its future evolution if market forces alone determine what new capabilities are added to the Internet?

Furthermore, the Internet offers the possibility of bypassing conventional service offerings by regulated carriers. This may both make it extremely difficult for the regulated carriers to compete effectively in certain areas and make it hard for government regulators to ignore the Internet. Finally, the carriers can only go so far in providing Internet services. Ultimately, the communication pathways must enter the user's machine, pass through layers of software, and end up in applications programs. The computer industry, along with the many vendors of computer-related equipment, must play a role in determining how this aspect of the Internet will evolve.
The nature of technological innovation almost guarantees that many new technological options will continue to be generated from many different sources and make their appearance throughout the Internet. Thus, it appears that no single entity can possibly be in charge of the Internet. A key to the success of the Internet is to insure that the interested parties have a fair and equitable way of participating in its evolution, including participation in its also-evolving standards process. A proper role for governments would be to oversee this process to make sure that it remains fair and meets the wide spectrum of public needs.

An international infrastructure like the Internet will ultimately require countries to set policy on many of the details that are now taken for granted. For example, Internet names and addresses may take on additional legal meanings in the various countries as they rely on the Internet to a greater degree. Trademarks of Internet names and addresses are only one aspect of concern. Contracts of all sorts may have Internet names and addresses embedded within them. How can the countries have confidence in the use of such names and addresses for legal purposes without necessarily assuming responsibility for the day-to-day operation of this aspect of the system?

Computer viruses know no national boundaries. If a major "infection" should strike multiple countries, how will those countries work together to respond to such a situation? Finally, the ability to conduct network-based business between countries will require the resolution of many legal issues, including the formalization of legal contracts online and the ability to deal with associated customs and trade-related matters. At its core, the issue of online legal contracts seems to require the use of encryption technology, which has been perhaps the most closely held of all the network-oriented technologies. How can this kind of capability be made available in the international arena in ways that are acceptable to national authorities?

More generally, how can issues like those described above, which are likely to arise in the future, be effectively discussed and resolved? Various subsets of these kinds of problems have arisen in the context of other international public networks, including for telephones, and are thus neither unique nor entirely new. As the Internet continues to grow, many of the approaches developed for earlier technologies may apply to the Internet. Some combination of public and private sector involvement will probably be required to deal with these problems more generally.

Governments have a fundamental role to play in the funding of advanced research and development that can push forward the frontiers of technology and knowledge. Often, this will involve the development and use of pilot projects to test new ideas in the real world. It also seems clear that governments must provide the necessary oversight to insure that the standards-setting process is equitable. Governments must also take responsibility for helping to resolve problems that arise because of independent decisions made by multiple countries, for example in legal, security, or regulatory matters. In the case of U.S.
infrastructure development, the government must provide leadership in many dimensions, including the removal of barriers where they inhibit progress; the insertion of legal, security, or regulatory mechanisms where the national interest so dictates; and the direct stimulation of public-interest sectors, for example in research, education, and certain network aspects of public health, safety, and universal access that require government assistance. Other nations also may find similar incentives for government involvement.

Two final observations seem appropriate. First, it will be essential to separate the process by which standards are selected for the Internet from the process by which the variety of possible options are generated. The current situation is almost ideal, since standards are selected by a process akin to ratification only after independent implementation has produced the viable options. This separation needs to be maintained.

Second, the most important use of the Internet, and indeed the NII, will be to allow individuals to communicate with each other and to rapidly access information. In many cases, this information will be the intellectual property of others. Every Internet user will also have the opportunity to become a potential provider of information services, thereby vastly increasing the amount of information available. How much of this information may be deemed valuable in a literary or business sense remains to be determined, but much of it may be important in other contexts. It is essential that we sensitize individuals to the value of intellectual property and the need to protect it. This will have the side benefit of encouraging others to develop and make available intellectual property of their own. A combination of ethics, technology, and law is needed to ensure the effective development of this important aspect of the Internet.

CONCLUSIONS

Over a span of some 20 years, the role of the U.S. government in the evolution of the Internet has changed. While the federal government took the lead in virtually every aspect of the Internet in the early days, it currently plays a more limited role. The government is now a major funder of network R&D and provides significant oversight of the evolution of the Internet. It provides direct support or even control for several key aspects of the Internet's operation, such as the assignment of unique names and addresses and the assurance of adequate backbone capability, although it may decide to relinquish some of these responsibilities in the future. It continues to stimulate the development of Internet architecture in healthy new directions.

Although the role of the U.S. government in the Internet has been declining steadily for several years, particularly as private-sector interest in the Internet has increased, there is a major continuing set of roles and responsibilities for government to undertake, both in the United States and around the world. Governments must be involved in decisions about how different countries cooperate on various aspects of the Internet and its use, and they must continue to oversee the network's evolution, both nationally and internationally. Other national governments may, but need not, assume the leadership role that the U.S. government has traditionally played in the United States. Without substantial U.S. involvement, however, it is doubtful whether the NII will become a reality.
And without government involvement on an international scale, it is unlikely that a global information infrastructure will emerge or that the Internet will continue to evolve in a vital and dynamic way. Taking a long view, network and computer technologies are still in their infancy, and many of their current uses reflect past practices carried out more effectively in new environments. The real challenge will be for the public and private sectors to work together to harness the still-untapped potential of new and increasingly powerful technologies in the network-based setting of the NII, and to nourish and incubate the powerful, even revolutionary, new ideas that are certain to surface in the future.

---------------------------------------------------------------------

[3] Report From INET98 and IFWP-Geneva
by Jay Hauben
jrh@ais.org

From July 20 to 24, 1998, INET98, the eighth annual conference of the Internet Society (ISOC), was held in Geneva, Switzerland. It was followed on July 24 and 25 by a meeting of the International Forum on the White Paper (IFWP).

The Internet Society was formed in 1992 "to facilitate and support the technical evolution of the Internet as a research and education infrastructure" (Charter of Internet Society, 2A). It has grown with the Internet, and new ISOC chapters are continually being formed throughout the world. Even though the current Internet Society leadership is most concerned with the efforts to commercialize and privatize the Internet, there were many attendees at INET98, especially from developing countries and international bodies, who defended the value of continuing the public Internet. At the Developing Countries Seminar that preceded the main INET98 sessions, frequent comments were made explaining the need for the involvement of public bodies if the Internet is to spread more universally. One argument was that poor urban and rural people anywhere in the world cannot be Internet customers. However, they would benefit from and contribute to the Internet as a communications medium, and the Internet could better integrate them into the rest of the world.

Historically, the vision of the "library of the future" has been a constructive force contributing to the development of network technology and the Internet. Surprisingly, the world library community seemed sparsely represented at INET98. For example, there were education and health tracks but no track or sessions directly addressing the concerns and contributions of libraries and librarians to Internet development. The importance of the Internet to libraries was stressed, however, by a library person I met at the conference from Benin, a country in West Africa. He explained that the university library, one of the largest in his country, possesses only 23,000 books and 340 periodicals. He made it clear how important Internet access to digitized books and journals can be to students and scholars in his country. He also spoke about regional isolation in Benin and the value of e-mail as part of a solution to the communications problems between regions.

There were eight parallel tracks at the conference in addition to the daily plenary sessions. The tracks were: (1) New Applications, (2) Social, Legal and Regulatory Policies, (3) Commerce and Finance, (4) Teaching and Learning, (5) Globalization and Regional Implications, (6) Network Technology and Engineering, (7) User-Centered Issues, and (8) Health.
However, there were no tracks on major public questions like Universal Access, or Community Networks, Freenets and Civic Nets, or the Internet and Democratization, or the history of the Internet. Also, there was no track or discussion on the pros and cons of, or the issues involved in, the proposed privatization of the root server and domain name systems.

One session of the User-Centered Issues track was devoted to Internet use by people with disabilities. The presentations were almost exclusively arguments and appeals that web pages be constructed with great care. Columnar or crowded web pages, or those relying heavily on graphics or illustrations, are difficult or impossible to access for people using special readers. For example, page scanners used by people with limited or no sight read a whole single line sequentially, even when the page is in columns. Also, many current web pages are especially confusing to people who have learning disabilities. The speakers urged web page creators to view their pages with a Lynx text browser or emulator, since many people in the world can only access the world wide web via a text browser. Also, sometimes the use of page scanners and other special equipment is only possible with text browsers. Finally, criticism of frames was made, not only in the discussion of access for people with disabilities but elsewhere in the conference as well. The use of frames, it was pointed out, sometimes excludes access from older equipment, and it also does not allow accurate bookmarking or easy printing, defeating some of the value of the web.

A technical session on "Quality of Service" covered differentiated service. Current routers are not yet programmed to queue arriving packets according to classes of service, but they can be. Depending, for example, on how much a sender pays, his or her packets could be given priority over the packets of lower paying senders. This new scheme would allow high-bandwidth applications priority treatment, while e-mail or library search packets would be queued for later transmission or retransmission. The lower paying users might experience greater delays, but real time audio or video might be more successful. Supporters of such differentiated service admitted that the creation of classes of messages is contrary to the history and technology of the Internet, which up until now has been egalitarian, but they argued that the technology allows for classes and that there are companies that feel they can find customers who will pay higher charges to get higher priority. Such an important change, it would appear, should not be undertaken without hearing from the whole spectrum of users and future users; nor could it be implemented without the consent of most of the networks which interconnect to make up the Internet. The question remained how such a change would get decided and whether it would only be possible via coercion.
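To make concrete what was being proposed, here is a minimal sketch in Python of class-based queueing. The class names and packet contents are invented for illustration; real differentiated-service proposals operate on fields in the packet headers themselves. Packets in a higher paying class leave the router first, regardless of arrival order:

    import heapq

    # A toy class-based ("differentiated service") router queue.
    # Lower priority number = higher class = transmitted first.
    # Class names here are invented for illustration.
    PRIORITY = {"premium": 0, "standard": 1, "bulk": 2}

    class ClassedQueue:
        def __init__(self):
            self.heap = []
            self.arrival = 0   # tie-breaker keeps arrival order in a class

        def enqueue(self, service_class, packet):
            heapq.heappush(self.heap,
                           (PRIORITY[service_class], self.arrival, packet))
            self.arrival += 1

        def transmit_all(self):
            while self.heap:
                _, _, packet = heapq.heappop(self.heap)
                print("sending:", packet)

    q = ClassedQueue()
    q.enqueue("bulk", "e-mail message")            # arrives first
    q.enqueue("premium", "real-time video frame")  # arrives later, sent first
    q.transmit_all()

Under congestion, everything in the "bulk" class simply waits behind the "premium" traffic. That departure from the first-come, first-served treatment packets have historically received is exactly what was at issue in the session.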
A number of sessions discussed the Internet II project. In this project over 130 U.S. academic and non-academic organizations have joined together to develop a new network that would achieve speeds or bandwidth up to 1000 times that of the current Internet. Academic institutions can join the Internet II consortium for a contribution between $500,000 and $2,000,000, which severely limits participation to the better endowed institutions. Commercial entities can join for a contribution of $25,000, usually in kind. The purpose of the Internet II project is to insure that educational and research users would still have a network even if the current trend toward commercialization and privatization of the Internet might marginalize their access to the current Internet. The strategy is to connect the consortium members with their own network, not compatible with the Internet, and then win the rest of the world over to their protocols. However, this bifurcation of the Internet may not be easily repairable. E-mail, chat, and other common uses of the Internet would stay on Internet I until Internet II protocols were adopted by everyone, which also limits the value of Internet II.

Despite the rather narrow session topics, the great success of INET98 was the gathering of people from all over the world with overlapping interests in the Internet and its future. Many people were disappointed in the level of the presentations and their lack of historical perspective or technical depth. But there was a tremendous exchange of business cards and e-mail addresses, and a sense that the Internet was creating a world community and spreading a new communications technology that could help interconnect the peoples of the world, if the communications essence of the Internet were to continue and spread.

The International Forum on the White Paper, a one and a half day meeting held after the INET conference ended, was not a planned extension of INET98 but a last minute event. The U.S. government has had oversight and control of the domain name and root server systems that allow all users on the Internet to send messages and packets to each other no matter where they are. This is achieved via a conversion of domain name addresses into numeric addresses.
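What that conversion amounts to can be seen from any machine on the Internet. The short Python sketch below asks the local resolver, and behind it the domain name system rooted in those root servers, for the numeric address of a name; the host name shown is just an example, and any registered name would do:

    import socket

    # Convert a human-readable domain name into the numeric (IP)
    # address that the network actually uses to deliver packets.
    name = "www.ais.org"    # an example name; any registered name works
    print(name, "->", socket.gethostbyname(name))

Whoever administers the root of that lookup chain ultimately determines which names can be resolved at all, which is why control of the root server system carries so much weight.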
In a White Paper issued June 5, the U.S. government confirmed its intention to end this historic role on September 30 of this year. The White Paper, presented by presidential advisor Ira Magaziner, had as its purpose the formation of a new private entity to control and manage the root server and domain name systems, which are the central control and nerve center of the Internet.

The IFWP meeting in Geneva was organized to approve and help give international support and form to the new private organization. The method used to achieve such support was to disallow any opposition to privatization. The sessions were chaired in such a way that all opposition and most discussion were discouraged, and there were frequent calls for a consensus. Even when it appeared that as many as half or more of those present were confused about, or openly opposed to, proposed structures or powers of the new body, the chairs often declared that consensus had been achieved and that the next issue was in order. Since the changes being proposed concern the future of the Internet, e.g., whether it would be the interconnection of different networks or of only those networks adhering to commercial concerns about security, they require careful consideration and the hearing of points of view from across the Internet user spectrum. But the IFWP meeting was not set up to allow such democratic procedure. The meeting ended with the declaration by the organizers that a large degree of consensus had been achieved. Those who opposed or disagreed with the process or the purpose of privatization of the nerve center of the Internet left the meeting very frustrated. Another such meeting was planned by the IFWP for Singapore in mid-August, while other follow-up meetings and activities were planned by other forces.

The value of these IFWP meetings is that they have alerted a body of people to the significant changes that are being planned for the Internet.

-----

More discussion on the proposed privatization of the domain name and root server systems of the Internet can be seen in the Amateur Computerist July 1998 Supplement, "Controversy Over the Internet," at http://www.columbia.edu/~jrh29/acn/dns-supplement.txt and http://www.ais.org/~jrh/acn/dns-supplement.txt, and by e-mail from jrh@ais.org. Comments are welcomed.

-------------------------------------------------------------------

The following articles by Ronda Hauben appeared as "Privatizing the Internet? A Call to Arms" in COUNTERPOISE, Vol 2 No 4, Oct 1998 (an issue which actually appeared in March 1999), published by the Alternatives in Print Task Force of the Social Responsibilities Round Table of the American Library Association.

[4] The Internet: Public or Private?

[Editor's Note: The following four articles were part of the ongoing battle to challenge the plan of the U.S. government to privatize the essential functions of the Internet. Instead of determining the proper role for it to play, the U.S. government is creating a tangle of illegitimate activities. These articles indicate some of the nature of the problems that are being created.]

Something important is happening. The cooperative and open processes and culture that make the Internet a public treasure have their enemies. A contest is going on now where the stakes are high. Will the Internet be able to continue as an open, global internetwork of networks where diversity is encouraged and communication among people of all ages and from a multitude of backgrounds is made possible? Or will the Internet be transformed into the corporate vision of a large arena for buying and selling and other commercial transactions? The Internet vision allows all to coexist, but the commercial vision will exclude anything but the commercial aims and will require fundamental changes in the nature of the Internet itself.

The contest now being waged is over the issue of the privatizing of the Domain Name System and other central and controlling functions of the Internet. Several documents follow. They document the recent struggle to maintain the Internet, and to resist the commercial pressure that certain corporate interests are exerting on the U.S. government to turn these essential functions over to the private sector for their benefit.

The Internet is a place where there is a diversity of networks, a diversity of computers, and a diversity of users. It is an internetwork of networks which fosters communication among many, and they benefit from this diversity. Also, the Internet is based on open code and open and cooperative processes. The processes, however, that have been used by the U.S. government to create a new privatized corporation to own, control, and administer Internet domain names, numbers, the root server, and the protocols for the Internet have been conducted in secret and via exclusive and closed activities. There has been widespread criticism of the way that the bylaws and articles of incorporation for this new private corporation have been created by a nonpublic and secret process, and also of how the selection of those chosen for the Interim Board of Directors was carried out. In response to such complaints, the U.S. Department of Commerce required that the Internet Corporation for Assigned Names and Numbers (ICANN) hold an open meeting in Boston on November 14, 1998.
About 200 people from the international Internet community attended, as did some members of the press. At the meeting there was a wide-ranging set of complaints about how and why ICANN had been created and what it was doing. Several people pointed out that what was needed was an international public utility, rather than a private sector corporation. The newspaper coverage of the meeting was more extensive than any that had appeared hitherto, and many of the press accounts indicated the large amount of dissatisfaction with ICANN's secret origins and nondemocratic practices. Headlines that appeared in the press following the meeting included the following (I have indicated the URL where possible):

"New Internet Board Hears Plenty of Skepticism", New York Times, November 14, 1998, http://www.nyt.com/

"Internet Governance Board Confronts a Hostile Public", New York Times, November 16, 1998, http://www.nyt.com/

"A Kind of Constitutional Convention for the Internet", Cyberlaw Journal, October 23, 1998, New York Times on the Web.

"Top Candidate for Internet Governance Entity Expects Federal Govt. Approval Within Week", BNA, http://www.bna.com/e-law/

"Debate Flares Over Group That Hopes to Oversee the Internet", The Chronicle of Higher Education, November 27, 1998, p. A21, http://www.chronicle.com/weekly/v45/i14/14a02101.htm

Another interesting press account was "Who is Running this Joint?" in Forbesdigital, November 30, http://www.forbes.com/tool/html/98/nov/1130/feat.htm

A transcript of the November 14, 1998 ICANN meeting is online at http://cyber.law.harvard.edu/archive. Also, comments presented before and after the meeting are online at http://cyber.law.harvard.edu/icann/archive/#comments.

On November 25, 1998, a Memorandum of Understanding was signed between the U.S. Department of Commerce and ICANN to design and test mechanisms, methods, and procedures to carry out the DNS functions. This MoU is online at http://www.ntia.doc.gov/ntiahome/domainname/icann-memorandum.htm.

There have been some interests pressuring the U.S. government to carry out an immediate transition to the private sector. Others have proposed reasoned consideration to determine a new management structure. Also, there are voices urging the need for a continued U.S. government role in the ownership, management, and control of these important and controlling functions of the Internet. The NTIA-ICANN MoU presents a plan for designing a new structure, while maintaining government participation in the process. Thus the battle over what is happening continues. For now, the U.S. government is supposed to be maintaining a role in the design and test of a private sector corporate entity to take over these essential functions of the Internet. However, it is unclear what the current U.S. government role is, or whom in the U.S. government to contact with complaints.

The U.S. Congress has held hearings about the transfer of these essential Internet functions to the private sector. There is a set of testimony presented to the U.S. House of Representatives Committee on Science, Subcommittee on Technology and Subcommittee on Basic Research, which concerns these issues; this testimony is helpful in identifying some of the different positions taken in considering what the U.S. government should do. The House testimony is online at:

http://www.house.gov/science/hearing.htm#Basic_Research

The hearings were on September 25, 1997, March 31, 1998, and October 7, 1998. The testimony of Robert E.
Kahn on March 31, 1998, for example, contains important history about the role played by the U.S. government in the creation and development of the Internet. Kahn played a pioneering role in designing and building the ARPANET, and then in the creation of TCP/IP and in designing and building the Internet. The URL is:

http://www.house.gov/science/kahn_03-31.htm

The DNS battle has turned into a battle over the soul of the Internet. The Internet makes it possible to have networks communicating, and therefore people communicating. It provides for a diversity of computers, a diversity of users, and a diversity of networks. And they are all able to cooperate and collaborate. The current actions of the U.S. government to transfer controlling functions of the Internet to the private sector have raised the issue of who should be making the decisions about what happens in the present and future of the Internet. The earliest networking pioneers welcomed all views and welcomed all to participate in discussing the issues. Decisions were made by the relevant communities at a grassroots level. It was understood that pro and con ideas were needed for the broad-ranging discussion required to make reasoned and well-founded decisions.

The current situation is that the Internet is made up of many different networks. There are, however, certain centralized functions, and there is a need to administer them. To do this, great responsibility and skill are needed. Since the Internet is not anarchic, and there are central points of control, great care and responsibility must be exerted, or there is the great possibility of abuse of users. Therefore the question of how to make decisions about the Internet has become an urgent issue to be solved. It requires the consideration of all who value the Internet. There are various models one can use to figure out how to make decisions. However, as the Internet is a unique new medium of worldwide communication, it is important to consider what means have grown up with or as part of the Internet that can be helpful in solving this problem.(1)

Commercial pressure to allow some small sector of the corporate world to take control of these essential Internet functions makes it difficult for those who care about the future of the Internet to take the needed care to solve the problem. Recognizing that this kind of problem would develop, farsighted computer pioneers in the 1970s like J.C.R. Licklider and Harold Sackman proposed that the development of an internetwork of networks would catch the public by surprise and that providing for the public interest would pose an important challenge.(2) They proposed that there would be the need to determine the kind of regulation needed so that the public interest would be protected. Just as they predicted, the social institutions have lagged behind the current developments.

Therefore, it is of the utmost importance that those users who are interested in the Internet as an internetwork of networks, available to all and including all the possible diversity of people and computers and networks, take it upon themselves to learn about this issue and to help spread an understanding of why it is so important. Also, the greatest possible participation of the most diverse set of users is needed to determine how to solve the current problems.(3) There is a great need for a broad-ranging public discussion of the issues involved in these changes. This is the challenge.
The many wonderful experiences and uses of the various users around the world who are able to participate online are the gift to be won or lost as a result of this contest. The current battle has made some progress, but battalions of reinforcements are needed to win the war.

--------
Notes:

(1) See for example the online means of decision making that are described in Netizens: On the History and Impact of Usenet and the Internet by Michael Hauben and Ronda Hauben, IEEE Computer Society Press, 1997. A draft is online at http://www.columbia.edu/~rh120/

(2) See The Information Utility and Social Change, edited by H. Sackman and Norman Nie, AFIPS Press, Montvale, N.J., 1970, pg. 71. See also The Internet: A New Communications Paradigm, by Ronda Hauben, http://www.ais.org/~ronda/new.papers/internet.txt

(3) See http://www.columbia.edu/~rh120/other/talk_governance.txt

-------------------------------------------------------------------

[5] Report from the Front:
Meeting in Geneva Rushes to Privatize
the Internet DNS and Root Server Systems
by Ronda Hauben
ronda@panix.com

There is a battle being waged today, one that is of great importance to the future of society, but most people have no idea it is taking place. On July 29, I returned from Geneva, Switzerland, where a meeting was held Friday, July 24 and Saturday, July 25 to create the organization that Ira Magaziner, advisor to the U.S. President, has called for. It is an organization to privatize key aspects of the Internet: the Domain Name System (DNS) and the control of the root server of the Internet. The meeting was the second in a series that are part of the International Forum on the White Paper (IFWP).(1)

The U.S. government, with very little discussion by the U.S. Congress, the press or the public, and contrary to the direction of the U.S. Federal District Court (in the case ACLU vs. Reno), is throwing a bone to the private sector and offering them the possibility of making their millions off of the Internet. And while in Geneva, I saw folks from several different countries grabbing at the bone, in hopes of getting themselves some of the same kind of exorbitant profits from selling gTLDs (generic Top Level Domains) that the National Science Foundation (NSF) bestowed on Network Solutions, Inc. (NSI) several years ago by giving them the contract enabling them to charge for domain name registration.

There is money to be made, or so these folks seem to think, and so any concern for the well being of the Internet or its continued development as "a new medium of international communication" (ACLU vs. Reno) has been thrown to the wind by Mr. Magaziner, by IANA (Internet Assigned Numbers Authority) under the direction of Mr. Postel, which has the U.S. government contract to administer the Internet addresses and names and to administer the root server, and by the others who, without any ethical considerations or social obligations, are rushing through this process and squelching discussion and dissent. It is called "consensus", we are told.

I went to the session setting up the Names Registry Council provisions for the bylaws of what we are told is to be the new private organization controlling these key aspects of the Internet. At the beginning of the meeting, I made the mistake of objecting when all were asked to register their consensus with the provision for a Names Council. I wanted to hear some discussion so I would know what I was voting on. I was scolded by one participant for asking for a discussion.
He claimed that they were *not* there for people who had not read the bylaws proposal, which had appeared online only a few days before. I had read the bylaws proposal, but was naive enough to think that one would hear discussion and clarification before being asked to declare one's adherence. In that way, I thought, one would know what one was agreeing to. Instead, however, I soon learned that that was *not* how business (or really religion) was being conducted in the session I attended.

After I was harassed for asking for clarification and discussion, the meeting continued. The Chairman asked people to brainstorm and list the functions for the council. When I asked that the activities of the council be reported online and that there be online discussion, with anyone interested being allowed to comment on all issues concerning the council, the scribe miswrote what I had proposed. When I asked that it be corrected, I was told by the Chair that there was no "wordsmithing" allowed, i.e. that it would not be corrected.

After a number of people had listed functions for the council, it was announced that the meeting would vote on the functions to determine if there was "consensus". Then a vote was rammed through on the items. However, instead of counting the numbers for or against each function, there was a declaration of "consensus" if, we were told, it seemed as if 60% of those voting had voted for the listed function.

For the first few functions, those opposed were allowed to voice their objection. The meeting was being tape recorded, we were told, and there would be a record kept of it. But that soon ended, as someone in the room objected to hearing any objections. The Chair said that this was how this was done at the telecom meetings he knew of, as there the players were large corporations with large bank accounts that could afford big law suits. Here, however, it seemed those in control of the meeting judged this was not the case.

A short break was called. After the break it was announced that those with objections could no longer voice them on the record during the meeting, but were told to come up after the meeting was over. So the vote continued on. Consensus continued to be declared for most of the items voted on, despite the fact that there were those indicating their opposition to all of these items. But the record would no longer contain any note of the objections. The Chair and others marveled at the roll they were on. Even though it was time for the meeting to end, one of the Chairs of the Plenary Meeting allowed this meeting to continue, as it was on such a roll.

Then came the Plenary meeting. Here there was joy and praise for this democratic process from the Chair and spokespersons from the different sessions. When I tried to go to the microphone and say that the consensus in the session I had been in to determine functions for the Names Council represented "no discussion allowed and no noting of those who objected," the Chair of the Plenary Meeting told me I was not allowed to speak there.

This all followed the invitation that had been extended at the press lunch on Tuesday, July 21 at INET, where all members of the press were invited to come to the Friday and Saturday sessions of the IFWP and were invited to participate. However, by Friday and Saturday the invitation clearly had changed, especially if one had a question or objection to raise about what was happening.
And this is how the supposed new private organization that is to administer and make policy for the Domain Name System, the nerve system of the Internet, and for the Root Server System, is being created. No one with anything but a private commercial interest (in normal language, a conflict of interest) is to be allowed to participate in the process, no discussion to clarify what people are being asked to vote on is allowed to take place, and no objections can be voiced in the session creating the Names Council, which is one of the crucial aspects of the organizational form, as it is the groups with a commercial interest in the sale of gTLDs who have decreed to themselves the right to set policy and recommend actions regarding the gTLDs.

What is the significance of this process as a way to create an organization to take over control and administration of the nerve center of the global Internet?

The Internet was developed and has grown and flourished through the opposite procedures, through democratic processes where all are welcomed to speak, where those who disagree are invited to participate and to voice their concerns along with those who agree, and where those who can make a single contribution are as welcome as those with the time to continually contribute. (See poster, "Lessons from the early MsgGroup Mailing List as a Foundation for Identifying the Principles for Future Internet Governance" by Ronda Hauben, INET'98.)(2) Also, historically, the processes for discussion on key issues regarding the development of the Net are carried out online, as a medium of online communication is what is being built.

This is all the opposite of what is happening with the privatizing of the DNS and the throwing of it to the corporate interests who are the so-called "market forces". Here only those who can afford thousands of dollars for plane fare can go to the meetings, and once at the meetings, one is only allowed to participate in a way that registers agreement. At the sessions I attended there was no discussion permitted, so no one knows if what they think they are voting on is indeed what it appears to be, and there is no opportunity to clarify one's views on an issue, as there is no chance to discuss the pros and cons. And for those for whom English is not the first language, or for someone who disagrees with what is happening, there is mockery and the attempt to make them feel unwelcome.

This is *not* the way to create a new and pioneering organization to administer and control the nerve center of an international public communications infrastructure that has been built with the tax money and effort of people around the world. When those who have questions, or who think what is happening is a problem, are not allowed to speak, it means that there is no way to know what the problems are that need to be solved, or what can be proposed that can offer any solution.

The U.S. government has initiated and is directing this process with no regard for the concerns and interests of the people online or not yet online. Instead, only those with profit making blinders over their eyes are able to stand the glare this rotten process is reflecting. During his speech at the opening session of the IFWP in Geneva, Mr. Ira Magaziner said that the U.S. government no longer has any obligation to protect the well being of the people in the U.S., and he left the room, claiming that the U.S. government would not be involved in the process to create the new organization.
But the bylaws of the new organization, made available only a few days before the meeting, and thus not early enough for those traveling to the meeting to have had a chance to study or discuss them, were presented by IANA and its lawyer. IANA is the U.S. government contractor proposing the structure of this new "private" organization. Thus the U.S. government is deeply involved in this process, but not in any way that fulfills its obligation to provide for the well being of the American people.

Meanwhile there is a lawsuit against the NSF brought by a company which sees itself as the MCI of the Internet. The lawsuit claims that anyone who wishes should be able to go into business creating gTLDs. The fact that the DNS is a hierarchical architecture designed to keep the number of root level lookups for the Internet at a minimum is irrelevant to those bringing the lawsuit and to the U.S. government, which is offering private sector corporations the chance to compete in selling root level gTLDs. (A short sketch following the notes to this article illustrates this hierarchical design.) And the primary function rammed through at the July 25 meeting was that the Names Council is being created to make policy and recommendations for how to increase the number of gTLDs, despite the fact that those proposing this structure had a commercial self interest in the issues and thus a conflict of interest in being involved in proposing or setting public policy regarding the future of the Internet.

This is the degeneration that the U.S. government's pro commercial policy on the future development of the Internet has led to. There is no concern by Magaziner for the fact that millions of dollars of U.S. taxpayer money (and taxpayer money of people around the world) and effort have gone to create and develop the Internet. The policy of the U.S. government is to try to stop the use of the Internet as a medium of international communication for ordinary people and to deny its technical needs and processes. This is contrary to the directive of the U.S. court that the U.S. government "should also protect the autonomy that such a medium confers to ordinary people as well as media magnates." (ACLU vs. Reno)

The next meeting of the IFWP is set for Singapore in August 1998. Magaziner has given this ad hoc, self appointed group a deadline to have an interim organization in place by September 30. So the Internet is to be auctioned off as officials in the U.S. government oversee the grabfest.

But there are people who care about the Net and its continued growth and development as a medium of international communication. And it is in the hands of these Netizens that any future health of this crucial communications infrastructure, which makes possible an unprecedented level and degree of international communication, must rest. The public needs to know what is going on, and it is important that Netizens find a way both to intervene in this give away of public property and to let the rest of the world know what is happening.

-----
Notes:

(1) The White Paper was issued by the U.S. government. It begins: "On July 1, 1997, as part of the Clinton Administration's 'Framework for Global Electronic Commerce' the President directed the Secretary of Commerce to privatize the domain name system (DNS) in a manner that increases competition."

(2) Write to ronda@panix.com for a copy of the poster. Also see "Netizens: On the History and Impact of Usenet and the Internet", http://www.columbia.edu/~hauben/netbook/ or in print edition ISBN 0-8186-7706-6.
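To make the point about hierarchy concrete, here is a minimal sketch in Python of how a resolver that caches referrals consults a root server only once per top level domain, however many names it then looks up under that domain. The zone data and names below are invented for illustration; they are not the actual root zone.

    # A toy model of hierarchical DNS resolution. The zone data is
    # illustrative only, not real root server data.
    ROOT_ZONE = {"com": "a.gtld-servers.net", "edu": "c.edu-servers.net"}

    root_queries = 0     # count of trips to a root server
    referral_cache = {}  # top level domain -> server learned from the root

    def find_tld_server(name):
        """Return the server for the top level domain of 'name',
        asking the root only if that domain has not been seen before."""
        global root_queries
        tld = name.rsplit(".", 1)[-1]
        if tld not in referral_cache:
            root_queries = root_queries + 1       # one trip to the root
            referral_cache[tld] = ROOT_ZONE[tld]  # cache the referral
        return referral_cache[tld]

    for host in ["example.com", "mit.edu", "panix.com", "columbia.edu"]:
        find_tld_server(host)

    print(root_queries)  # prints 2: four lookups, but only two root queries

Four names are resolved, but the root is consulted only twice, once per top level domain. The hierarchy, together with caching, is what keeps the load on the root servers at a minimum even as the number of names grows.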
-----------

The above report appeared as an appendix in the online version of the Amateur Computerist, July 1998 Supplement, "Controversy Over the Internet", available at: http://www.ais.org/~jrh/acn/dns-supplement.txt or via e-mail from jrh@ais.org

----------------------------------------------------------------------

[6] The Internet an International Public Treasure:
A Proposal
by Ronda Hauben
ronda@panix.com

Preface

In testimony before the Subcommittee on Basic Research of the Committee on Science of the U.S. House of Representatives on March 31, 1998, Robert Kahn, co-inventor of TCP/IP, indicated the great responsibility that must be taken into account before the U.S. government changes the administrative oversight, ownership and control of essential aspects of the Internet that are part of what is known as the Domain Name System (DNS). Kahn indicated that "the governance issue must take into account the needs and desires of others outside the United States to participate." His testimony also indicated a need to maintain "integrity in the Internet architecture including the management of IP addresses and the need for oversight of critical functions."

He described how the Internet grew and flourished under U.S. government stewardship (before the privatization -- I wish to add) because of two important components:

1) The U.S. government funded the necessary research.

2) It made sure the networking community had the responsibility for its operation, and insulated it to a very great extent from bureaucratic obstacles and commercial matters so it could evolve dynamically.

He also said that "The relevant U.S. government agencies should remain involved until a workable solution is found and, thereafter retain oversight of the process until and unless an appropriate international oversight mechanism can supplant it." And Kahn recommended insulating the DNS functions which are critical to the continued operation of the Internet so they could be operated "in such a way as to insulate them as much as possible from bureaucratic, commercial and political wrangling."

When I attended the meeting of the International Forum on the White Paper (IFWP) in Geneva in July, a meeting set up by the U.S. government to create the private organization that is to take over these essential DNS functions by September 30, 1998, none of the concerns that Kahn raised at this Congressional hearing were indicated as concerns by those rushing to privatize these critical functions of the global Internet. I wrote and circulated a report about the political and commercial pressures that were operating in the meeting to create the Names Council that I attended. [See in this issue "Report from the Front: Meeting in Geneva Rushes to Privatize the Internet DNS and Root Server Systems".]

But what is happening now with the privatization plan of the U.S. government involves privatization of the functions that coordinate the international aspects of the Internet, and thus the U.S. government has a very special obligation to the technical and scientific community, to the U.S. public, and to the people of the world to be responsible in what it does. I don't see that happening at present.

A few years ago I met one of the important pioneers of the development of time-sharing, which set the basis for the research creating the Internet. This pioneer, Fernando Corbato, suggested I read the book Management and the Future of the Computer, which was edited by Martin Greenberger, another time-sharing pioneer.
The book was the proceedings of a conference on the future of the computer held at MIT in 1961 to celebrate the centennial anniversary of MIT. The British author Charles Percy Snow made the opening address at the meeting, and he described the importance of how government decisions would be made about the future of the computer. Snow cautioned that such decisions must involve people who understood the problems and the technology. And he also expressed the concern that the smaller the number of people involved in making important government decisions, the more likely it would be that serious errors of judgment would be made.

Too small a number of people are being involved in this important decision regarding the future of these strategic aspects of the Internet, and too many of those who know what is happening and are participating either have conflicts of interest or other reasons why they are not able to consider the real problems and technological issues involved. (About the 1961 conference, see chapter 6 of Netizens at http://www.columbia.edu/~rh120 )

What is happening with the process of the U.S. government privatization of the Domain Name System is exactly the kind of danger that C.P. Snow warned against. I have been in contact with Ira Magaziner, senior advisor to the U.S. President on policy, about these concerns, and he asked me to write a proposal or find a way to put my concerns into some "operational form." The following draft proposal for comment is my beginning effort to respond to his request.

Proposal

Toward an International Public Administration
of Essential Functions of the Internet:
The Domain Name System
Ronda Hauben
ronda@panix.com

Recently, there has been a rush to find a way to change significant aspects of the Internet. The claim is that there is a controversy that must be resolved about what should be the future of the Domain Name System. It is important to examine this claim and to try to figure out if there is any real problem with regard to the Domain Name System (DNS) that has to be solved.

The Internet is a scientific and technical achievement of great magnitude. Fundamental to its development was the discovery of a new way of looking at computer science.(1) The early developers of the ARPANET, the progenitor of the Internet, viewed the computer as a communication device rather than only as an arithmetic engine. This new view, which came from research conducted by those in academic computer science, made the building of the ARPANET possible.(2) Any changes in the administration of key aspects of the Internet need to be guided by a scientific perspective and principles, not by political or commercial pressures. It is most important to keep in mind that scientific methods are open and cooperative.

Examining the development of the Internet, an essential problem that becomes evident is that the Internet has become international, but the systems that allow there to be an Internet are under the administration and control of one nation. These include control over the allocation of domain names, over the allocation of IP addresses, over the assignment of protocol numbers and services, as well as control over the root server system and the protocols and standards development process related to the Internet. These are currently under the control and administration of the U.S. government or contractors to it.
Instead of the U.S. government offering a proposal to solve the problem of how to share the administration of the DNS, which includes central points of control of the Internet, it is supporting and encouraging the creation of a new private entity that will take over and control the Domain Name System. This private entity will magnify many thousandfold the commercial and political pressures and prevent solving the genuine problem of creating an internationally shared protection and administration of the DNS, including the root server system, IP number allocations, Internet protocols, etc.

Giving these functions over to a private entity will make it possible for these functions to be changed and for the Internet to be broken up into competing root servers, etc. It is the DNS whose key characteristic is to make the internetwork of networks one Internet, rather than competing networks with competing root server systems. What is needed is a way to protect the technology of the Internet from commercial and political pressures, so as to create a means of sharing administration of the key DNS functions and the root server system.

The private organization that the U.S. government is asking to be formed is the opposite of protecting the Internet. It is encouraging the takeover by a private, non-accountable corporate entity of the key Internet functions and of this international public resource.

In light of this situation, the following proposal is designed to establish a set of principles and recommendations on how to create an international cooperative collaboration to administer and protect these key functions of the Internet from commercial and political pressures. This proposal is to create a prototype for international cooperation and collaboration to control and support the administration of these key Internet functions.

I. The U.S. government is to create a research project or institute (which can be in conjunction with universities, appropriate research institutes, etc.). The goal of this project or institute is to sponsor and carry out the research to solve the problem of what should be the future of the DNS and its component parts, including the root server system.

II. The U.S. is to invite the collaboration (including funding, setting up similar research projects, etc.) of any country or region interested in participating in this research. The researchers from the different nations or regions will work collaboratively.

III. The researchers will, as much as possible, utilize the Internet to carry out their work. Also, they will develop and maintain a well publicized and reachable online means to support reporting and getting input into their work. They should explore Usenet newsgroups, mailing list and web site utilization, and, where appropriate, RFC's, etc.

IV. With clearly set dates for completion, the collaborative international research group will undertake the following:

1) To identify and describe the functions of the DNS system that need to be maintained. (The RFC's or other documents that will help in this need to be gathered, and references to them made available to those interested.)
2) To examine how the Internet, and then how the DNS system and root server system, are serving the diverse communities and users of the Internet, which include, among others, the scientific community, the education community, the librarians, the technical community, governments (national as well as local), the university community, the art and cultural communities, nonprofit organizations, the medical community, the business community, and most importantly the users, whoever they be, of the Internet.

3) To produce a proposal at the end of a specified finite period of time. The proposal should include:

a) an accurate history of how the Internet developed, and how the Domain Name System developed and why.

b) a discussion of the vision for the future of the Internet that their proposal is part of. This should be based on input gathered from the users of the Internet, and from research into the history and development of the Internet.

c) a description of the role the Domain Name System plays in the administration and control of the Internet, how it is functioning, and what problems have developed with it.

d) a proposal for its further administration, describing how the proposal will provide for the continuation of the functions and control hitherto provided by U.S. government agencies like NSF and DARPA. Also, problems for the further administration should be clearly identified and proposals made for how to begin an open process for examining the problems and solving them.

e) a description of the problems and pressures that they see that can be a danger for the DNS administration, along with recommendations on how to protect the DNS administration from succumbing to those pressures (for example, pressures that are political or commercial). In the early days of Internet development in the U.S., there was an acceptable use policy (AUP) that protected the Internet and the scientific and technical community from the pressures of political and commercial entities. Also in the U.S., government funding of a sizeable number of the people who made up the computer science community protected those people from commercial and political pressures.

f) a way for the proposal to be distributed widely online; the public not online should also have a way to have access to it. It should be made available to people around the world who are part of or interested in the future development of the Internet. Perhaps help with such distribution can come from international organizations like the ITU, from the Internet Society, the IETF, etc.

g) comment on what has been learned from the process of doing collaborative work to create the proposal. It should identify as much as possible the problems that developed in the collaborative efforts. Identifying the problems will help clarify what work has to be done to solve them.

h) some agreed upon way to keep this group of researchers free from commercial and political pressures. Government funding of the researchers is one possible way, and perhaps they can work under an agreed upon Acceptable Use Policy for their work and funding.

This proposal is an effort to figure out a real way to solve the problem that is the essential problem in the future administration of the Internet. If the principles and prototype can be found to solve this problem, they will help to solve other problems of Internet administration and functioning as well.
--------
Notes:

(1) See Michael Hauben, "Behind the Net: The Untold Story of the ARPANET and Computer Science", in Netizens: On the History and Impact of Usenet and the Internet, IEEE Computer Society Press, 1997, p. 109. See also "Internet, nouvelle utopie humaniste?" by Bernard Lang, Pierre Weis and Veronique Viguie Donzeau-Gouge, Le Monde, September 26, 1997, as it describes how computer science is a new kind of science and not well understood by many. The authors write (translated from the French): "Computer science is at one and the same time a science, a technology, and a set of tools. In its current practice, the introduction of computing in the schools, and unfortunately often in the universities, is open to criticism because it maintains the confusion among these three components."

(2) Ibid.

--------------------

[To discuss the draft DNS proposal "The Internet an International Public Treasure" and other related issues, such as the future of the Internet as a new medium of worldwide communication and how to alert others about the current U.S. government privatization plans, you can join the Netizens mailing list. To join the list, send e-mail to: netizens-request@columbia.edu and in the body of the message write: subscribe

The draft proposal "The Internet an International Public Treasure" is online in English and French at: http://www.columbia.edu/~ronda/other/ ]

--------------------------------

Submitted to the NTIA of the U.S. Department of Commerce by Ronda Hauben, co-author of Netizens: On the History and Impact of Usenet and the Internet, published by the IEEE Computer Society Press, 1997, ISBN 0-8186-7706-6

-------------------------------------------------------------------------

[7] Testimony before the Subcommittee on Basic Research and
Subcommittee on Technology of the Committee on Science
on the subject of Internet Domain Names
Rayburn House Office Building
U.S. House of Representatives
Washington, D.C. 20515
by Ronda Hauben
researcher, writer, co-author
Netizens: On the History and Impact of Usenet and the Internet
October 7, 1998

INTRODUCTION

I am pleased to be invited to submit testimony to the House Science Subcommittee on Basic Research and Subcommittee on Technology on the subject of whether the Domain Name System and related essential functions of the Internet should be transferred from U.S. government oversight to a private sector corporate entity.

My name is Ronda Hauben. I am co-author of the book Netizens: On the History and Impact of Usenet and the Internet, published in May 1997 by the IEEE Computer Society Press. I am also an editor and writer for the Amateur Computerist newsletter, which has covered the history and importance of the Internet since 1988. I have studied and taught computer programming, and I have participated online since 1988 and on Usenet since 1992.

Also, I submitted the proposal "The Internet an International Public Treasure" to Ira Magaziner and the U.S. Department of Commerce at the request of Mr. Magaziner, based on the concerns I presented to him about the narrow phrasing of the question of the transfer of the Domain Name System to the private sector. I also responded to the Green Paper and submitted comments expressing concern that the general nature of the Internet, its history and traditions, and its nature as a communication medium were being lost sight of in the Framework for Electronic Commerce issued by Mr. Magaziner and his staff, and in the Green Paper and subsequent White Paper.
And I attended the Geneva IFWP meeting in July 1998 and wrote an account of what happened there in the article "Report from the Front: Meeting in Geneva Rushes to Privatize the Internet DNS and Root Server System".(1)

The proposal that I wrote and submitted to Mr. Magaziner on September 4, 1998, is now one of the three proposals that have been posted at the U.S. Department of Commerce web site by the NTIA with a request for comments. As you can see from my proposal, I have found your hearing process valuable and have referred to testimony given by one of the witnesses in this matter in the Preface to my proposal.

I want to commend the committee both for holding these hearings and for putting the testimony received on the committee's web site. I want to make a further recommendation, however. I want to recommend that you explore having an online discussion group. There the public could comment on the issues before the Committee and on the testimony received, or offer additional information or viewpoints into the public record, so that you will have a broader set of information and viewpoints to influence your deliberations, especially when those deliberations concern the operation and future of the Internet. I hope that after you hear the rest of my comments you will understand better why this is so important.

HISTORY OF THE INTERNET

First, I would like to offer a bit of history of how the Internet came to be, and I will endeavor to show how knowing this history will be helpful in determining how to evaluate the proposals before the NTIA. Then I will provide some recommendations toward the policy decision that this Committee and the NTIA are proposing to make.

The Internet is a product of several significant and successful research projects that were conducted under funding from the Advanced Research Projects Agency (ARPA) in the 1960s and 1970s. One of the earliest of these projects is perhaps one of the most important in its relevance to the problem before this committee today. That project was the creation and support of interactive computing and time-sharing.

In 1962-63, a computer scientist and engineering researcher, J.C.R. Licklider, was invited to join ARPA and to begin the Information Processing Techniques Office (IPTO). At that time the common form of computing available was known as batch processing, using large mainframe computers. Someone who wanted to run a program would bring a stack of punch cards to a computer center and return several hours later or the next day to retrieve the printout that the program generated, to see if the program had achieved the desired aim. Needless to say, this was a cumbersome and frustrating means of using a computer.

J.C.R. Licklider and the time-sharing projects that ARPA subsequently funded set out to change the form of computing and to make it possible for an individual to type his or her own program into a computer and to see the results of the program immediately. This new type of computing was called time-sharing. Relying on the speed of the computer, these computer pioneers were able to set up a series of different terminals whose users were all able to utilize the computer at the same time. As a result of time-sharing systems, multiple users were able to interact directly with a computer simultaneously.

One of the projects funded by J.C.R. Licklider was called the Compatible Time-Sharing System (CTSS). It was part of the project funded at MIT by ARPA which was known as Project MAC.
There were several important surprises that the pioneers of Project MAC reported from their research into time-sharing:

1) They didn't have to rely on professional programmers to do much of the needed programming for their time-sharing system. What they found was that the participants in the project would create programs and tools for their own use and then make them available to others using CTSS.

2) A community of users developed as a result of the ways that people contributed their work to be helpful to each other.

3) CTSS made it possible for users to customize the computing system to their own needs. Thus the general capabilities available provided a way for the individual user to create the diversity of computing applications or programs that this diverse community of users needed.

As a result of this project, the researchers realized that once you could connect a remote terminal to a time-sharing system, you could develop a network with people spread out over large geographical distances. The networks that developed as a result of the research in time-sharing provided working prototypes and also a vision that would help to guide the next stage in the development of networking technology. The effort to improve the throughput of data across telephone lines led to ARPA-supported research in packet switching and the funding of the ARPANET research to use packet switching to link up the computers that were part of ARPA's research program.(2)

The next piece of history that is important to consider is the period during which the early Internet was formed. In 1981-1982 a mailing list was begun on the ARPANET. This mailing list was called the TCP/IP Digest, and the moderator was Mike Muuss, a research computer scientist at the U.S. Army Ballistics Research Laboratory (BRL). The BRL during this period was one of the ARPA sites making the transition from an early ARPANET protocol, NCP, to TCP/IP, which was to be the protocol suite that would make an Internet possible.

By 1983 the cutover from NCP to TCP/IP had occurred, and this made possible a particularly relevant event for the matters under consideration by this committee. That event was the separation of MILNET and the ARPANET into two independent networks to create an Internet. This split would allow MILNET to be devoted to the operational activities of the Department of Defense (DOD). And those on the ARPANET would be able to continue to pursue network research activities. Gateways between the two networks would provide internetworking communication.(3)

This gets us to a definition utilized in 1974 by Louis Pouzin, who had worked on CTSS at MIT and then returned to France to work on creating a packet switching network that was called Cyclades. Pouzin defined an Internet as a network of independent networks. (He called "an aggregate of networks [which would] behave like a single logical network" a CATENET. ARPA adopted his concept as the goal of the research project it was supporting.)(4) Each network could determine for itself what it would do internally, but each recognized the need to accept a minimum agreement so that it would be possible to connect with others who were part of the diverse networks that made up the Internet.
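Pouzin's idea can be suggested with a minimal sketch in Python. The two "networks" below are invented stand-ins, and the tuple passed between them stands for the minimum agreement, a common datagram format, that lets otherwise dissimilar networks interconnect through a gateway:

    # A toy model of a "network of independent networks". Each network
    # handles traffic internally in its own way; the only thing shared
    # is a minimal datagram: (destination host, payload).

    class NetworkA:
        hosts = ["alice"]
        def deliver(self, datagram):
            dst, payload = datagram
            print("NetworkA, using its own internal framing ->", dst, ":", payload)

    class NetworkB:
        hosts = ["bob"]
        def deliver(self, datagram):
            dst, payload = datagram
            print("NetworkB, using a different internal scheme ->", dst, ":", payload)

    class Gateway:
        """Passes datagrams between networks; never touches their internals."""
        def __init__(self, networks):
            self.networks = networks
        def route(self, datagram):
            for net in self.networks:
                if datagram[0] in net.hosts:
                    net.deliver(datagram)
                    return

    gateway = Gateway([NetworkA(), NetworkB()])
    gateway.route(("bob", "hello across two independent networks"))

Each network remains free internally; only the minimal shared format crosses the gateway. That is the sense in which the split MILNET and ARPANET, joined by gateways, formed an Internet.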
RECOMMENDATIONS

I have taken the time to review these two important developments in internetworking history because these two developments are at the foundation of the design of the current Internet as we know it today. These two developments highlight what is so special and particular about the Internet.

The Internet that has grown up and developed is a continuation of the time-sharing interactive communities of users and computers, where users contribute to, and are in effect the architects of, the network that they are part of. This understanding also leads to another significant aspect: this system of human-computer networking partnerships has a regenerative quality. New connections and programs, and databases or mailing lists, are contributed by the users themselves. And thus the Internet grows and spreads and connects an increasingly larger number of computers and users around the world.

The second important aspect is that the Internet architecture and design accommodate the different needs and capabilities of a diverse set of users and user communities. For example, someone in Ghana with an Intel 386- or 486-based computer and a modem can be connected to and send e-mail to someone in a research laboratory in Switzerland which has the most modern computer workstations. That is because the architecture of the Internet requires the least possible equipment and capability to make Internet communication possible. Thus people and computers around the world who are using an extremely diverse set of equipment and computing capability are able to interact and communicate.

I have taken the time to describe these general features of the Internet for a few reasons. The first reason is that this is what is so precious about the Internet, and this is what I believe needs to be understood and protected when considering any change that may be contemplated in how the Internet is controlled, managed or operated. Any change in the minimal requirement that makes communication possible across the independent networks that make up the Internet can render thousands of computers obsolete, cut off many more users around the world, and thereby jeopardize the connectivity and global communication that the Internet has achieved.

Any change in the ability of users to represent themselves, to utilize the Internet for their diverse purposes, and to contribute to what is available to others on the Internet (as long as this does not put demands on others on the Internet) can deprive millions of users of the Internet of the general form that makes it possible for the Internet to serve the communication needs of so many diverse communities of users. This diversity includes the computer scientists at MIT and the high school student in Sydney, Australia.

If there are particular needs of any one group, such as the security needs of DOD, or the ability of users in Tokyo to write with Japanese characters, the architectural design provides that such needs can be accommodated within an individual network or several networks, without imposing such requirements on the users of other networks.

These two principles are important to study and understand because they represent what is being violated by the Framework for Electronic Commerce prepared by Ira Magaziner and his staff. This framework does not treat the Internet as a network of independent networks, but instead as a single network that must be changed to meet the needs of a particular set of users.
Thus, instead of recommending that an independent commercial network or a few commercial networks be created as part of the Internet to meet the special needs of commercial Internet users, Ira Magaziner's framework document requires that the entire Internet be changed to meet the particular needs of a particular set of users. This is a violation of the concept of an Internet.

My recommendation is that the Framework that Mr. Magaziner has created needs to be recast as a Framework for the Internet as a New Means of International Communication. Within that framework Mr. Magaziner can describe the particular needs of particular communities of users, but these particular needs cannot be allowed to replace the generality of the Internet design in a way that imposes on users of other independent networks to satisfy the needs of any particular group of users.

The second important precaution is that users must be protected so that they can continue to represent themselves and their needs. This is what provides for the diversity of what is available on the Internet, and it is the continuation of the culture and regenerative quality of the early time-sharing communities. This is what makes it possible for a user in Benin, for example, to spread the Internet to other users there, and for a student in Finland to start the Linux project that has been developed by thousands of others into an operating system that gives Microsoft competition.

Those who might want a different type of network, as I have heard some large corporate entities in the United States explain they do, because they want to be able to choose more carefully who will do what functions for them, can do so within their own corporate network as part of the larger Internet, but they must not be allowed to impose their special demands on the larger Internet community. The reason is that then users in MILNET, for example, would be required to do things in their network that do not serve their needs, and the concept of an Internet would be violated, leading not to the further growth and extension of the Internet, but back to a single network, one that serves only a few commercial entities at a great loss to the many other users of the Internet.

The other precaution that follows from understanding these essential characteristics of the Internet concerns the commercial entities that want to carry on certain experiments in how to subject various aspects of the Internet to so-called "competition". They must not be allowed to do this in a way that affects the whole Internet, but must be restricted to the particular network that they develop for their commercial purposes. Thus the commercial corporation that is being planned by the U.S. government to sell off parts of the Internet's essential functions must not be allowed to control anything but its own commercenet.

Those who are interested in such experimentation should be advised that they will have to form their own network, which can be connected to the Internet, but that such experiments can only go on inside their own network and cannot be imposed on the rest of the users of the Internet. To do otherwise is to jeopardize the fact that only a minimal requirement is necessary for all to connect to the Internet, and this is only that which makes the communication across the many independent networks that make up the Internet possible. To do otherwise will mean rendering many machines obsolete and cutting their users off from communication with the rest of those on the Internet.
Thus the corporation that IANA and NSI have designed, or that the Boston Group has proposed, must not be allowed to take over the essential functions of the entire Internet. Instead, such corporate activity needs to be restricted to an independent commercial network that can be part of the Internet but cannot be allowed to impose its special requirements on the others who use the Internet. This might mean that the .com machines will become part of a .com network and would be able to communicate with others on the Internet, but not impose their "for sale" and speculative practices on the users in the educational or scientific communities who make up much of the Internet.

Before there are any plans to change the form or structure or management of the Internet, it is crucial that there be an assessment of the special characteristics and functionality that must be preserved, and a plan created for how to be certain that this is done.

Since both the IANA/NSI proposal and the Boston Group proposal are for structures that should be limited to a commercial network, and not imposed on the Internet itself, how then can the essential functions of the Internet be administered in a way that represents the cooperative and international nature of the Internet itself? My proposal provides for a prototype cooperative research program involving researchers in any country or region that agrees to participate. The researchers who will be part of this program are to be responsible for carrying out the investigation and inquiry among online users to determine the general characteristics and functions, so that they can propose a plan to safeguard these crucial characteristics and functions.

There is one final lesson from the history and development of the Internet that is important to consider when trying to determine how to form a more international system for protecting and administering the essential functions of the Internet represented by the Domain Name System, IP numbers, etc. Usenet was begun in the 1979-80 period by graduate students who were part of the Unix community. The invitation to join Usenet which was handed out at the January 1980 USENIX conference explained why it was crucial to develop an online network, not to form committees. It described why it was crucial for those who were interested in developing Usenet to actually use the network, so that they "will know what the real problems are."

It is with this goal in mind that I created the design in my proposal for a prototype where researchers from a diverse set of nations or regions will utilize the Internet to figure out how to create the necessary cooperative, protective forms and processes to administer and support the essential functions of the Internet. Just as adhering to the principle of "using Usenet" made it possible to grow Usenet, so the principle of "using the Internet" will make it possible to scale the Internet, to create a means for a shared international oversight of the essential functions, and to solve the problems that arise along the way.

The Internet is the symbol and manifestation of hope for people around the world. As more and more people communicate on a worldwide basis, the foundation is increasingly set to find peaceful and productive ways to solve the many serious problems that exist in the world today. This vision has its enemies. But the U.S. government has the proud distinction of being the midwife of the achievement of achievements of the 20th Century represented by the development of the Internet.
If there are those in the U.S. government who recognize the importance and respect that come from giving birth to the communications system that has spread around the world with such amazing tenacity and determination, they must find the means to treat the decisions and changes needed to further develop the Internet with the proper care and concern.

---------
Notes:

(1) http://www.columbia.edu/~rh120/other/ifwp_july25.txt

(2) See chapter 6, "Cybernetics, Time-Sharing, Human-Computer Symbiosis and Online Communities", in Netizens: On the History and Impact of Usenet and the Internet, IEEE Computer Society Press, 1997. A draft is available at http://www.columbia.edu/~hauben/netbook

(3) Describing this transition, Vint Cerf wrote: "The basic objective of this project is to establish a model and a set of rules which will allow data networks of varying internal operation to be interconnected, permitting users to access remote resources and to permit inter-computer communication across the connected networks."

(4) Robert Kahn at about the same time introduced the "open architecture" principle. For Pouzin's work see, e.g., Louis Pouzin, "A Proposal for Interconnecting Packet Switching Networks," EUROCOMP Conference, Brunel University, May 1974, p. 1023. (The article was reprinted in The Auerbach Annual 1975 Best Computer Papers, Isaac Auerbach, ed., pp. 105-117.)

--------------------------------------------------------------------

[8] Letter To Representative Tom Bliley

Representative Tom Bliley
Chairman
The House Committee on Commerce
The U.S. House of Representatives
Washington, D.C.
commerce@mail.house.gov

Dear Chairman Bliley,

It was good to see your letters of October 15, 1998 to Ira Magaziner, Senior Advisor to the President for Policy Development, and to William M. Daley, Secretary of Commerce, asking for information regarding the proposed transfer of vital public resources necessary for the functioning of the Internet from the oversight and control of the U.S. government to a newly to-be-created private entity. It is important that there be a serious examination and investigation of this plan by the government.

As I will explain in more detail below, these public resources that the U.S. government is offering to give to a private entity will put great wealth and power in the hands of that private entity and will seriously jeopardize the public character and cooperative nature of the Internet. It is this public character and cooperative nature that are essential for the continued functioning of the Internet, as I explained in my testimony to Congress, submitted to the Committee on Science, subcommittees on basic research and technology, for their hearing held on October 7, 1998. The testimony is a part of the public record and is also available at http://www.columbia.edu/~rh120/other/testimony_107.txt

There are some concerns I feel it is important to indicate to you, and I would appreciate an opportunity to talk with you further about them.

In February 1997, a report was issued by the National Science Foundation Office of the Inspector General. (See "Office of Inspector General Report: The Administration of Internet Addresses," 7 February 1997.) This report contained a number of interesting observations and recommendations that it presented to the National Science Foundation to examine with regard to the important question of the future oversight, control and management (i.e. policy determinations) of the domain name system and the IP numbers, root server system, etc.
Instead of the NSF examining the report and the recommendations made, the agency went ahead with actions to privatize the DNS and related systems, transferring the oversight over key functions of the Internet to the U.S. Department of Commerce. And despite the fact that there have been congressional hearings conducted by the House Committee on Science, subcommittees on basic research and on technology, and by the House Commerce Committee, into the privatizing of the DNS and related systems of the Internet, none of these hearings has mentioned the Office of Inspector General's report or the recommendations and precautions discussed in the report.

Also, in its semi-annual report to Congress, the Office of Inspector General of the NSF made further comments and recommendations. And it said it was referring to the U.S. Department of Justice for examination the problems it had identified of the concentration of power that such privatization would represent. (See Semiannual Report to Congress, Number 16, October 1, 1996 through March 31, 1997, pp. 10-14.) The Report explains:

"NSF responded to our report by stating that 'long term issues raised by [our] recommendations may indeed require additional government oversight.' Nonetheless, NSF decided it would not be appropriate for NSF to continue its oversight of Internet address registration, and it referred our report for consideration by an informal interagency task force chaired by OMB. NSF explained that '[i]n the meantime, next-step solutions are being implemented,' citing the proposals discussed above that would create new, top-level domain name and number address registries. We believe these proposals could result in a concentration of market power and possible anti-competitive behavior. As a result, we are referring these matters to the Antitrust Division of the Department of Justice for analysis and suggested disposition." (p. 15)

I wondered why there hasn't been any apparent consideration by the Executive Branch or the U.S. Congress of the NSF "Inspector General's Report on the Administration of Internet Addresses", which was issued in February 1997. Though the report doesn't solve the problem, it does make a significant contribution toward understanding the problem. It identifies the fact that continued research to meet the needs of the Internet is a responsibility of government. And it describes the public obligation of the U.S. government with regard to ensuring the protection of the public interest in the public resource and public treasure that is the Internet. The report says this in different ways at different places throughout, but at the end it says:

"The current federal oversight of name and number Internet addresses is the natural consequence of federal financial support of Internet development. Continued federal oversight of this unique public resource is required by the nation's increasing dependence on the Internet, which is being fostered by additional federal investments in this technology. NSF's history of involvement with the Internet, its technical expertise, and its continuing investments in related research programs uniquely qualify it to perform that oversight role. NSF's oversight would ensure the protection of the public interest in the resource, the availability of funds to support future network-related basic research, service, and development, fairness to the Internet community, and fairness to the taxpayers."
(from page 16 of "Office of Inspector General Report: The Administration of Internet Addresses," 7 Feb. 1997)

The Report also identifies the significant amount of money that the $50 a year maintenance fee on domain names has given to the U.S. government contractor Network Solutions, Inc. The Report suggests using part of the fee to support continued needed networking research. (I feel there would have to be serious questions raised about whether this is appropriate, but it is important to examine this recommendation.) In any case, this suggestion clarifies that those who administer the Internet also have an obligation to support the kind of research needed to help the Internet to scale.

Also, the Report identifies the potential of charging for IP numbers and the great amount of revenue that this could potentially yield. (This raises for me the question of the enormous power that will be put in the hands of any private entity that is given control over the allocation of IP numbers and domain names.)

The Report also notes that policy issues, which are issues of control, need to be kept in government hands, not given over to private hands. The OIG report discusses that though it might be possible to move administrative functions out of government hands, it must be clear these are not policy functions. The proposed privatization of the DNS and other essential Internet functions is moving policy functions out of the control of government and putting them into unaccountable hands.

The whole result of this is a very dangerous one, both for the public around the world and for the Internet. The reason is that the private entity has neither a public obligation nor the tools or functions to enable it to sift through the opposing interests with regard to policy. The private entity (and I have seen this in all the efforts I have made to be part of the International Forum on the White Paper activity) has no concern for the public interest. The issue is never raised and can't be. There is a reason government has been created and that governments exist around the world. There is a broad interest that is more long range than what an individual corporation is able to consider or act in favor of.

After reading the Inspector General's report, I thought for a few minutes about the fact that over two billion IP numbers have already been allocated and that there are over two billion more. I thought about the tremendous power and wealth that this could represent, as well as the harm that would come to the Internet if this power and control falls into the wrong hands. If the new private entity decides to charge just $50 a year for each IP number, then that gives it a yearly income of 100 billion dollars. If it makes a decision on who can buy IP numbers and who can't, then this limits access to the Internet to those whom this private entity deems should have access.
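The arithmetic behind that figure is simple enough to check in a few lines of Python; the two billion figure and the $50 fee are the assumptions used in this letter, not established prices:

    # Back of the envelope check of the revenue figure cited above.
    ipv4_address_space = 2 ** 32  # IPv4 numbers are 32 bits: about 4.29 billion possible
    allocated = 2 * 10 ** 9       # "over two billion ... already allocated", per the letter
    yearly_fee = 50               # the hypothetical $50 a year per IP number

    print(ipv4_address_space)     # 4294967296
    print(allocated * yearly_fee) # 100000000000, i.e. 100 billion dollars a year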
I have heard that there are those willing to pay to get these resources and that they are upset that the resources are being given away free. Those willing to pay may not have put it in these terms, but they did recognize that this is a case of the U.S. government giving away something that has very great value: private value if it falls into private hands, or social value if it is kept in public hands. So this is the issue that hasn't been discussed, and yet it is a very significant public question. When I was asked to submit questions to Congress by the staffer with the House Committee on Science, Subcommittee on Basic Research, one of the questions I submitted was "By what authority is the U.S. government giving away the cooperative development that is represented by the Internet?" I have read RFCs like RFC 1917, which says the Internet "is the largest public data network in the world." And later on it defines the global Internet as "the mesh of interconnected public networks (autonomous systems) which has its origins in the U.S. National Science Foundation (NSF) backbone, other national networks, and commercial enterprises." So it defines the Internet as "public," *not* private. And yet the U.S. government is claiming it is considering giving to a private entity the essential functions that are at the heart of this global public internetwork of networks. The attempt to transfer vital public resources out of the protection of the public sector into an entity that allows their fundamental nature and purpose to be changed presents a fundamental problem and challenge for those who understand the importance of the worldwide Internet and the advance for society it represents. Even the U.S. Federal District Court, in a case affirmed by the U.S. Supreme Court, recognized the unique and important treasure that the Internet represents for people around the world and directed the U.S. government to protect the autonomy that the Internet makes possible for ordinary people as well as media magnates (ACLU v. Reno). The privatizing of these essential functions makes such protection impossible. When I was at the hearing held by the House Committee on Science, subcommittees on basic research and technology, on October 7, 1998, the head of the steering committee of the International Forum on the White Paper spoke to the subcommittee about her vision of having private corporate entities take over the power and control that government has had. This helped me to understand that the question of governance is being substituted for the question of what is the proper role of government in the administration of important and strategic public resources like the Internet. The OIG Report mentions two ways to protect the public interest with regard to public resources. The first is to keep them under public ownership and control. The second is to follow "procedures for facilitating public participation and open decision making." They recommend that, with regard to this responsibility, the "NSF should disseminate the draft policies and requests for comments broadly, on the Internet as well as via traditional means, and NSF should accept comments via the Internet." (p. 12) They also mention that when the NSFNET was privatized the NSF went through a public process. Unfortunately, they don't recognize how this public process broke down at that time.
(See chapters 11, 12 and 14 of Netizens: On the History and Impact of Usenet and the Internet at http://www.columbia.edu/~hauben/netbook/ ) Once again the public processes are not functioning, as demonstrated by my report on the IFWP's phony consensus process. See http://www.columbia.edu/~rh120/other/ifwp_july25.txt I welcome any thoughts on all this. I recognize that these issues are not easy for those in government, but their momentous importance requires the most skillful and considered measures. Several years ago I met one of the pioneers of time-sharing, Fernando Corbato. I asked him about his early experiences at MIT and Project MAC. He recommended that I read the book "Management and the Computer of the Future," edited by Martin Greenberger. [Later reissued as "Computers and the World of the Future."] The book was about the 1961 conference at MIT on what the future of the computer should be. Many of the pioneers who had created the computer or were working on forefront computer research had gathered to celebrate the centennial of MIT. They invited C.P. Snow from England to speak. (He had recently spoken at Harvard.) His topic was "Scientists and Decision Making." And he spoke about how strategic decisions, especially those concerning computer technology, would be made by government officials. His talk explained why it was crucial that those officials have the needed advice from people who understood the technology and the consequences to society of their decisions. Also he spoke about the need to involve the broadest possible number of people in these decisions. C.P. Snow gave an example from England of strategic decisions that involved too few people and led to harmful social results. (He cited the decision to carry out the strategic bombing of German civilian populations, and he told how that decision prolonged the war rather than shortening it, as intended.) And he spoke about how decisions involving a large number of people had more of a chance of being socially beneficial. The plan of the U.S. government to privatize essential functions of the Internet is the kind of decision that C.P. Snow was warning against. It is good to see that you, as the Chairman of the House Commerce Committee, have now begun an investigation into some aspects of the U.S. government plan to privatize these key and invaluable public resources. It is important that such an investigation examine the concerns of the Report of the Office of Inspector General of the NSF on the planned privatization and conduct a much broader investigation into the public and social consequences and dangers of giving any private entity the power and wealth that such key functions of the Internet provide. In the spirit of citizenship and Netizenship, Ronda Hauben ronda@panix.com P.S. The proposal I have presented to the NTIA is also available at the U.S. Dept. of Commerce NTIA web site. You should be aware that it is *not* a proposal to privatize these key functions, but to create a prototype collaborative network to examine and solve the problems of scaling the Internet and continuing its successful operation. ------------------------------------------------------------------------- [9] E-mail Message from Becky Burr to Ronda Hauben [Editor's Note: In response to the proposal that Ronda Hauben submitted to Ira Magaziner at his request and to the U.S. Department of Commerce, there was a phone call from Becky Burr.
Her only real question about the proposal that had been submitted by Ronda Hauben was what could be inserted into the IANA proposal to take into account some of the concerns raised by Hauben's proposal. When Hauben answered that government had to stay involved and thus she couldn't propose inserting something into a proposal that excluded government involvement, the issue was not discussed any further. Following is the subsequent brief e-mail reply that Ronda Hauben received from Becky Burr as the only real statement of their consideration of her proposal.] Date: Tue, 20 Oct 1998 18:29:09 -0400 From: Becky Burr To: ronda@panix.com Cc: krose@ntia.doc.gov Subject: DNS management Dear Ms. Hauben: Thank you for making your submission in response to the National Telecommunications and Information Administration (NTIA) Statement of Policy entitled Management of Internet Names and Addresses. The public comments received by the Department of Commerce, in response to your submission and others, generally support moving forward with the structure outlined by the Internet Corporation for Assigned Names and Numbers (ICANN). The public submissions and comments received, however, also indicate that significant concerns remain about the substantive and operational aspects of the ICANN. In this light, we have indicated to ICANN the need to resolve a number of specific concerns including accountability (financial and representational), conflicts of interest, transparent decision-making, and country-code top level domains (ccTLDs). We are hopeful that a satisfactory resolution of these issues, leading to the creation of a broader consensus, can be achieved in the near term, in order that we may move forward with the transition process outlined in the White Paper. Although you do not agree with the privatization plan, we understand and share your concerns about preserving the Internet's potential to further scientific and research activities. We appreciate your thoughtful and constructive participation in this process. Sincerely, J. Beckwith Burr Associate Administrator (Acting) ---------------------------------------------------------------------- [10] Letter to William Daley Secretary of Commerce [Editor's Note: Following is the letter that Congressman Tom Bliley, Chairman of the House Committee on Commerce, sent to both Secretary of the Department of Commerce William Daley and Ira Magaziner, then Senior Policy Advisor to President Clinton, on October 15, 1998. Congressman Bliley indicated his committee was beginning an investigation into the secret process by which the U.S. government through IANA had created ICANN. However, there has been no further indication of the progress of this Congressional investigation and no indication of whether the U.S. executive branch did submit the documents that Congressman Bliley requested. Also there was no response by Congressman Bliley to Hauben's letter requesting an investigation of the lack of consideration of her proposal by the U.S. Department of Commerce.] October 15, 1998 The Honorable William M. Daley Secretary of Commerce U.S. Department of Commerce 14th Street at Constitution Avenue, NW Washington, D.C. 20230 Dear Mr. Secretary: I am writing to express my concerns about the role of the Department of Commerce in the transfer of the Internet's Domain Name System (DNS) from the public sector to the private sector. On June 10, 1998, the Subcommittee on Telecommunications, Trade and Consumer Protection held a hearing on the future of the Domain Name System.
Associate Administrator of the National Telecommunications and Information Administration (NTIA) for International Affairs, J. Beckwith Burr, testified on the Administration's recently released policy statement on the future management of the DNS. This policy statement, known as the White Paper, outlines the Administration's proposal to turn over responsibility for the management of the DNS from the government to a newly created non-profit corporation. This new private corporation is intended to provide for competition in domain registration and global participation by all interested parties in the future management of the DNS. I welcomed the White Paper's proposal for the new corporation to be "governed on the basis of a sound and transparent decision-making process, which protects against capture by a self-interested faction." The White Paper reiterated the need for openness when it stated that: "The new corporation's processes should be fair, open and pro-competitive, protecting against capture by a narrow group of stakeholders." At the hearing, I underscored the importance of private sector leadership and the need for stability and continuity in the operation of the Internet during the transfer of DNS management to the private sector. I believed that an open, consensus-based process to develop the new self-governing structure, embodied in the White Paper, was a promising approach. At the meetings of the International Forum for the White Paper (IFWP) over the summer, a broad-based consensus was reached among the participants which echoed the principles of the White Paper. To further the goals of the White Paper, it would seem incumbent upon the Administration to encourage all key Internet stakeholders to participate in an open, consensus-driven governance process, and, in particular, to encourage meaningful participation of one important stakeholder, the Internet Assigned Numbers Authority (IANA). As you know, IANA, a Department of Defense contractor, establishes technical protocols and allocates Internet Protocol (IP) addresses to regional IP numbering authorities, two functions that are critical to the operation of the Internet. I was disappointed to learn that IANA apparently did not meaningfully participate in the IFWP process. Instead of participating in that process, IANA, under the leadership of Dr. Jon Postel, apparently developed its own DNS reform proposal behind closed doors with little consultation from the broader Internet community. The final IANA proposal, which was delivered to the Department of Commerce on October 2, only represented the position of IANA and no other parties. Concurrent with IANA's release of its proposal for the new DNS corporation, known as the Internet Corporation for Assigned Names and Numbers (ICANN), IANA named nine individuals to serve as interim members of the board of directors of ICANN. I am concerned about the lack of openness in the consideration and selection process for ICANN's interim board members. In fact, Dr. Postel's recent written testimony before a House Committee acknowledged that the selection process for members of the interim board of directors of the new corporation to administer the DNS was "undemocratic and closed." Further, I am concerned that the lack of a solid American majority on the interim board fails to reflect the leading role of American business investment and consumer use in the growth of the Internet.
The Commerce Department has provided a comment period of just six business days (which began with the receipt of the proposals late on October 2 and ended on October 13, 1998) for the public to respond to the four proposals submitted to NTIA pursuant to the White Paper's request for proposals to establish a private sector entity. I am concerned that this limited time period is inadequate for all interested parties to provide meaningful comment on these proposals that are crucial to the future of the Internet and electronic commerce. Finally, I have concerns regarding the legal authority upon which the Department has undertaken the process to transfer DNS management from the National Science Foundation (NSF) to a newly created non-profit corporation. As you know, the NSF took the lead in commercialization of the Internet through its operation of the NSFNET and its 1993 cooperative agreement with Network Solutions Incorporated (NSI) to register domain names and manage the root server system. It is my understanding that the NSF/NSI cooperative agreement was transferred to the Department of Commerce in September 1998. I am concerned about the manner in which the process of privatizing the governance of the DNS has apparently unraveled. I was hopeful that the Administration would bring leadership to this important effort. We are at a critical juncture in the efforts to establish a workable governance structure that will guide the future of the Internet and electronic commerce. The success or failure of this current undertaking will have a profound impact on the growth of electronic commerce as well as future Internet governance debates. It is vitally important that this first attempt at self-governance be undertaken in a deliberate, open and fair manner, so that it is not subject to capture by "a narrow group of stakeholders." A loss of credibility in the Internet community at large will seriously undermine the ability of the new corporation to administer the Domain Name System and the stability of the Internet itself. Pursuant to Rules X and XI of the U.S. House of Representatives, I request that you provide the following information to the Committee by November 5, 1998. 1. Please provide the Committee with an explanation, including citations to relevant statutes, of the Administration's authority over management of the Internet. In particular, please explain: (1) the Department of Commerce's authority to assume the NSF cooperative agreement with NSI; and (2) the Department of Commerce's authority to transfer responsibility for the management of the DNS to the private sector. 2. Given IANA's historical role in the operation of the Internet and its role in establishing a new management structure, please describe the Department of Commerce's efforts to encourage IANA's meaningful participation in the IFWP process. Additionally, please describe the Department's knowledge and/or involvement in IANA's decision to submit its own proposal. Please provide all records relating to IANA's participation in the IFWP or IANA's decision to submit a separate proposal. 3. Why is the Department of Commerce's comment period so short? Why did the Department provide just six full business days for the public to analyze the proposals and provide comment? Please explain the Department's regulations and guidance governing public comment periods generally and in relation to the consideration of the four DNS reform proposals together with the relevant regulations and guidance. 4.
Did the Department of Commerce have any involvement in the consideration or selection of ICANN's proposed interim board members? If so, please describe the Department's involvement and list and describe any communications the Department had with the following people or entities regarding the consideration or selection of the proposed interim board members prior to the announcement of the proposed interim board members: (1) IANA or its representatives; (2) the proposed interim board members; (3) representatives of foreign governments, international organizations, or non-governmental organizations; or (4) other individuals and organizations outside the U.S. government. Please provide all records relating to such communications (whether written, electronic or oral). For purposes of responding to this request, the terms "records," "relating," "relate," and "regarding" should be interpreted in accordance with the Attachment to this letter. Should you have any questions regarding this request, please contact me or have your staff contact Mark Paoletta, Chief Counsel for Oversight and Investigations, or Paul Scolese, Professional Staff Member, at (202) 225-2927. The House Commerce Committee intends to monitor the consideration of the draft proposals and the transfer of DNS management to the private sector very closely for the remainder of the 105th Congress and throughout the 106th Congress. As the Administration undertakes this effort, I ask that the Committee be kept informed of and consulted on the process in a timely fashion. Sincerely, Tom Bliley Chairman, House Committee on Commerce ------------------------------------------------------------------------ [11] Letter from Rep. Tom Bliley to Ira Magaziner [Editor's Note: Following is a letter sent by Congressman Tom Bliley to Ira Magaziner, then a senior U.S. policy advisor to President Clinton. Magaziner resigned from his office in November 1998. We are including only those parts of this letter that differ from the letter to the Secretary of Commerce, which appears above.] Dear Mr. Magaziner: I am writing to express my concerns about the Administration's role in the transfer of the Internet's Domain Name System (DNS) from the public sector to the private sector. 1. ... 2. Given IANA's historical role in the operation of the Internet and its role in establishing a new management structure, please describe your efforts to encourage IANA's meaningful participation in the IFWP process. Additionally, please describe your knowledge and/or involvement in IANA's decision to submit its own proposal. Please provide all records relating to IANA's participation in the IFWP or IANA's decision to submit a separate proposal. 3. Did you support the Department of Commerce's decision to limit the public comment period on the DNS proposals to six full business days? Please provide all records relating to the comment period, including but not limited to all records of communications (whether written, electronic or oral) between the Executive Office of the President and the Department of Commerce relating to the comment period. 4. Did you have any involvement in the consideration or selection of ICANN's proposed interim board members?
If so, please describe your involvement and list and describe any communications you had with the following people or entities regarding the consideration or selection of the proposed interim board members prior to the announcement of the proposed interim board members: (1) IANA or its representatives; (2) the proposed interim board members; (3) representatives of foreign governments, international organizations, or non-governmental organizations; or (4) other individuals and organizations outside the U.S. government. Please provide all records relating to such communications (whether written, electronic or oral). ------------------------------------------------------------------- [12] Letter to the NTIA Date: Wed, 7 Oct 1998 14:07:51 -0400 (EDT) From: Luis G de Quesada To: dnspolicy@ntia.doc.gov Subject: Against Privatization of the Internet Dear Sir/Ladies: I am in favor of Ronda Hauben's proposal and against the privatization of the Internet. The Internet belongs to we, the people and privatization would gradually remove us from it, making room in it for just the privileged and the wealthy. Sincerely, Lou De Quesada ---------------------------------------------------------------------- [13] Internet Governance: Herding Cats and Sacred Cows* Version 1.1 By Robert Shaw** robert.shaw@itu.int [Editor's Note: The following article is based on the talk given by Robert Shaw of the ITU at the Internet Society meeting in Geneva in July 1998. Shaw discusses some of the background of the effort to turn the Domain Name System and other essential Internet functions over to the private sector, a frustrating process that has only yielded undesirable ends.] A few days ago, I gave a talk at the ITU to a group of students on a European telecommunications summer school program. The pre-arranged topic of my talk was "Internet governance". Of course, I started my talk by saying that I hadn't the slightest idea what the term "Internet governance" meant. You would think I might. During the last couple of years, I, along with a current committee of around thirteen people, have been involved in what can only be described as a three-ring circus: an attempt to overhaul the administration of the Internet generic top level domains like .com, .net, and .org. When a smaller first committee, the Internet Ad Hoc Committee or IAHC, started this work in 1996, I doubt that any of the IAHC had ever heard of the term Internet governance. In fact, we were very careful to limit the scope of our activity and would have been accused of absurd hubris had we equated this work with the much grander sounding "Internet governance". Someone once said "trying to govern the Internet is like trying to herd cats: it just doesn't work". And as someone else noted, "cats are clearly much smarter than dogs: the proof is that you could never tie eight cats together and get them to pull a sled in one direction". One could argue that what we need is a few dogs pulling in the same direction. But, of course, on the Internet, no one knows if you're a dog. I, along with another rotating group of committee members working on this problem, have experienced enough bizarre characters, self-proclaimed representatives of organizations that are nothing more than a few web pages, and conspiracy theories to last a lifetime. We've been sued, attacked in thousands of e-mails on mailing lists, compared to communists opposed to free enterprise, called lackeys of foreign powers, and accused of being part of a secret plot to move the Internet to Switzerland.
No motive that we could possibly have is too base. No possible accusation has been left unsaid. I've read enough false press reports about our work to forever distrust quasi-real-time web journalism. Indeed, who has time to check sources when you need to publish next hour? We've been accused of selling out to the trademark community and at the same time not doing enough to help protect trademarks in domain names. We've been chastised because we haven't figured out a way to put principles of free speech into domain name administration [personally, I would have thought that the Internet offered plenty of opportunities for free speech without having to embed them in its naming infrastructure]. We've been told that we're progressing too fast and too slow. And, of course, the incumbent administrator of gTLDs operating under a five year contract that should have ended on September 30, 1998 [now extended to September 2000], is, shall we say, not particularly keen on any plan that threatens a monthly multi-million dollar revenue stream or their market capitalization. Basically we're making everyone unhappy, which ironically may mean that we've reached an even compromise between wildly divergent points of view. Unbelievably, it just seems to get worse and worse. When we started our work in 1996, only a few people outside the Internet technical or service community cared about domain names. Now almost every week, there is a new trade association, advocacy group, trademark lawyer, Cyber-libertarian, academic or bored teenager with a 15-dollar-a-month dial-up account who surfaces and decides that they too need to join in and add their two cents to this topic. We're "stakeholders" too, they say. "Our views also need to be represented". The first problem is that each time these new people surface, they suggest the same unworkable solutions that have been discussed to death and long ago put to bed, so a great deal of time and effort is spent rehashing covered ground. The second problem is that with a shift of focus to Internet governance, there are many who, for whatever reason, interpret self-governance as a wonderful opportunity for self-promotion. To those I issue this warning: there is no glory here. It is a thankless job. What some people have forgotten is that the urgency of our original work came from the Internet operational community. When we started, there was a very real danger of the domain name system fragmenting into multiple roots, which most believe would have been a terrible disaster for the Internet. The consequence would be equivalent to dialing the international direct dialing code 41 and being routed to Switzerland one day and Kenya the next. Fortunately, this danger now seems to have somewhat faded. When we prepared our plan, we issued a request for comments and synthesized thousands of ideas into what we thought was the best compromise solution. We thought that the force of good ideas and sound principles would be sufficient to get to the holy grail of consensus and move forward. We issued more requests for comments to tune our work. We attended scores of meetings to meet with people and discuss what they were seeking. We provided almost daily updates of information on our web site so that people could understand what we were doing. We maintained mailing lists of thousands of subscribers. How this debate has progressed into a debate on Internet governance has been a total surprise to me and to the others in the committees working on this.
True, this is a complex subject and touches upon difficult subjects such as the management of international resources, competition policy and domain name/intellectual property disputes. But how and when did we make the leap to the grand sounding Internet governance? Even the U.S. government's recently released "White Paper" on domain name system administration uses the grandiose term "Internet governance". The White Paper "policy statement" is a classic study in ambiguity. As all graduate literature students know, the well-known authority on ambiguity is William Empson, a British literary critic who wrote a very popular book in 1930 called "Seven Types of Ambiguity". He defined ambiguity as "any verbal nuance, however slight, which gives room for alternative reactions to the same piece of language". Much of the White Paper is so ambiguous that the reader has no choice but to invent his or her own meanings. And this allows all parties to believe that their particular views have been endorsed, which may be politically astute; but progress always requires moving from platitudes to specifics, and there is no reason to believe that any more consensus will emerge than in the past. There are hundreds of tough decisions to make that the White Paper punts to the Board of Directors of a new "non-profit" corporation. Today's politically correct mantra is that the private sector should lead. But without details, we're not sure what this says. What does "private sector" mean? Isn't the current administrator of the Internet generic top level domains from the private sector? So what's the problem? The problem is that they, like any company in control of a valuable global resource, will obviously try to maximize profits for their shareholders. Public interest issues, which a civil society normally entrusts to governments to protect, are missing. Harvard Professor Lawrence Lessig argues in his insightful essay "Governance"[1] how infectious and politically correct the idea has become that no government bodies, whether national or international, should have a role to play in regulating cyberspace. Remarking on the U.S. government proposal to create a non-profit U.S. corporation to set global policy for domain names, Lessig notes, "We have lost the idea that ordinary government might work, and so deep is this thought that even the government doesn't consider the idea that government might have a role in governing cyberspace." But isn't this a paradox? That the birthplace of the Internet and the self-professed champion of democracy is promulgating its own disillusionment with the applicability of its own democratic processes to the Internet? Lessig concludes his essay with "In a critical sense, we are not democrats anymore. Cyberspace has shown us this, and it should push us to figure out why". So what are we? Ironically, the principles of democracy are so ingrained in our collective beliefs that we're convinced that this is the best way to govern cyberspace. Every day we read calls for a new widespread net democracy with voting by stakeholders (whoever they are). But is this really what we want? Why is it that one of the most successful paradigms of the post-industrial age, the Internet Engineering Task Force, avoids voting like the plague? And wasn't the Communications Decency Act passed virtually unanimously in the U.S. Congress, while Netizens everywhere rejoiced when it was overturned by the Supreme Court? Do we really want direct democracy for Internet governance?
And if we do, in a world of private sector rule, where are the checks and balances that modern democracies have? You may have noticed that I have become a profound cynic about private-sector self-governance. Two years ago this wasn't true, but after watching the self-interest of the private sector during the last two years, I've changed my mind. This is not reflective of some dark desire to regulate the Internet; it is just a recognition of the reality of commercial forces. I'm reminded of the great liberal philosopher Adam Smith, who, more than two hundred years ago, said that public monopolies are terrible. They are slow, bureaucratic, inefficient and so on. But he also added that private monopolies are all of this and, in addition, greedy. The bottom line is that the success of the Internet is a Pyrrhic victory. It has now become far too successful to be treated any differently from the rest of society and the economy. The price of success is all the baggage and political correctness that the Internet engineering community has hated for so many years. The fact that the debates have now turned to Internet governance instead of the relatively arcane topic of domain name administration says a lot: our focus has changed to making sure that all the sacred cows are stroked and feel that their views are part of the process, even if we get to exactly the same results. While this may eventually lead to progress, it will most certainly be a slow, bureaucratic, and inefficient progress, and one that bears very little resemblance to what made the Internet what it is today. ----- [1] http://cyber.harvard.edu/works/lessig/Ny_q_d1.pdf ----------- * Based on a talk given at INET 98, Geneva, Switzerland, July 22, 1998. ** Advisor, Global Information Infrastructure, International Telecommunication Union, Geneva, Switzerland. The views expressed in this paper are those of the author and do not necessarily reflect the views of the ITU or its membership. ----------------------------------------------------------------------- [14] DNS: A Short History and a Short Future by Ted Byfield tbyfield@panix.com [Editor's Note: In the following article, Ted Byfield examines the problem of domain naming in terms of the lessons from the experience of the telephone. His article presents the kind of broader perspective that needs to be considered in trying to solve the problems raised by the domain name system in the past few years.] [Author's Note: This essay was first published on Rewired during the week of 28 Sept 1998 under the title "A Higher Level of Abstraction"; I've slightly amended it for redistribution on nettime. Thanks to David Hudson for his excellent edit. TB] In the debates that have erupted over domain name system (DNS) policy, two main proposals have come to the fore: a conservative option to add a handful of new generic top-level domains (gTLDs: ".nom" for names, ".firm" for firms, etc.) administered by a minimal number of registrars, and a more radical proposal to level the hierarchical structure of domain names altogether by permitting openly constructed names ("whatever.i.want") administered by an open number of registrars. The supposed causes of these debates orbit around perceived limitations of the system: monopolization of registration by NSI (in the U.S., of course) and a scarcity of available names; as such, the debates gravitate toward modernizing the system and preparing it for the future.
What little attention has been paid to the past has focused on the immediate past, namely, the institutional origins of the present situation. Little or no attention has been paid to the prehistory of the basic problem at hand: how we map the "humanized" names of DNS to the "machinic" numbers of the underlying IP address system. In fact, this isn't the first time that questions about how telecom infrastructures should handle text-to-number mappings have arisen. And it won't be the last time, either; on the contrary, the current debates are just a phase in a pas de deux between engineers and marketers that has spanned most of this century. A bit of history: From the 1920s through the mid 1950s, the U.S. telephone system relied on local exchange telephone numbers of between two and five digits. As these exchanges were interconnected locally, they came to be differentiated by an "exchange name" based on their location. These names, two-letter location designations, made use of the lettering on telephone keypads: thus an 86x-exchange, for example, might be "TOwnsend," "UNion," "UNiversity," or "VOlunteer." Phone numbers such as "UNion 567" were the norm; "86567" -- the same thing -- would have seemed confusing, in much the same way that foreign dialing conventions can be. There wasn't a precedent for a purely numerical public addressing system, and, with perfectly good name-and-number models like street addresses in use for centuries, no one saw any reason to invent one. However, as exchanges became interconnected across the nation, AT&T/Bell found a number of problems -- among them, that switchboard operators sometimes had difficulty with accents and peculiar local names. As a result, the national carriers began to recommend standardized exchange names, according to a curious combination of specific and generic criteria: they chose words that resisted regional inflection but were common enough to peg to "local" landmarks. The numbers 5, 7, and 9 were reserved because the keys have no vowels, making it (so the theory goes) more difficult to form words from them; hence artifacts like the fictional prefix 555, so common in old movies, which later became a national standard prefix in fact, in the form of directory assistance. By the late 1950s, when direct long-distance dialing became possible, then popular, variable-length phone numbers became a problem for the national carriers, which demanded yet more standardization: seven-digit phone numbers in a "two-letter five-number" (2L5N) format. And while it wasn't an immediate problem, the prospect of international telephonic integration -- with countries that used different letter-to-number schemes or even none at all -- drove yet another push for standardization, this time for an "all-number calling" (ANC) system. Amazingly, the transition to ANC in the U.S. took almost thirty years, up to around 1980 depending on the region. (Just as certain telecom-under-served areas are now installing pure digital infrastructures while heavily developed urban areas face complex digital-analog integration problems, phone-saturated urban areas such as New York were among the last to complete the conversion to ANC.) Direct long-distance dialing wasn't merely a way for friends and family to keep in touch: it allowed businesses to deal in "real time" with distant markets. And the convention of spelling out numbers, only partially suppressed, hence fresh in the minds of the many, became an opportunity.
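The keypad mapping the exchange names relied on is easy to state in code. A minimal sketch, assuming the classic U.S. lettering (2 = ABC through 9 = WXY, with Q and Z absent) and the convention, reflected in the capitalization above, that only the first two letters of an exchange name were actually dialed:

     # Illustrative sketch, not from the essay: only the capitalized
     # letters of an exchange name are dialed; the rest of the word
     # is a memory aid.
     KEYPAD = {'2': 'ABC', '3': 'DEF', '4': 'GHI', '5': 'JKL',
               '6': 'MNO', '7': 'PRS', '8': 'TUV', '9': 'WXY'}
     LETTER_TO_DIGIT = {l: d for d, letters in KEYPAD.items() for l in letters}

     def dial(number):
         return ''.join(LETTER_TO_DIGIT[c] if c.isupper() else c
                        for c in number if c.isupper() or c.isdigit())

     print(dial('UNion 567'))      # 86567
     print(dial('TOwnsend 567'))   # 86567 -- the same 86x exchange
     print(dial('VOlunteer 567'))  # 86567

The mapping runs one way, from letters to digits, which is exactly what the marketing described next would exploit: any memorable word over the right keys reaches the same number.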
Businesses began to play with the physical legacy of lettered keypads and with cultural habits by using number-to-letter conversions as a marketing tool -- by advertising mnemonic phone numbers such as "TOOLBOX." And as long-distance calls became a more normal way for people to communicate, tolls began to fall, in a vicious -- or virtuous, if you prefer -- circle, thereby lowering the cost of transactions for businesses and spurring their interest in broader markets. However, direct long-distance dialing presented a new problem, namely the cost of long-distance calls, which became the next marketing issue -- and toll-free direct long-distance dialing was introduced. The marketing game replayed itself, first for the 800-exchange (and again more recently for the 888-exchange). As these number spaces became saturated with mnemonic name-numbers, businesses began to promote spelled-out phone numbers that were *longer* than the functional seven digits (1-800-MATTRESS) -- because the excess digits had no effect. The game has played itself out in other ways and at other levels -- for example, when PBX system manufacturers adopted keypad lettering as an interface for interactive directories which use the first two or three "letters" of an employee's name. Obviously, this capsule history isn't a literal allegory for the way DNS has developed -- that's not the point at all. There are "parallels," if you like: questions of localized and systematic naming conventions, of national/international integration, of arbitrarily reserved "spaces," of integrating new telecom systems with installed infrastructures, of technical standards coopted by marketing techniques, and so on. But implicit in the idea of a "parallel" is the assumption that the periods in question are separate or distinct; instead, one could -- and should, I think -- see them as *continuous* or cumulative phases in an evolving effort to define viable standards for the interfaces between "machinic" numerical addressing systems and human linguistic systems. Either way, though, DNS -- like the previous efforts -- won't be the last, regardless of how it is or isn't modified in the next few years. This isn't to dismiss the current DNS policy debates. On the contrary, they bear on very basic questions that should be addressed *precisely because their implications aren't clear* -- questions about national/international jurisdiction and cooperation, centralized and distributed authorities, the (il)legitimacy of de facto monopolies, and so on. Ultimately, though, these questions are endemic to distributed-network communications and are *not* unique to DNS issues. What *is* unique to DNS isn't any peculiar quality but, rather, its historical position as the first "universal" addressing system -- that is, a naming convention called upon (by conflicting interests) to integrate not just geographical references at every scale (from the nation to the apartment building) but also commercial language of every type (company names, trademarks, jingles, acronyms, services, commodities), proper names (groups, individuals), historical references (famous battles, movements, books, songs), hobbies and interests, categories and standards (concepts, specifications, proposals) ... the list goes on and on. The present DNS debates center mostly around the question of whether and how DNS should be adapted to the ways we handle language in these other spheres, in particular, "intellectual property."
Given the sorry state of that field -- which is dominated by massive industrial pushes to extend proprietary claims indefinitely, to criminalize infractions against those claims, and to weaken "consumer" protections by transforming commodity purchases into revocable and heavily qualified use-licenses -- it's fair to ask whether it's wise to conform such an allegedly important system as DNS to that morass. What's remarkable is how quickly this has evolved, from a system almost fanatically insistent on shared resources and collaborative ethics to a speculative, exclusionary free-for-all. A little more history: With the erratic transformation of the "acceptable use policies" (AUPs) of the various institutional and backbone supporters of the Internet in the first half of this decade, commercial use of the net expanded from a strictly limited regime (for example, NSFNET's June 1992 "general principle" allowed "research arms of for-profit firms when engaged in open scholarly communication and research") to an almost-anything-goes policy left to private Internet providers to articulate and enforce (along with questions of spam, Usenet forgeries, and so on and so forth). The result was that any entity that couldn't establish educational, governmental, or military credentials was categorized as "commercial" by default. The ".com" gTLD quickly became the dumping ground for just about everything: not just business names and acronyms, but product and service names (tide.com, help.com), people's names (lindatripp.com), ideas and categories (rationality.com, diarrhea.com), parodies and jokes (whitehouse.com, tragic.com), and everything else (iloveyou.com, godhatesfags.com). (This essay omits discussion of the more nebulous ".net" and ".org" gTLDs -- which are vaguely defined and became popular only after the domain-name debates -- as well as of state [".ny"] and national [".uk", ".jp"] domains.) Thus, the "commercialization" of the net took place on two levels: in the legendary rush of business to exploit the net, obviously, but also in the administrative bias against noninstitutional use of the net. There were practical reasons for that trend, to be sure: individual or "retail" access was initiated by commercial Internet providers, which doled out many more dial-up user accounts than domains; there were also technical issues, ranging from telecom pricing schedules to software for consumer-level computers, that discouraged the casual use of domains. But the trend also had an ideological aspect: the entities that governed DNS preferred the status quo to basic reforms -- and, in doing so, relegated the net's fast diversification to a single gTLD that became less coherent even as it became the predominant force. One can't fault the administrators for failing to foresee the explosion of the net, and their responses are, if not justified, at least understandable. DNS was built around the structurally conservative assumptions of a particular social stratum: government agencies, the military, universities, and their hybrid organizations -- in other words, hierarchical institutions subject to little or no competition. These assumptions were built into DNS in theory, and they guide domain-name policy in practice to this day -- even though the commercialization of the net has turned many if not most of these assumptions upside down.
Not only are the newer "commercial" players prolific by nature, but most of their basic assumptions and methods are very much at odds with the idealized cooperative norms that supposedly marked governmental and educational institutions: they come and go like mayflies, they operate under the assumption that they'll be besieged by competitors at any moment, they thrive on imitation, and they succeed (or at least try) by abstracting everything and laying exclusionary claim to everything abstract -- procedures, mechanisms, names, ideas, and so on. The various systems and fields we call "the market" worked this way before the net came along; small wonder that they should work this way when presented with a "new world." If no one anticipated the speed with which business would take to this new medium, even less could anyone have predicted how it would exploit and overturn the parsimonious principles that dominated the net. Newer domain users quickly broke with the convention of subdividing a single domain into descriptively named sub- and sub-sub-domains that mirrored their institution's structure (e.g., function.dept.school.edu). Instead, commercial players started to strip-mine name space with the same comical insistence that led them to label every incremental change to a commodity "revolutionary." The efficient logic of multiple users within one domain was replaced with a speculative logic in which a few users became the masters of as many domains as they could see spending the money to register. In some cases, these were companies trying to extort attention and money out of "consumers" (business's preferred name for "person"); in other cases, they were "domain-name prospectors" hoping to extort money out of business; in many more cases, though, they were simply "early adopters" experimenting with the fringes of a new field. In effect, the potentially complex topology of a multilevel name space was reduced mostly through myopic greed and distorted rhetoric to a flatland as superficial as the printed pages and TV screens through which the business world surveys its prey. The minds that collectively composed "mindshare," it was assumed, couldn't possibly grok something as complicated as a host name. So, for example, when Procter and Gamble decided to apply "brand management" advertising theories to the net, it registered diarrhea.com rather than simply incorporating diarrhea.pg.com into its network addressing. And so did the ubiquitous competition, including the prospectors who set about registering every commercial domain they could cook up. The follies of this failed logic are everywhere evident on the net: thousands of default "under-construction" pages for domain names whose "owners" -- renters hoping to become rentiers -- wait in vain for someone to buy their swampland: graveyard.com, casual.com, newsbrief.com, cathedral.com, lipgloss.com, and so on, and so on. Under the circumstances -- that is, thousands of registered domain names waiting to be bought out -- claims that existing gTLD policies have resulted in a scarcity of domain names are doubtful. In fact, within the ".com" gTLD alone, the number of domain names registered to date is a barely expressible fraction of the possible domain names, such as "6gj-ud8kl.com": ~2.99e+34 possible domain names *within ".com" alone*, or ~4.99e24 domains for every person on the planet (see the sketch below); if these were used efficiently -- that is, elaborated with subdomains and hostnames such as "6b3-udh.6gj-ud8kl.com" -- the number becomes effectively infinite.
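The essay does not show the arithmetic behind these figures, but one set of assumptions reproduces them almost exactly: names 22 characters long, drawn from the 37 symbols legal in a DNS label (26 letters, 10 digits, and the hyphen), with the standard rule that a label may not begin or end with a hyphen. A back-of-envelope sketch under those assumptions:

     # Hypothetical reconstruction of the ~2.99e+34 figure; the
     # 22-character length is an assumption, not stated in the essay.
     EDGE = 36                        # letters + digits at the ends of a label
     INTERIOR = 37                    # letters + digits + hyphen elsewhere
     LENGTH = 22                      # assumed name length

     names = EDGE * INTERIOR ** (LENGTH - 2) * EDGE
     print('%.2e' % names)            # 3.00e+34, i.e. the essay's ~2.99e+34
     print('%.2e' % (names / 6.0e9))  # 4.99e+24 per person, assuming a world
                                      # population of six billion

Whatever the exact assumptions, the order of magnitude is the point: the name space itself is nowhere near exhausted.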
Obviously, then, the "scarcity" of domain names is *not* a function of domain name architecture *or* administration at all. It stems, rather, from the commercial desire to match domain names with names used in everyday life -- in particular, names used for marketing purposes. To be sure, "6gj-ud8kl.com" isn't an especially convenient domain name; but, then again, was "UNion 567" or "+1-212-674-9850" a convenient phone number, "187 Lafayette St #5B New York NY 10013" a convenient address, or "280-74-513x" a convenient Social Security number? But if DNS is in fact such an important issue, does it really make sense to articulate its logic according to the "needs" of marketers? After all, business has managed to survive the tragic hardship of arbitrary telephone numbers for decades and arbitrary street addresses for centuries. Surely, if the net really will revolutionize commerce, to the point of "threatening the nation-state," as some like to claim, the inconvenience of arbitrary domain names will hardly stop the revolution. *Of course* there are territorial squabbles over claims to names and phrases. And *of course* some people and organizations profit from the situation. But we don't generally erect a stadium in areas where gang fights break out, so one really has to ask whether it's a good idea to restructure gTLD architecture -- supposedly the system that will determine the future of the net, hence a great deal of human communication -- to cater to a kind of business dispute that's in no way limited to DNS. Ultimately, it doesn't really matter which proposed gTLD policy reform prevails, because the gains will be mostly symbolic, not practical -- except, of course, for the would-be registrars, for whom these new territories could be quite profitable. At minimum, adding new gTLDs such as ".firm", ".nom", and ".stor" will bring about a few openings -- and, more to the point, a new round of territorial expansions, complete with redundant registrations, intellectual-property lawsuits, etc. At maximum, an open domain-name space that allows domains such as "what-ever.i.want" will precipitate a domain-grabbing free-for-all that will make navigating domains as unpredictable as navigating file structures. Moreover -- and *much* worse -- where commercial litigation is now limited to registered domain names, an open namespace would invite attacks on the use of terms *anywhere* in an address. Put simply: where apple.material.net and sun.material.net are now invulnerable to litigation, in an open namespace Apple Computer and Sun Microsystems could easily challenge "you.are.the.apple.of.my.eye" and "who.loves.the.sun". Neither proposed reform *necessarily* serves anything resembling a common good. But both proposed reforms will provide businesses with more grist for their intellectual property mills and provide users with the benefits of, basically, vanity license plates. The net result will be one more step in the gradual conversion of language -- a common resource by definition -- into a condominium colonized by businesses driven by dreams of renting, leasing, and licensing it to "users." It doesn't, however, follow that the status quo makes sense -- it doesn't. It's rife with conceptual flaws and plagued by practical issues affecting almost every aspect of DNS governance -- in particular, who is qualified to do it, how their operations can be distributed, and how democratized jurisdictions can be integrated without drifting into or being absorbed by the swelling ranks of global bureaucracies.
The present administration's caution in approaching gTLD policy is an instinctive argument made by people happy to exploit, however informally, the *superabundance* of domain-name registrations. Without doubt, the main instabilities that any moderate gTLD policy reform would introduce would be felt in the administrative institutions' funding patterns and revenues. More radical reforms involving more registrars would presumably have more radical consequences, among them, a need to certify registrars and DNS records, from which organizations with strong links to security and intelligence agencies (Network Associates, VeriSign, and SAIC) will surely benefit. The current administration insists that an open name space would introduce dangerous instabilities into the operations of the net. But whether those effects would be more extreme than the cumulative impact of everyday problems -- wayward backhoes, network instabilities, lazy "Netiquette" enforcement, and human error -- is doubtful. There is one point on which the status quo *and* its critics agree: the assumption that DNS will remain a fundamental navigational interface of the net. But it need not and will not: already, with organizations (ml.org, pobox.com), proprietary protocols (Hotline), client and proxy-server networks (distributed.net), and search-engine portal advances (RealNames, bounce.to), we're beginning to see the first signs of name-based navigational systems that complement or circumvent domain names. And they're doing it in ways that address not the bogeys that appear in the nightmares of rapacious businessmen but the real problems and possibilities that many, many more users are beginning to face: maintaining stable e-mail addresses in unstable access markets, maintaining recognizable zine-like servers in the changing conditions of dynamic IP subnets, cooperating under unpredictable load conditions, and, of course, *finding* relevant info -- not *offering* it, from a business perspective, but *finding* it from a user's perspective. DNS, as noted, was built around the assumptions of a specific social stratum. Prior to the commercialization of the net, most users were if not computer professionals then at least technically proficient; and the materials they produced were by and large stored in logical places which were systematically organized and maintained. In short, the net was a small and elite town, of sorts, whose denizens -- "Netizens" -- were at least passingly familiar with the principles and practices of functional design. In that context, just as multiple users on a single host was a sensible norm, so were notions of standardized file structures, naming conventions, procedures and formats, and so on. But just as the model of multiple users on a single host has become less certain, so has the rest. The net has become a non-systematic distributed repository used by more and more technically incompetent users for whom wider bandwidth is the solution to dysfunctional design and proliferating competitive formats and standards. Finding salient "information" (the very idea of which has changed as dramatically as anything else) has become a completely different process than it once was. This turn of events should come as no surprise. As commercial domains multiplied, and as users multiplied on these domains, the quantities of material their efforts and interactions produced grew ferociously -- but with none of the clarity typical of the "old" institutional net.
In the past, the information generated around or available through a domain (or to the subdomains and hostnames assigned to a department in a university or military contractor) was often "coherent" or interrelated. But that can't be said of the material proliferating in the net's fastest-growing segments: commercial Internet access providers, institutions that automatically assign Internet access to everyone, diversified companies, and any other domain-holding entities that permit discretionary traffic. Instead, what one finds within these domains is mostly random both in orientation and in scale: family snapshots side by side with meticulously maintained databases, amateur erotic writings next to source-code repositories, hypertext archives from chatty mailing lists beside methodical treatises, and so on. In such an environment, a domain name functions more and more as an arbitrary marker, less and less as a meaningful or descriptive rubric. This isn't to say that domain names will somehow "go away"; on the contrary, it's hard to imagine how the net could continue to function without this essential service. But the fact that it will persist doesn't mean that it will serve as a primary interface for navigating networked resources; after all, other aspects of network addressing have become all but invisible to most users (IP addresses and port numbers, to name the most obvious). The benefit that DNS offers is its "higher level of abstraction": a stable addressing layer that permits more reliable communications across networks where IP numbers change and heterogeneous hardware/software configurations are the norm. But "higher" is a relative term: as the substance of the net changes, as what's communicated is transformed both in kind and in degree, and as the technical proficiency of its users drops while their number explodes, DNS's level of abstraction is sinking relative to its surroundings. --------------------------------------------------------------------- [15] ARPANET Mailing List and Usenet Newsgroups Creating an Open and Scientific Process for Technology Development and Diffusion by Ronda Hauben ronda@panix.com [Editor's Note: Following is the first installment of a longer article about the importance of the MsgGroup mailing list and the kinds of lessons it can provide toward determining how to solve the problems of scaling the Internet.] Introduction In an article in the journal "The Information Society", Luciano Floridi of Wolfson College, Oxford, notes the importance of the Internet and how it has generated an excitement and promise for the future. Floridi writes: [L]ast year the Internet finally appeared to the general public as the most revolutionary phenomenon since the invention of telephones, though in this case Time missed the opportunity to elect the Internet 'Man of the Year.'(1) Floridi contrasts the significance of the new development represented by the Internet with the relative lack of scholarly study and knowledge about its development: A whole population of several million people interacts by means of the global network. It is the most educated intellectual community that ever appeared on earth, [a] global academy that, like a unique Leibnizian mind, thinks always. The Internet is a completely new world, about which we seem to know very little . . . [I]ts appearance has found most of us, and especially the intellectual community, thoroughly unprepared.
However, to "know" something it is helpful to look at its early development, as that is when its form and principles are most clearly articulated. The foundation for the Internet was set by the development of the ARPANET (b. 1969) and Usenet (b. 1979), which were connected to each other in the early 1980s. This paper will examine some of the early research in computer conferencing, first on the ARPANET and then on Usenet, which worked to link those on different computers or using different operating systems. It will explore how the foundation was laid to promote computer-facilitated communication, some of the scientific and collaborative work which made the Internet possible. There will be an effort to quote the early pioneers where possible, to give an indication of the process as well as the result of their work.

Part I
Support for a Scientific Methodology

Writing in the 1960s, the German philosopher Jurgen Habermas described a scientific methodology developed by the U.S. Air Force to solve difficult technological problems. He outlines the process of communication established between the contractors who would work on a problem and the Air Force personnel involved. They placed importance on communication to identify the precise nature of the problem, and then on the combining of practice and theory to develop a methodology to solve it.(2) A similar kind of collaborative communication process was developed via the early mailing list MsgGroup on the ARPANET, and this process helped make it possible to develop and expand the ARPANET into the Internet.

ARPA and the ARPANET

When the Soviet Union launched Sputnik I, the world's first artificial satellite, on October 4, 1957, it took the world by surprise. In the U.S., President Eisenhower summoned scientists to provide advice to the White House on how to advance U.S. science and technical development. Believing that the competition within the U.S. Department of Defense (DOD) was a problem that had to be solved if the U.S. was to advance in its ability to do forefront scientific and technological development, Secretary of Defense Neil McElroy created a new agency, apart from the three existing branches of the services. This new agency, the Advanced Research Projects Agency (ARPA), was to provide support for advanced space research.

By the early 1960s, ARPA recognized the need to expand its scope, and J.C.R. Licklider was brought in to head a new office that would take on research in computer science. Licklider served as the first head of the Information Processing Techniques Office (IPTO) at ARPA, from 1962 to 1964. The earliest work of the IPTO was to fund research in the time-sharing of computers, to make interactive computing available in a way not possible with the batch-operated computers common at the time.(3)

By the late 1960s, however, time-sharing had developed and there were different computer time-sharing systems around the U.S. Those at ARPA began to envision linking up these different systems so that resources could be shared, and so that those using different computer hardware and software would be able to communicate with each other.(4) Also, the work of pioneers like Paul Baran at RAND in the U.S. and Donald W. Davies in the United Kingdom indicated that a more economical form of data transmission, i.e. packet switching, would provide an appropriate technology for such a network.
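The economy of packet switching comes from breaking each message into small, individually numbered packets that can share lines with other traffic, travel separate routes, and arrive out of order, to be reassembled at the destination. A toy sketch of that core idea in Python -- the principle only, not a representation of Baran's or Davies' actual designs:

    import random

    def packetize(message, size=8):
        # Break the message into small, individually numbered packets.
        return [(seq, message[i:i + size])
                for seq, i in enumerate(range(0, len(message), size))]

    def reassemble(packets):
        # The receiver restores the original order by sequence number.
        return "".join(data for _, data in sorted(packets))

    message = "a more economical form of data transmission"
    packets = packetize(message)
    random.shuffle(packets)   # packets may take different routes
                              # and arrive in any order
    assert reassemble(packets) == message

Because no end-to-end circuit is held open for the duration of an exchange, lines are occupied only while packets are actually in flight, which is what made the approach so much more economical for bursty computer traffic.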
Recognizing the need to do research in creating a computer data network that would make it possible to share resources among researchers doing work on different hardware and software platforms, ARPA awarded a contract to BBN to begin the construction of a subnetwork that would connect various ARPA contractors at universities and other sites with ARPA contracts. The new network became known as the ARPANET. The number of those connected to the ARPANET grew rapidly, and by the mid 1970s there was the recognition that a new form of communication had developed on the ARPANET, called electronic mail or, more commonly, e-mail.

MsgGroup Begins

In a message submitted to the MsgGroup mailing list dated June 7, 1975, Steve Walker of ARPA (IPTO), and Net Manager of the ARPANET,(5) describes a proposal for communication research on the early ARPANET. He writes that he is "seeking to establish a group of people concerned with message processing" in order to "develop a sense of what is mandatory, what is nice, and what is not desirable." He notes, "We have a lot of experience with lots of services and should be able to collect our thoughts on the matter."

The methodology he proposes, however, is of particular importance. He is encouraging the creation of a new form of computer conferencing to be developed on the early ARPANET. "My goal," he writes, "at present is not to establish 'another committee' but to see if dialogue can develop over the net." He notes that there is probably something less formal already occurring, but he wants to broaden it to include more of those who could make a contribution.

Participation will be encouraged, but it is voluntary. "I do not wish to force anyone to participate," he explains, "but I strongly urge anyone with comments (positive or negative) to toss them in." Also, the form of participation was to be open ended, rather than requiring particular kinds of contribution. "While supporting philosophical discussions," he writes, "I like very much the specifics of evaluation. Can we try to do this," he asks, promising that "the results may surprise many of us." He requests that the participants "encourage a FORUM-type set up if it's not too difficult to set up, realizing that many (myself included) will have little time to contribute." Though he recognizes that such sporadic participation may be thought to fragment the group, he proposes that even occasional comments should be made, and that they will prove to be a contribution.

"I've asked Dave Farber to maintain a list of Message Group participants," he continues, noting that Dave Farber, then on the faculty at the University of California Irvine and a participant on the ARPANET, would help facilitate participation in the online forum Walker was proposing. Extending his invitation to newcomers to be full participants without feeling they have to gather any particular background, he explains, "those who don't wish to have their message files filled with possible 'junk mail' should feel free to withdraw." But he expresses the hope that it will be possible "from all this to develop a long term strategy for where message services should go on the ARPANET and indeed in the DOD." And Walker ends his message by encouraging participation: "Let's have at it."
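Mechanically, the arrangement Walker describes -- one group address whose messages are re-sent to every name on a maintained participant list, with a file of the correspondence kept for anyone who misses a message -- is what would later be called a mailing-list reflector. A minimal modern sketch of that idea in Python, assuming a mail relay on localhost; the addresses are hypothetical, and this is of course an illustration, not the historical software:

    import smtplib
    from email.message import EmailMessage

    # A hypothetical participant list of the kind Farber maintained.
    PARTICIPANTS = ["farber@example.edu", "walker@example.mil"]
    ARCHIVE = "msggroup-archive.txt"

    def reflect(subject, sender, body):
        msg = EmailMessage()
        msg["Subject"] = subject
        msg["From"] = sender
        msg["To"] = "msggroup@example.org"   # the single group address
        msg.set_content(body)
        # Keep a file of the correspondence for those who miss messages.
        with open(ARCHIVE, "a") as archive:
            archive.write(msg.as_string() + "\n")
        # Re-send one copy of the message to each participant on the list.
        with smtplib.SMTP("localhost") as relay:
            relay.send_message(msg, from_addr=sender,
                               to_addrs=PARTICIPANTS)

    reflect("message services", "walker@example.mil", "Let's have at it.")

The simplicity is the point: because the reflector is just ordinary mail delivered to a list, it required nothing of participants beyond the mail programs they already used, which is precisely the argument Crocker makes below.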
The mid 1970s was a period of change in developing the usefulness of computer mail on the ARPANET. Prior to 1975, the creation of programs making e-mail possible on the ARPANET was more of an informal undertaking, according to a study of ARPANET e-mail posted to MsgGroup by Raymond R. Panko.(6) Panko notes that the earliest work in developing e-mail capabilities grew up on the earliest time-sharing systems funded by ARPA in the early 1960s. "But the value of computer mail had become obvious to ARPA by the beginning of 1975." He writes how ARPA, like a number of other organizations, had begun to use computer mail for its bread-and-butter communication and had become aware that a relatively mature communication medium was becoming available. It was against this background of increasing interest by ARPA in e-mail that Steve Walker issued the invitation to take part in an online conference to develop a computer conferencing system.

Farber responded to Walker's invitation: "I too second the motion of Steve to Let's have at it."(7) Farber promised to maintain a file of correspondence for those who participate, in case they miss any of the messages or do "not feel like making like a file clerk." Those involved agreed to accept the challenge of exploring how to create a network conferencing system using ARPANET communication.

In considering the difficulties of using such technology during this period in the mid 1970s, David Crocker, at the University of Southern California, presented his evaluation of three possible programs that those on MsgGroup could use to form their online conference. One of the programs was FORUM, a conferencing system developed under DOD funding. Crocker explains that this conferencing system "has a long start-up curve and requires that all participants have access to the same machine."(7) Another proposed conferencing program, TCTalk, Crocker notes, "requires that all have operating access to the operating system Tenex," which was one of the operating systems used by some of those on the ARPANET.(8) Since those on the ARPANET were using a variety of different computers and several different operating systems, Crocker believed that neither a program dependent upon a single type of computer nor one requiring a particular operating system would be appropriate. Instead, he explained that Net Mail, the program already being used to send e-mail on the ARPANET, made communication between users with diverse computer systems and operating systems possible.

Crocker also noted some of the other advantages of Net Mail. He wrote(9):

Use of Net Mail
a) is extremely convenient for most, if not all, of us, since we already exercise it for other activities;
b) allows passive observation of the dialogue, rather than forcing everyone to explicitly catch up on recent comments . . . ;
c) mail is easily deleted and so "junk" mail is not really a serious problem. Most, if not all of us, have mail reading systems which allow a "menu" review of mail, prior to reading the contents.

Proposing that Net Mail will best satisfy the aims of the research, he writes: "I have spent the better part of this spring looking at our teleconferencing capabilities (as part of a seminar) and as a result, suggest we continue to use Network mail as our communication tool, rather than using TCTALK or FORUM."

Listing the participants in MsgGroup at this early period and the sites where they have their computer accounts,(10) Farber identifies Burchfield, Myer and Gilbert from Bolt Beranek and Newman, the Cambridge, MA contractor that created the IMP subnetwork for the ARPANET. He lists Tasker, McLinden, Walker, Farber, Stefferud, Ellis, Kirstein, Iseli, Dave Crocker, and Paul Baran at ISI, the Information Sciences Institute of the University of Southern California.
At OFFICE 1 he lists Uhlig and Watson; at MIT-DMS, Vezza; and at Harvard-10, Mealy.

In a message noting the promising potential of this new form of computer networking communication, another early MsgGroup participant, Tasker, writes(11): "Sitting here in the offices of a potential military user I am extremely gratified and excited to see the msg group interacting and that those interactions appear to be converging around real capabilities that I think can be sold to the operational military guy. A scant three or four months ago I never would have even hoped for the current state of affairs and the direction it indicates."

In a similar vein, Ron Uhlig at OFFICE 1 expressed his enthusiastic support for MsgGroup. Describing the informal project he was working on for the Army Materiel Command (AMC), he wrote(12):

For those of you unfamiliar with our "experiment" in Army Materiel Command, we have been using OFFICE 1 for communication among seven of the key managers in data processing in Army Materiel Command (AMC)... In general, we have had the same kind of experience in improved communication that ARPA had when they began using a message system on the network. Continuing major cuts in the Army Materiel Command work force plus some fairly major reorganizations which are now being planned are leading us to give serious consideration to adopting an on-line computer based message system for key managers throughout the command. We are in the early stages of trying to define what such a system needs to look like.... Since we are aiming more at the informal communications we are not terribly concerned with the DOD traditions... Our primary concern is that the message system be easily usable by non-computer science people, some of whom are actively hostile to computers in general. The demonstrations that we have given to various non-computer science, nontechnical personnel around AMC have generally been well received. But one must know far too much "computerese" to use any of the existing systems.

Elaborating on the need for online conferencing, he writes:

We have a strong need for teleconferencing because our key managers are greatly dispersed geographically. The message system that we eventually adopt needs a teleconference capability. We don't want message handling and teleconferencing to be in two separate systems. Because of this we also want to make it easy in the middle of a message based teleconference to link to a databank somewhere in AMC to pick up information which is needed at that point in time. An FTP type capability, simple to use for the novice, would meet the need very nicely.

Concluding his comments, he promises continued feedback:

As we get better definition on our requirements during the next few months I will put additional messages into the network to keep you all current on our thinking. This message is only intended to be introductory.(10)

A subsequent message by Crocker suggested they ignore authentication issues, which, like other security issues, were considered secondary and were avoided for the time being(13): "Given the current state of network/system/mail security, I suggest we ignore authentication issues."

Summarizing the progress made in the first month of the new form of network communication, Steve Walker writes(14):

The MsgGroup was formed by a group of interested people commenting on how message services should appear to users (as opposed to how they should function internally). I'm pleased with the progress of this 'conference'.
I am trying to arrange for Stefferud to serve as a 'paid' organizer so that the group's ramblings can come out in a coherent form. I would encourage your continued participation here and in groups such as Dave Farber's Compcom get together.

Part II
Vision of a New Form of Computer Communication

Documenting the success of the work done by those on MsgGroup and subsequent ARPANET mailing lists, a report prepared for a technical conference in 1979 by several MsgGroup participants observed that there had been important advances in e-mail and conferencing capabilities.(15) The report explained how these achievements were not only a natural outgrowth of technological advances, but also the result of the convergence of communication and computers. "In various current networks of computers," they write, "large numbers (thousands) of individuals and agencies are able to communicate among themselves via message exchange using many different computers and terminals in the process."

This was not an easy feat to achieve. Their report notes the value to people who have access to these computer message services (CMS). They write(16):

Those who have access will be able to communicate through the CMS facilities with others who have access. As the number of connected individuals and agencies grows, the value of being connected will grow. The key source of value lies in the range of easily addressable potential communication.

In the development of the MsgGroup conferencing efforts, several participants describe the unique capabilities that a mailing list like MsgGroup has made available to those participating. For example, in one post, Pickers(17) describes how a mailing list creates a participatory process that is superior to what traditional meetings could make possible. He writes:

Unlike normal conferences, where there are limited microphones, a chairperson and where audience energy tends to wear down, MsgGroup style conferencing never resolves issues much less adjourns. This effect follows naturally from the observation that every participant reenters the discussion by choice, perhaps following a recuperative and regenerative period of rest.

Others on MsgGroup consider the problem of emotional messages (also known as flaming). However, Gaines, in a post(18), proposes that such problems are secondary and should be recognized as "the price we have to pay for an open discussion group where people are free to voice their ideas. We must expect that this whole process produces a fair amount of nonsense." Most importantly, however, he points out:

We are feeling our way in a murky area, and have to expect to make mistakes. Let us judge the MsgGroup by the good ideas that surface, which by the nature of the area have to be expected to be few and far between, but worth the overhead of the other traffic when they arrive.

Emphasizing the unique nature of the contributions to MsgGroup, Charles Frankston, with a login at MIT, warned that analogies between electronic mail and telephone and paper communications must be made very carefully. Electronic mail, he writes(19), "is a new medium and it may not necessarily make sense to use it in the same fashion as existing media, any more than it would have made sense to use telephones in precisely the same fashion as the telegraphs that preceded them." Observing that "electronic mail is currently used extensively for communications which today does go to many recipients," he cites interoffice memos as an example.
"As a new medium I also claim electronic mail has generated new uses not heretofore possible. For example, most of my use of the medium consists of back and forth technical discussions, often among persons widely dispersed geographically. In fact, the great advantage of electronic mail for this sort of use, is that it is easy to simply cc anyone I think might be interested or have information to provide on the current topic." Another report, titled "The Convergence of Computing and Telecommunications System," by Dave Farber and packet switching pioneer Paul Baran, was posted to MsgGroup(20). Farber and Baran were able to collaborate to write the report via the ARPANET despite the fact they lived in geographically different regions of the U.S. In the report, they wrote that "A major change in computer communication is taking place. Tomorrow, computer communication systems will be the rule for remote collaboration." Problems and Benefits In their report, Farber and Baran observed that the falling costs of computing would lead to a situation where certain industries and institutions would feel threatened by the "prospect of obsolescence of their present justification." One such industry they predicted would be publishing. In his study of e-mail, Panko, too, noted a similar barrier to technological development of e-mail and e-mail conferencing. He observed the inability of commercial users to recognize the advantage of e-mail and of the increased communication that e-mail and online conferencing made possible. However, both Panko's study and the report by Farber and Baran emphasized that many others would welcome the new forms of communication that this convergence of computers and communication technology would make possible. Panko pointed to the promising development represented by the 15 million people involved with CB radio in the U.S., out of a possible 70 million households. This promised that a warm welcome would greet the increased ability for communication to be made available via e-mail and e-mail conferencing. Social Issues Become Important Panko documented how government funding of computer science researchers to solve the problem of computer conferencing communication across different computers and different operating systems had yielded great social and technical benefits. He wrote (21): "Historically, computer media were first extensively developed on the ARPANET. Anyone familiar with the Advanced Research Projects Agency (after whom the ARPANET is named) realizes that ARPA was the dominant funder of leading-edge computing during the 1960's. Essentially, ARPA was funding the community of hobby computerists par excellence. Funding was fat and creativity was given free reign during business hours. Moreover, ARPA contractors found their staffs working long overtime, developing space war games, stock market information services, and as noted above, computer mail systems. In other words, hobby computing at a grand scale was the original source of many advanced mail systems today. Computer mail had a strong hobbyist flavor in its use as well as in its origins. Colleagues in artificial intelligence, database design, and other exotic fields used computer mail to build and maintain their community." "Furthermore," he added, "in applications where computer teleconferencing has been successful, discussion has often been free-wheeling and chatty. The longest conferences tend to be breezy and rambling, yet very successful in exchanging ideas and viewpoints." 
Thus he noted the great stimulus given to these e-mail developments by the support of government-financed programs.

In their report, Farber and Baran recognized that social questions would arise as a result of these important new communications developments. And they realized that too little emphasis would be given to examining the social consequences that had to be considered in determining what the future of these developments should be. For example, the issue of how decisions over the new medium would be made wasn't being given adequate consideration.(22) "Little attention," they wrote, "is paid to the 'public interest.' In part, the term defies definition. Is the public interest the interest of the cross-subsidized residential telephone user? Is it the interest of a business which faces a reduced communications bill? Is the public interest to be viewed primarily in the short term, irrespective of long term damage to existing institutions in achieving immediate savings?"

Summarizing the promise for the future that enhanced communication would hold, Lauren Weinstein wrote(23):

The whole point of MsgGroup to me is that we are free to communicate without undue worry about costs, and to borrow a line from the closing episode of the 'Connections' program from PBS, "the easier it is to communicate, the faster change occurs." It is this very change that is creating the systems, concepts and most importantly, the EXPECTATIONS of people for message systems of the future.

TO BE CONTINUED

-----
Note: The notes corresponding to the numbers in the above article are available from the author via e-mail.
_________________________________________________________________
-----------------------------------------------------------------
The opinions expressed in articles are those of their authors and not the opinions of The Amateur Computerist newsletter. The Editors welcome submissions from a spectrum of viewpoints.
-----------------------------------------------------------------
EDITORIAL STAFF
Ronda Hauben
William Rohler
Norman O. Thompson
Michael Hauben
Jay Hauben
The Amateur Computerist invites submissions. Send them to: R. Hauben, P.O. BOX 250101, NY, NY, 10025-1531. Articles can be submitted on paper or on IBM disk in ASCII format, or via e-mail. One year subscription (two issues) costs $10.00 (U.S.). Add $2.50 for foreign postage. Make checks payable to J. Hauben. Permission is given to reprint articles from this issue in a non-profit publication provided credit is given, with name of author and source of article cited.
ELECTRONIC EDITION AVAILABLE
Starting with vol 4, no 2-3, The Amateur Computerist has been available via electronic mail. To obtain a copy, send e-mail to:
ronda@panix.com or jrh@ais.org
Also, The Amateur Computerist is available via anonymous FTP and on the World Wide Web at:
ftp://wuarchive.wustl.edu/doc/misc/acn/
http://www.columbia.edu/~hauben/acn/
http://www.ais.org/~jrh/acn/
_________________________________________________________________
-----------------------------------------------------------------