---------------------------------------------------------------------------
|                           TTTTT  H   H  EEEEE                           |
|                             T    H   H  E                               |
|                             T    HHHHH  EEE                             |
|                             T    H   H  E                               |
|                             T    H   H  EEEEE                           |
|                                                                          |
|               A    M   M    A    TTTTT  EEEEE  U   U  RRRR              |
|              A A   MM MM   A A     T    E      U   U  R   R             |
|             A   A  M M M  A   A    T    EEE    U   U  RRRR              |
|             AAAAA  M   M  AAAAA    T    E      U   U  R  R              |
|             A   A  M   M  A   A    T    EEEEE   UUU   R   R             |
|                                                                          |
|    CCCC   OO   M   M  PPP   U  U  TTTTT  EEEE  RRRR  III  SSS  TTTTT    |
|    C     O  O  MM MM  P  P  U  U    T    E     R  R   I   S      T      |
|    C     O  O  M M M  PPPP  U  U    T    EEE   RRRR   I    S     T      |
|    C     O  O  M   M  P     U  U    T    E     R  R   I     S    T      |
|    CCCC   OO   M   M  P      UU     T    EEEE  R  R  III  SSS    T      |
|--------------------------------------------------------------------------|
|Spring 2001                                               Volume 10 No 2  |
|--------------------------------------------------------------------------|

                            Table of Contents

[1] Editorial
[2] Is the Internet a Laboratory for Democracy?
[3] Ford Model E Program
[4] Battle over Computer Classes
[5] State of the Net in Hungary
[6] A Loss for Netizens: Kerry Miller
[7] Moment of Silence for Michael Muuss
[8] Usenet Archives: Culture Clash
[9] John Locke and the Internet
[10] MsgGroup Mailing List

-------------------------------------------------------------------

[1] Editorial

This issue of the Amateur Computerist returns to a general rather than thematic format. There are a number of articles, however, that explore whether the Internet will be for everybody or whether it will be limited to an exclusive stratum of society. Also, the question of what role the Internet will play in society needs public discussion and examination. Such topics are being ignored by the media, at least in the U.S., at the current time.

Meanwhile there are plans in the U.S. to change some of the nature of the Internet and the means of access to it. While the Internet was originally created to make possible the sharing of human and computer resources, there are commercial desires to make the Internet into a network that will prioritize packets and introduce classes of service, so that the packets of those who pay more will be treated in a privileged way while those who cannot pay more will have their packets treated as second class.

The article in this issue about the cancellation of programming classes at the Ford Motor Company that led to the creation of the Amateur Computerist shows that a change in policy can be carried out in a way that is hidden from the public and contrary to their best interests. The effort of the staff of the Amateur Computerist to continue to support the development of computers and computer education, despite losing the classes, has been an important achievement. Almost 15 years after the computer programming classes were ended at the Ford Rouge Plant, the Ford Model E program has been introduced and is making it possible for many Ford employees to have computers and a form of Internet access. It will be interesting to see what the long-term effect of this program will be.

The talk "Is the Internet a Laboratory for Democracy?", presented at a European Union conference in December 1999, describes the important role that the Internet can play in making it possible for citizens to have some impact on the otherwise difficult problems of their societies. Understanding the potential of the Internet and the goals of its early socio-technical pioneers can help to define a path for those concerned with its continued development.
The article on the State of the Net in Hungary provides a view of how Internet development is progressing in Hungary and of the problems the Hungarian people are encountering in gaining access to the Internet. The article helps in understanding the challenges facing a society trying to develop the Internet and to have it serve a general and socially beneficial purpose. In a similar way, the challenges of Usenet's development and the effect on Usenet of a company archiving the posts contributed by users are explored in "Culture Clash: The Google Purchase of the 1995-2001 Usenet Archive and the Online Community."

In this issue we express sadness at the loss to the Internet and the world of two important Netizens, Michael Muuss and Kerry Miller.

The article on John Locke and the Privatization of the Internet considers the importance of thinking about the way that the Internet was originally created and the benefits that a social goal provided for all users. John Locke's writing offers some helpful ways of understanding how the benefits of such a shared development are important to consider and nourish.

Serialization of the article describing the early development of the MsgGroup mailing list ends in this issue. Reviewing this early mailing list provides a way to look back at some of the early vision of creating an online collaborative process. This can help provide useful perspective toward understanding the current developments and plans for scaling the Internet.

How far have we come and where do we as a society want to go with regard to the future of the Internet? There is a vital need to raise such questions publicly and to hear from a variety of voices of users about how they perceive the path forward. We hope that volume 10 no 2 of the Amateur Computerist will contribute to catalyzing the much needed public discussion on these issues.

---------------------------------------------------------------------

[2] Is the Internet a Laboratory for Democracy?
                          by Ronda Hauben
                          ronda@panix.com

[Editor's Note: Following is an edited and expanded version of an invited talk, "Is the Internet a Laboratory for Democracy: The Vision of the Netizens or the E-commerce Agenda?", given at the European Union NGO Citizen's Agenda Conference in Tampere, Finland, December 5, 1999. The URL for the conference was http://www.citizen2000.net/E2 ]

I am happy to be here today at this EU Conference on Citizens2000, exploring the nature of citizenship at this special time in history when we are about to welcome in not only a new century but also a new millennium. It is interesting that many of the questions being asked at this conference show that we have both the old and the new surrounding us, and that it is not always easy to understand the new, as it isn't something that we are familiar with.

Yesterday, during one of the opening sessions of the conference, the question was raised, both by someone in the audience and by someone on the panel on stage, of why the level of voting in elections in Europe is low. This is true in the U.S. as well. The session raised the important need to go beyond representative democracy in the political forms available to the citizen in modern times. And the question was asked: "What would be the new ways of participating?"

I am delighted to be here today at this conference considering the role of citizens in the coming millennium.
This seminar, "Civic Participation, Virtual Democracy and the Net," is exploring not only the role of citizenship but also a new form of citizenship, one of the newly emerging developments brought into the world by the Internet: that of the netizen. The question I want to raise with my talk is "Is the Internet a laboratory for democracy?" And I hope that we can discuss this question more fully as part of this seminar. Also I want to raise the question of what this shows us about the nature of the Internet and about the new forms of participatory democracy the Internet makes possible.

When I first got access to Usenet, the online newsgroups that are accessible via the Internet, my earliest posts were greeted with comments from people around the U.S. and from other countries like Scotland, Canada and Australia. I was thrilled with the ability to have a serious discussion on a variety of important issues. This was the situation when I first got access to an e-mail account and Usenet in January 1992 from the Cleveland Free-Net. I had heard that Usenet was a collection of newsgroups filled with all sorts of interesting information, but I didn't know how to contribute to it. I wrote out a description of what I was interested in discussing and sent it to the only online newsgroup forum that I could figure out how to access at the time, which was called misc.books.technical.

                       First Post on Usenet

     From: au329@freenet.cleveland.edu
     Newsgroup: misc.books.technical
     Date: 10 Jan 92 07:48:58

     I am interested in discussing the history of economics i.e., Mercantilists, physiocrats, Adam smith, ricardo, marx, marshall, keynes, etc. With the world in such a turmoil it would seem that the science of economics needs to be invigorated. Is there anyplace on Usenet News where this kind of discussion is taking place?

          Ronda

The response to this and my other early posts surprised me. I had posted to the Usenet newsgroup misc.books.technical because this was the only newsgroup I could get access to, and I was new to Usenet, using it from the Cleveland Free-Net. Within a day I had 10 e-mails from across the U.S. and a few from abroad. Do you have any idea why? The newsgroup misc.books.technical was for the discussion of technical subjects, and I was asking about how to discuss economics. People from around the U.S. and abroad wrote me to tell me that I had posted in the wrong newsgroup. Several of those who wrote told me that the newsgroup where I should have posted was "sci.econ." Others wrote, describing how Usenet worked. Even more surprising was that one person actually wrote, encouraging me to post in the appropriate newsgroup and saying to me: "We're all ears!"

I had interested people, and they had acted both to tell me what I had done wrong and to tell me how to make my contribution so that it could be utilized and considered by others. This was an impressive experience for me. Ten people had taken time out from their lives to help me correct a problem, and to make it possible for me to begin to contribute to Usenet and its growing worldwide community of users.

A short time later I found another online forum, a "mailing list." Unlike the newsgroups, which were forums I could go to in order to participate, joining a mailing list meant that messages came to my mailbox and often filled it. This was 1992. The U.S. portion of the Internet, during this time period of 1992-1993, was basically government owned and operated.
The mailing list I had joined was called "com-priv". This mailing list was discussing plans for privatizing the U.S. portion of the Internet and making it commercial. On this mailing list I found U.S. government officials from different U.S. government agencies, including those at the National Science Foundation who were in charge of networking there. There were officers from the newly created Internet Society, and some of the people who had begun to operate, or were hoping soon to operate, commercial access points and who called themselves Internet Service Providers (ISPs).

When I posted on "com-priv" asking why the U.S. portion of the Internet was being privatized, my posts were either ignored or I would get an e-mail asking me why someone who had just arrived in the discussion would have such strong views on this topic. Here my contributions were discouraged or ignored. I wondered why I could find no discussion about the planned privatization. What were the reasons for it to happen? What were the reasons it might be a problem? And why was no such discussion allowed on the mailing list?

I posted on "com-priv" asking a question about the development of the Internet. Also, through e-mail I got in contact with some of the pioneers of early Usenet and the Internet. I eventually left the "com-priv" mailing list, but I had begun to realize that I wanted to understand the origins of Usenet and the Internet, and to understand how the interesting participatory environment I was experiencing online on Usenet had developed. My access to Usenet depended on the Internet, and I wondered why the U.S. portion of the Internet was being privatized.

I soon learned that the pioneering vision of J. C. R. Licklider had inspired many of the earliest networking developments. Licklider was a scientist, a psychologist who had studied the brain to learn how hearing was made possible. Also he had participated in the discussion circles in the Cambridge, Massachusetts area where Norbert Wiener and others discussed the nature and laws governing communication in humans and machines. From this ferment the theory of communication, control and feedback, cybernetics, was developed.

Wiener recognized the importance of determining the nature of the relationship between the human and the computer. Licklider decided to do a study to understand what would be a desirable relationship. As part of his research study, Licklider wrote down all the tasks he did as part of his research. Reviewing the notes he made, he discovered that a large percentage of his time was spent doing routine tasks that the computer could do, and only a small percentage of his time was spent doing the kind of tasks, like thinking about the data he had gathered, that the human was uniquely qualified to do.

Licklider reasoned that what was needed in the human-computer relationship was a partnership, where each partner, the human partner and the computer partner, would work together doing what they each could do best. This would be the most productive arrangement. It would result in the most desirable rapport. Licklider called this relationship "human-computer symbiosis" and he wrote about his experience in a paper he published in 1960 called "Man-Computer Symbiosis". In the paper, Licklider described how there was a need for human-computer interaction in order to achieve the kind of rapport that he proposed was desirable between the two partners in this new form of symbiotic relationship.
Also in the paper, Licklider outlined the kind of research needed to create this interactivity between the human and the computer. This was a time when computers were big machines filling large data processing center rooms. A person wanting to run a program would have to type it on punch cards, and then bring the stack of cards to the data processing center and leave them. The person would come back hours or days later to pick up a printout to see what the program had done and if it had worked. Forgetting a period or a comma would often mean the program had to be resubmitted. Getting the program to run might take several days and numerous trips to the data processing center.

The research program that Licklider outlined in his paper was for a new form of computer architecture that would make it possible for a person to interact directly with the computer, to be able to type into the computer oneself, instead of having to bring punch cards to someone else to feed into the computer. Also Licklider's research program included creating interactive graphics.

At the time that Licklider was doing his research, there was a realization inside the U.S. Department of Defense that these huge batch processing computers were too hard to use for people to be able to utilize the value of the computer. Licklider was invited to set up a research office inside the civilian scientific research agency called ARPA (Advanced Research Projects Agency) that had been created under the U.S. Secretary of Defense. Licklider started the Information Processing Techniques Office (IPTO).

Building on early efforts to determine what methods were needed to support fundamental research, Licklider decided to support the creation of what he called "Centers of Excellence" at chosen universities in the U.S. Licklider supported the creation of Project MAC at MIT and another research program at what is now Carnegie Mellon University in Pittsburgh. These were research programs to study the computer's potential and to explore the human-computer relationship. A particular interest of the research was how the computer could be used for more than arithmetic calculations. In particular, Licklider was interested in how the computer could be developed as a communication device. If you remember, Licklider was a scientist interested in the nature and mechanisms of human communication, both in the research he had done on the brain and how it made communication possible and in the discussions he was part of in the Wiener circles.

As Licklider was helping to set up research centers at universities, he felt it would be important to have a network of these different centers to make it possible for the researchers at the different programs to communicate with each other. In this way they would be able to identify what they had in common and the general nature of the study they were doing. Licklider called this network of leading researchers "the intergalactic network." Since he was interested in facilitating communication among the different research projects and researchers, he knew that a goal of computer research would be to create a computer network.

The earliest efforts at IPTO to create a computer network didn't succeed. Licklider left IPTO in 1964 after almost two years. But the vision he was developing helped to inspire others who became the directors of IPTO to continue to pursue this effort.
Research in interactive computing led to the creation of different communities of researchers able to share a computer and interact with it directly and with each other. This new form of computing was called time-sharing. In 1966, another psychologist, Robert Taylor, became head of IPTO; he also recognized the importance of linking the different time-sharing systems at different universities. He brought Larry Roberts to ARPA to create a packet switching network project, which came to be called the ARPANET, i.e. a network connecting the different ARPA centers of excellence; Roberts later headed IPTO himself.

Robert Kahn, who had worked on the design of the ARPANET and its development at BBN, came to IPTO in November 1972. Kahn began the internetworking project, the effort to make it possible to share computer resources among those who were on different networks. These resources included people collaborating and communicating, as well as sharing programs and other computer resources. Kahn directed the research, from 1972 through the 1980s, that made the Internet possible. IPTO was ended in 1986.

By 1992 the Internet had developed and spread around the U.S., and Usenet had spread through several countries in Europe and was accessible via the Internet or via uucp. A student at Columbia University in NYC, Michael Hauben, had a project to do for a class he was taking in computers and society. He had only recently gotten access to the Internet as a Columbia student. But he had experience as a teenager on local bulletin board systems (BBSs) in Michigan. He had heard that the Internet was a much more extensive communications system and was interested in knowing how far it reached and what it made possible for people to do. He wrote a set of questions and posted them on Usenet and on relevant mailing lists. E-mail responses immediately started arriving, and in a few days he'd received over 60 responses from people around the world. He discovered that those around the world who had gotten access to the Internet were excited about what it made possible. And because they had found that it was something of value, they wanted to contribute to it and to help others get access to it.

What Hauben found was that there was a new form of citizenship emerging from the experience of those who were participating online. One of the conventions used online was to refer to the Net, or to things related to Usenet, as net.xxxx, with xxxx being the term you were referring to. People occasionally talked about a net.cop or a net.citizen. Michael contracted net.citizen into netizen. The netizen he had discovered was someone who saw himself as a citizen of the net. This described the people online who were doing what they could to contribute to the discussion or other needs of the developing Internet and who were active in spreading it to others.

Further research that Hauben and others did revealed other examples of this new form of participatory global citizenship that was emerging from the development of the Internet. There are many examples of this new form of citizenship being developed and taking on some of the important challenges of spreading the Internet to all. Following are a few brief examples of the achievements of netizens:

1) The NTIA online conference held by the U.S. government in November 1994 on the question of universal access to the Internet.

After there was protest against the privatization of the U.S. backbone of the Internet, the U.S. Department of Commerce decided to hold an online conference to discuss the issue of universal access.
A vibrant debate over the privatization occurred in the online conference. Many people explained why the privatization was a poor policy decision and that it would impede the spread of Internet access rather than facilitate it. Others supported the privatization. But the dominant sentiment was that the U.S. government shouldn't change its role in Internet development until it had a plan for how to make access available to all. Though the online conference didn't stop the privatization, it made a record that there was significant public opposition to the privatization policy. And it demonstrated that the form of an online conference was a valuable means of exploring difficult but important public policy issues.

2) The Intel Story

Another example of netizen activity was demonstrated by the way online discussion on a Usenet newsgroup was able to uncover a bug in the first Intel Pentium computer chip. When newspaper reporters tried to ignore the problem, users online not only brought the problem to the attention of the public, but also challenged reporters who tried to make excuses for it.

3) The Communications Decency Act

When the U.S. Congress passed the Communications Decency Act, vigorous online discussion on Usenet and on mailing lists condemned the law, and numerous web sites were blackened in protest. Judges hearing the court challenge to the law wrote a strong decision criticizing the U.S. government for trying to restrict the "global conversation" that the Internet makes possible. An important example of the power of the netizens is the 1996 Federal District Court decision in Pennsylvania overturning the Communications Decency Act (CDA).

4) The U.S. government antitrust decision about Microsoft

A more recent example of netizenship helping to challenge unbridled power is demonstrated by the recent finding by a U.S. court that Microsoft is guilty of violating U.S. antitrust law. The online development of an alternative and better operating system by Linux programmers around the world, along with the online discussion of the problems with Microsoft, helped to provide an environment in which the U.S. government has been pressured to apply its antitrust laws to Microsoft's activity.

5) The ICANN challenge to the future of the Internet

A new and greater challenge has recently developed for netizens who care about the development of the Internet and the fulfillment of the promise that this new participatory medium of global communication holds: access for all. I learned about the problem of ICANN in the spring of 1998 from a Japanese mailing list that I had been invited to participate in. The U.S. government had posted a rule-making procedure on the web stating that it was going to give key functions of the Internet to the private sector, removing them from public ownership and protection. These functions included the IP number system, which provides a unique number to computers on the Internet to make it possible for users to send and receive messages across the diverse networks of the Internet. They included the domain name system and the root server system, which provide the network and computer names that users rely on, as in xxxx@columbia.edu or xxxx@citizen2000.net. A statement by the U.S. government called the Green paper had been put online at a U.S. government web site by the U.S. Department of Commerce.
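[Editor's illustration, not part of the original talk: as a brief aside, the short Python sketch below shows the everyday service the functions just mentioned provide, namely asking the domain name system for the IP number registered for a host name. It is only a minimal sketch; it assumes ordinary DNS service and uses Python's standard socket library, and the host names are drawn from the examples in the talk, so the addresses printed will simply be whatever the DNS reports when it is run.]

    # Minimal sketch: the domain name system maps the names people use
    # (columbia.edu, citizen2000.net) to the IP numbers computers use to
    # route packets across the Internet's many networks.
    import socket

    def lookup(hostname):
        """Ask the DNS for the IP number behind a host name."""
        try:
            ip = socket.gethostbyname(hostname)  # name -> IP number
            print(f"{hostname} -> {ip}")
        except socket.gaierror as err:
            print(f"could not resolve {hostname}: {err}")

    # Example host names drawn from the talk above.
    for name in ("columbia.edu", "citizen2000.net"):
        lookup(name)
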
After learning that there was only a day left to comment on the Green paper, I was able to access it, copy it, read it and write a response. The Green paper presented the Internet solely as a means for e-commerce and provided no means of supporting the global communication that is the important general function of the Internet. I responded with a critique of the Green paper, which I sent to the comments section of the U.S. government web site. I also posted it on relevant Usenet newsgroups and mailing lists on the Internet. A number of people wrote me, including someone from the American Library Association and someone from a newspaper for local governments. They asked me if they could reprint my response in their publications. I later learned that the U.S. government did not want to summarize the comments, as required by a rule-making procedure, and instead dropped the rule but tried to continue with the privatization.

Subsequently there was a meeting of the Internet Society in Geneva, Switzerland. Ira Magaziner, the U.S. President's advisor for policy, announced that the U.S. government was planning to give the DNS system to the private sector (whatever that meant). When I tried to talk with Magaziner about why this would be harmful for the public, he told me to send him e-mail. After I returned home, I sent several e-mail messages to Magaziner and finally got an answer. Also, after several e-mails he agreed to talk with me by phone. In response to my questions, Magaziner told me that there were two problems he was solving with his privatization plan:

1. The complaint by some trademark holders that they weren't being protected adequately from others getting domain names that were similar to their trademarks.

2. The desire of the international community to participate in the administration of the Internet and its essential functions.

When I told Magaziner my objections to the U.S. government plan to privatize essential functions of the Internet's infrastructure, Magaziner told me I would have to give him a proposal putting my objections into operational form if I wanted him to consider them. I felt that the second problem, the desire of countries around the world to participate in Internet development and administration, was the primary problem to be taken up, and that the trademark problem could only be solved after understanding the other problem.

I wrote a proposal to create a collaborative scientific prototype to document the administrative functions to be taken on, and to create an open online process to involve the online community in developing this prototype. Also I proposed that a task of this cooperative effort would be to identify the problem to be solved, to identify the vested interests that would make it hard to solve, and to make a proposal toward determining a solution. I spent a week writing a proposal based on my research on how Usenet spread abroad via a cooperative process of development among members of the European, Australian and Asian Unix communities. The process that I was proposing, of a collaborative international activity to identify the problem, was a process I felt would provide a prototype to help solve the problem. Magaziner also indicated a concern of industry that different countries would pass different laws related to the Internet.

Magaziner promised me a response to my proposal; however, instead of his calling me to discuss it, a few weeks later I heard from Becky Burr of the NTIA in the U.S. Department of Commerce. Meanwhile, the U.S.
government contractors who were then administering the domain name and root server system functions, a company called Network Solutions Inc. (NSI) and the Internet Assigned Numbers Authority (IANA), were negotiating under U.S. government oversight to create a private entity to provide for what was called "industry self-governance" of these controlling functions of the Internet. NSI at this point in time was owned by SAIC, a large privately held defense contractor corporation formed originally by a number of people from the U.S. Department of Defense. It held the NSF contract for administering domain names. IANA was created by the Information Sciences Institute (ISI) at the University of Southern California, which had a DARPA contract to administer significant parts of the Internet's infrastructure. Both DARPA and the NSF are U.S. government agencies. ISI had been created through contracts with DARPA.

Supposedly negotiations between NSI and IANA broke down, and a proposal was presented by IANA to form a private sector corporation to carry out the privatization of these publicly owned and controlled functions. IANA's proposal was prepared by a supposedly pro bono lawyer from one of the largest U.S. corporate law firms. How a proposal prepared by a U.S. government contractor was a private sector proposal remains a mystery. But perhaps the fact that it is illegal according to U.S. law for the U.S. government to create a private sector entity to conduct government functions can help account for the Orwellian nature of the terms used to describe a public entity as the creator of a private sector proposal. (Also, the head of IANA during this period had been threatened by Magaziner with prosecution in connection with a dispute that had developed over whether NSI or IANA would control the DNS root server system.)

The IANA proposal was to privatize essential functions of the Internet's infrastructure. These would be put under the control of a board of directors. A private sector nonprofit company was to be created under California's nonprofit corporate law. The company would be called the Internet Corporation for Assigned Names and Numbers (ICANN). The U.S. government was to transfer to this corporation essential functions of the Internet, including the root server system, the IP numbers, the domain name system and the protocol development process (IETF).

Originally there were three proposals submitted to the U.S. Department of Commerce: the proposal I had been asked to submit to Magaziner, the IANA proposal, and an alternative proposal for a private corporation that was submitted by several people who had been involved in the IFWP mailing list and who called themselves the Boston Group. Later a fourth proposal was also submitted. The U.S. government allowed a very short period of time for public comments on the proposals and then declared the IANA proposal to create ICANN as its choice. All the government did to consider the proposal I had submitted was to have a U.S. Department of Commerce official call and talk to me on the telephone for about 20 minutes. She asked if there was something from my proposal that could go into the IANA proposal to represent my concerns. When I said that my proposal required government support for scientists and collaborative scientific activity, she didn't explore why that was so, but ended any contact. Afterwards she sent me an e-mail message thanking me for my "constructive participation."
During this period there was a hearing in the U.S. Congress, held by the U.S. House of Representatives Subcommittee on Basic Science and the Subcommittee on Technology, about what was happening with the DNS system privatization. Some of the people opposing the ICANN proposal tried to contact Congressmen on the subcommittees or their staffers. When I asked the staffers if I could submit testimony for the hearing, I was told that the committee would then have to let everyone submit testimony. They asked me to submit questions that the Congressmen could ask of those they had invited to testify. I submitted several questions, including the question of by what authority the U.S. government was transferring these publicly owned and controlled essential functions of the Internet to the so-called "private sector". I maintained contact with the staffers, often able to use e-mail to do so, along with the telephone. Two days before the hearing I was told that I could submit testimony into the record. I sent testimony via e-mail (which became part of the published record of the hearing) and I also attended the hearing. Several others who were on the IFWP mailing list also attended the hearing and also submitted questions or testimony that was later included in the published record of the hearing.

6) Mailing lists and ICANN

An important means of participation in these issues has been the online mailing lists and Usenet newsgroups relevant to the topics. There have been posts, and sometimes discussion, in these online forums about what has been happening with the plans of the U.S. government to carry out this privatization of the essential functions of the Internet's infrastructure. On the Netizens Association mailing list there was an ongoing, long-term discussion of the need to let the public, both those online and off, know about what has been happening in the Internet privatization process the U.S. government is carrying out.

For quite a while the U.S. press articles on ICANN activity were little more than press releases for the U.S. government plan. Finally, after a large meeting in November 1998 in Cambridge, MA, where many questions were asked about the privatization and much protest was expressed about what was happening, a few accounts reporting that there was criticism of ICANN were published in the online and even in the print press.

After a number of posts on the Netizens mailing list about problems with the creation and development of ICANN and the way it treats users, there was a serious discussion about the need to break through the lack of media coverage of the problems with ICANN. A Hungarian freelance writer, John Horvath, wrote a long and detailed article about the problems of the privatization and the lack of information for the public about what was being done. Horvath's article was printed in the online German journal Telepolis. Included in the article was a criticism that the European Union officials involved were not protecting the public, similar to the criticism of what U.S. government officials were doing. A European Union official wrote complaining about the article and saying that the writer needed to do more research on the European position on the privatization. The fact that there was such dialogue, which even got printed in the forum section of an online journal, was important. Horvath's article was referred to broadly online. (It was reprinted in the Amateur Computerist. See: http://www.ais.org/~jrh/acn/ACN9-2.txt ) Other mailing lists carried this discussion.
One such list was the IFWP mailing list. This mailing list was mainly made up of people who favored the privatization but had disagreements about how it was being carried out. However, there was a long and sustained discussion of some of the issues on this mailing list during the course of the 1998-1999 period. The mailing list has now been ended.

Another mailing list carrying some discussion of the ICANN controversy was the Telecom Digest, which was a moderated mailing list and also a moderated Usenet newsgroup. The moderator of this mailing list, Pat Townson, had directed the mailing list for a number of years, and it was highly regarded by many online. Townson expressed his concern about what was happening and felt that those who were online should know what was going on and have a chance to consider the effect that ICANN's privatization might have on their future net access. He posted some of the articles sent to him by the critics of ICANN and requested that ICANN advocates like Vint Cerf or others respond. He also received some responses he was told he couldn't post. During this period, the International Telecommunication Union (ITU) was providing some minimal financial support for the mailing list. An official of the ITU wrote to the list, expressing his displeasure with the digest carrying discussion critical of ICANN. The official indicated that the problem was that the critics weren't reliable and that any time one does something there will be criticism. He suggested that if the mailing list continued to carry such discussion, it would put in jeopardy the funding that was received from the ITU.

In September 1999, the organization Computer Professionals for Social Responsibility (CPSR) held a conference and invited a few of those opposing ICANN, as well as ICANN advocates, to be speakers at the conference. The conference was sponsored by the Open Society Foundation (Soros Foundation) and the Marino Institute Foundation. CPSR also invited Ralph Nader, who presented a proposal for a multilateral agreement of different nations to support ICANN. A response to Nader's proposal was posted on mailing lists critiquing it for not challenging the way that the U.S. government had created ICANN to transfer essential functions of the Internet's infrastructure from the public sector to the control of a private entity created illegitimately by the U.S. government. Also, the critique challenged Nader's claim that online users are to be regarded as consumers. Portraying users and netizens as consumers limits their rights and their ability to function as netizens, presenting them instead as those who are involved in buying what others sell.

There have been a number of other important developments in the ICANN controversy. Letters have been sent to executive branch officials by U.S. Congressmen asking for explanations of behind-the-scenes government activity to create ICANN and to form the Governmental Advisory Committee (GAC), an advisory body of government officials to ICANN. Congress asked the U.S. General Accounting Office (GAO) to investigate the secret process which resulted in the choice of the interim board members for ICANN. The GAO was asked for an opinion on the authority of the U.S. government to create ICANN and to transfer public property to ICANN, along with the authority to fund U.S. representatives to attend the GAC meetings. Other government agencies have become involved in trying to challenge ICANN's closed and arbitrary structure.
For example, advocates from the U.S. Small Business Administration (SBA) have complained to ICANN about the lack of procedural rights for small business owners and others to participate in ICANN's activities. A few recent books have been published which note the problem of privatizing the public Internet functions. (See, for example, Rich Media, Poor Democracy by Robert McChesney, University of Illinois Press, 1999, p. 134.)

What are the lessons one can draw from the experience of the past year and a half participating in the fight against the privatization of essential Internet functions? Is the Internet a laboratory for democracy? I have found that the Internet provides important ways for citizens to participate in and extend democracy. Also, in the process of participating online, I have learned something about the nature of democracy. I have been able to communicate with other citizens in the U.S. and netizens around the world on issues of public concern. I have learned how the principles behind the creation of Usenet and of the Internet are important democratic principles. Usenet was created to make it possible for people to communicate. The Internet was created for a similar reason, but phrased in a slightly different way, i.e. to remove the constraints on communication. It was also created to facilitate resource sharing across diverse networks.

I have gotten help and support from netizens abroad to be able to be a citizen at home. I have gotten help from other citizens in the U.S. to be able to contribute to netizenship abroad. There are new democratic forms and concepts being pioneered by those concerned with the development of the Internet and of Usenet that will help in the battles ahead. I have come to the conclusion that the Internet is a laboratory for democracy. Those who are willing to contribute to this exploration will contribute to the further development and spread of the Internet and will gain in their ability to be better netizens and more effective citizens. But it isn't easy, and we need improved ways to support each other and to work together.

In summary, I want to describe a recent interaction that the Internet has made possible. In the process of taking up the challenges of the ICANN controversy I was invited onto a mailing list. For a while posts to the mailing list were encouraged, but after I challenged Nader's plan to represent users as consumers, the moderators of the mailing list said they weren't going to post much on ICANN any longer. They continued to post the articles they wrote, but they didn't post another post that I sent, even when it was not about ICANN. Also, a talk I had planned to give was cancelled. I wrote a post about how other talks I was scheduled to give had previously been cancelled, and how articles I had been invited to write for publications, including a publication by the Internet Society, were subsequently pulled from publication. The moderators of the mailing list that wasn't posting my articles wouldn't post this, but I posted it on another mailing list that I still had access to.

Someone from Norway wrote me in response, describing the frustration in his country with the U.S. corporate effort to dominate the world using the Internet as the mechanism for e-commerce. Also he described some of the activity of those in the Linux movement in different countries to create an alternative to Microsoft's operating system. He raised the question of whether something like that is needed for the Internet as well.
In the process of the discussion with him, I was reminded of Licklider's vision for the future of the network. In an article published in 1968 with Robert Taylor, Licklider wrote about the vision for the network that was then only just being planned. They wrote (1):

     For the society, the impact will be good or bad depending mainly on the question: Will 'to be on line' be a privilege or a right? If only a favored segment of the population gets a chance to enjoy the advantage of 'intelligence amplification,' the network may exaggerate the discontinuity in the spectrum of intellectual opportunity. On the other hand, if the network idea should prove to do for education what a few have envisioned in hope, if not in concrete detailed plan, and if all minds should prove to be responsive, surely the boon to humankind would be beyond measure. Unemployment would disappear from the face of the earth forever, for consider the magnitude of the task of adapting the network's software to all the new generations of computers coming closer and closer upon the heels of their predecessors until the entire population of the world is caught up in an infinite crescendo of on-line interactive debugging.

The Linux movement provides a material example of those carrying on Licklider's vision as they collaborate and work together to debug the developing software to make it possible for the Internet to spread and develop. But Usenet and Internet pioneers I have known have taught me that there is another form of debugging that is equally important.(2) That debugging is to identify and solve the problems of the Internet's continuing development. Just as the Internet provides the means to participate in the creation and development of Linux, it also provides the means to participate in the creation and development of its political and administrative infrastructure. This is in some ways a harder challenge, but to fail to take it up is to leave the vested interests free to stifle and then end the future development of the Internet as a two-way interactive communications medium. They want to replace it with a centrally controlled and e-commerce directed commercenet.(3) Their slogan is "making the world safe for e-commerce."(4) Netizens need a slogan as well, one which will indicate the need for the continuing interactive participation of users in the growth of the Internet and for the democratic participation of citizens and netizens in solving the problems of present and future Internet development. Perhaps such a slogan is "the Internet is a laboratory for democracy for ever more participatory debugging to identify and solve the problems of future Internet development."

--------------------
Notes

(1) From In Memoriam: J. C. R. Licklider 1915-1990, Aug. 7, 1990, p. 40, reprinted by the Digital Systems Research Center; originally published as "The Computer as a Communication Device," in Science and Technology, April 1968. They also write: "First, life will be happier for the on-line individual because the people with whom one interacts most strongly will be selected more by commonality of interests and goals than by accidents of proximity. Second, communication will be more effective and productive, and therefore more enjoyable. Third, much communication and interaction will be with programs and programming models, which will be...both challenging and rewarding.
And, fourth, there will be plenty of opportunity for everyone (who can afford a console) to find his calling, for the whole world of information, with all its fields and disciplines, will be open to him, with programs ready to guide him or to help him explore."

(2) Examples of such debugging of problems include the role played by Mark Crispin on the TCP digest in the 1982 period before the cutover to TCP/IP on the ARPANET. Crispin noted that TCP/IP was a good protocol but that milestones for the cutover had been planned even though the needed implementations for the PDP-10 computers hadn't been developed. Similarly, on early Usenet a number of the Usenet pioneers encouraged open discussion of problems and changes, as they maintained that Usenet was a users' network and that unless users participated in the decisions, the decisions would not be good ones.

(3) ICANN is an example of creating a centrally controlled management form to centralize control over the Internet in a few private hands. The U.S. government claimed that it would transfer the publicly owned central functions of the Internet's infrastructure to a privately owned and controlled ICANN by September 2000. It did not succeed in doing so, and as of Spring 2001 the U.S. government is still involved in the contracts with ICANN determining the administration of these functions.

(4) The General Accounting Office (GAO) report about ICANN and the U.S. Department of Commerce, issued in July 2000, noted that there was a problem with the U.S. government plan to privatize the publicly owned functions of the Internet's infrastructure. The report pointed out: "Under the Property Clause of the Constitution disposal of government property requires statutory authority. U.S. Constitution, Art IV, SS 3." (p. 26) The report noted that the Executive branch of the U.S. government could not simply transfer public property to a private company; there were laws and constitutional obligations regarding federal property and its disposition.

--------------------------------------------------------------------

[3] Ford Model E Program
                         by William Rohler
                        wrohler@peoplepc.com

On February 3, 2000, Ford Motor Company Chairman Bill Ford, Chief Executive Officer and President Jac Nasser, and UAW President Stephen Yokich announced that Ford would offer all of its eligible active employees a computer, printer and Internet access for home use for a fee ($5.00 per month in the U.S., for example) for three years. All active, full-time hourly and salaried employees of Ford Motor Company worldwide, including Ford Credit and Visteon, are eligible.

"Technology and the Internet, in particular, are changing the way we do our business," said Jac Nasser. "Providing home computers for employees is a tremendous step in the right direction of connecting all of our employees with what's going on with the company, the way we run the business and the way we communicate with our markets."

"The intent of this program is to bring the capability of our employees up to the highest level," said company Vice President and Chief Information Officer Jim Yost. "In order to do that, they have to have access to the Internet. Not only to learn more about Ford, but about the customer and e-commerce in general."

The program was first called the Ford Employee Connectivity Program (FECP), but on July 28, 2000, the name was changed to the Model E program.

"Model E is the modern equivalent of Henry Ford's $5 a day wage, a breakthrough approach to empowering a workforce," said Jac Nasser during a speech on July 27 at the National Press Club in Washington, D.C.

The computer is a Hewlett-Packard with an Intel Celeron 500 megahertz (MHz) processor, 64 megabytes (MB) of RAM, a 4.3 gigabyte (GB) hard disk drive, a CD-ROM drive, a 15-inch monitor, speakers, a modem, software, an HP 640 Color Inkjet printer, and Internet access from UUNET, an MCI WorldCom company based in Fairfax, Va. PeoplePC is coordinating the overall program for Ford. The software package will include word processing, spreadsheet and antivirus programs, and other extras, including the Encarta encyclopedia and Quicken financial software.

The program is optional, and the computer package will become the property of the employee and will be covered by a three-year parts and service warranty through PeoplePC. Employees will not be restricted in what Web sites they are able to access, nor will they be monitored by Ford or by PeoplePC. Employees will access the Internet through a special portal that will allow them to customize their options, preferences and shortcuts. The portal will offer direct links to many Ford services and information, and it will be customized for different regions of the world.

"When employees choose to use the portal, they can get work-related information," explained Yost. "But we're not limiting it to that use. We want them to get on the Web and use it like our customers would."

Every employee will get two e-mail addresses and tools to construct his or her own Web site. The program is not available to retirees or part-time employees. There are upgrade packages for people who want to pay extra for them.

Model E program Director Steve Paschen said, "Each country we go to has a different tax structure. In some countries we go to, this is considered a benefit, and certain countries tax benefits, and they either tax the employee and/or they tax the employers." That raises the cost of the program to potentially prohibitive levels and has slowed the rollout. However, progress is being made in several countries, including France and Germany, which should soon permit Ford employees there to begin logging on.

Paschen also said, "In the U.S., people have just gotten their computers within the last few months. They're just getting online; they're just getting comfortable with it. So I think we've got a huge opportunity in front of us to start to really provide them the tools and the information they're looking for."

"Those are the kinds of things that I think we can really have a lot of fun with and we can really make a huge difference," Paschen concluded.

----------------------------------------------------------------------

[4] The 1984-1987 Battle over Computer Classes

This is an historical account of the fight that developed over worker access to computer programming classes at a large auto company in Michigan in 1983-1987. This story contains valuable lessons about the problem U.S. workers face in trying to obtain education in the workplace. These events occurred at the Ford Motor Company's Dearborn Engine Plant.

Schoolhouse in the Factory

The story starts with the massive layoffs in the auto industry in the early 1970s. In response, workers determined that they would fight for shorter hours of work so that more workers could be employed. From 1973-1979 U.S.
auto workers won shorter working hours in their contracts in the form of individual days off, called 'paid personal days'. Together with the reduction in hours of work, the auto companies undertook major investment programs to update their technologies. Describing this in a 1994 talk, one Ford management spokesperson explained:

     By the end of 1983 the North American auto industry had spent an estimated $80 billion on retooling and renovating its manufacturing and assembly plants (more money, by the way, than it took to put a man on the moon). The Dearborn Engine Plant has participated fully in this industry-wide revolution. Over a two and one-half year period, 1978-1981, we spent more than $590 million to transform the plant from an antiquated producer of V-8 engines into one of the most modern four-cylinder engine manufacturers in the world. And the improvements continue. Last month we completed the conversion of our plant from a producer of 1.6 liter to 1.9 liter engines.... In 1980, we installed state-of-the-art automation that was hard-line, or not easily adapted for new applications. Since 1980, we have increased dramatically our deployment of robots and flexible automation units. By 1990, we expect to have 70 such units....

Along with this new technology, the 1982 UAW-Ford contract included a paid education benefit for auto workers. Under what was called the Nickel Fund, workers gave up a raise of five cents per hour to contribute to an education fund. Describing this fund, the same Ford official explained:

     At the Dearborn Engine Plant our education facility includes the UAW-Ford Employee Development Center, which teaches basic literacy skills and high school equivalency courses and the Learning Center, which provides basic and advanced technical training.

A basic reference document for this and subsequent contracts was a University of Michigan evaluation report. The report described the creation and development of the Employee Development Center at the Dearborn Engine Plant, or what was called less officially the Schoolhouse in the Factory. The study explained that Ford workers desired "education" as opposed to "training" and distinguished between the two. Addressing workers' views on education, the report said:

     An analysis of their remarks reveals that no matter how stated, regardless of context, and despite specific topic of conversation, these individuals believe that education (as distinguished from 'training') can liberate them, can enrich their lives, can be the vehicle which will allow them to do and accomplish things they believe are important to them. Education has an irresistible appeal. While many of the participants spoke of the 'utilitarian' implications of education, what was most evident was how deeply they felt about the 'meaning' of education. Education represents an idea, a touchstone which literally has become a matter of faith. (...) In their remarks, these men displayed a very sophisticated ability to distinguish between 'education' and 'schooling'...The single statement which perhaps best conveys this message came from a man who is rapidly approaching retirement, 'Overall, I just think it's one of the best things that's happened to Ford's and I've been here 15 years....to have a set-up like this where you can right here on the job you can do anything.'

The report suggests that workers enrolled for both practical reasons and broader purposes.
It explains:

     Participants describe their reasons for enrolling in such terms as 'I wish to improve myself'...'I'm looking ahead'... At the same time the participants reported that education is essential for gaining insight into their lives and providing direction for the future. When discussing reasons for participation, the participants invariably indicated that the decision to enroll was a personal choice, an act taken independent of any consideration related to company or union interest in the EDC.

The fifty percent drop-out rate that occurred at the center was similar to what occurred in adult education across the U.S., but the report states "No one reported withdrawing because of unhappiness with the program or staff or because educational expectations were not being met." Reasons given for choosing the DEP program were "the ease and convenience of continuing their education at an in-plant educational facility. Participants reiterated the theme constantly. Many participants acknowledged that they could have gone to their local public school program and received similar services but it was 'too much trouble.'" Being able to go to the Center before or after work or during lunch "was a powerful inducement leading to enrollment."

The report also explained "a clear orientation to learning is present among the participants. While this is not to deny the validity of utilitarian outcomes, most enrollees hold a broader view of the meaning of their participation in the program." Among the reasons for participating was helping children more readily with their homework. Also, "participants sense that enrollment in the program will help them become more flexible regarding future employment and they feel that education is necessary to help them keep up with the changing technology of their jobs."

The report continues, "Participants constantly expressed concerns about the future, about the need to be prepared, to be able to cope with an increasingly complex society and a constantly changing work place. Education was viewed as the basic means for preparing for the future and for sustaining an orderly transition into the future."

Referring to the computer classes offered at the Schoolhouse in the Factory, the report explained that "participants in the computer classes are primarily skilled trades workers with at least a high school diploma, and usually some advanced training." It said, "Participants in the computer classes, while commenting favorably on the class, frequently expressed the opinion that too many enrollees were admitted for the number of computers available...." Concerning the teaching staff, it found that "Participants believe that staff members view and treat them as self-reliant, autonomous adults, an attitude they frequently contrasted with the way they were viewed and treated in their roles as workers...."

Among the study's conclusions were:

  * The response to the computer courses was enormous. It would make sense to have these courses ready to go when a center opens to attract attention....

  * More course offerings for workers with higher educational skills. Many of the skilled-trades people we interviewed expressed an interest in further educational programs through the EDC for the same reasons as production people enrolled --proximity, convenient hours, pleasant surroundings etc....

Ford received this evaluation in June 1984. A new contract incorporating these recommendations was prepared to govern the period of September 1984 - June 1985.
The school established under this contract employed a full-time program specialist and three certified teachers assigned to the basic skills program, each working approximately 22 hours per week. Further, a computer programming teacher offered two courses: Computer Literacy I and II. Although the course title emphasized 'literacy', these courses were at reasonably difficult levels. For example, after requiring familiarity with BASIC, the course description for Computer Literacy II read: "Topics covered will be...nested for/next loops, one and two dimension arrays, writing programs, on error statement, trace and no trace, bubble and binary sorts, flow charting, math functions, string functions and data types, sequential and random access files, hi resolution graphics and shape tables, an introduction to the Apple's Monitor Mode." Facilities were small, with one computer room equipped with several computers. Rouge workers greeted the computer classes enthusiastically. There was much interest in computers, and especially in programming. Popularity was such that workers recommended the classes to their fellow workers and the program grew. Interest was sufficient to open summer classes in 1985. Also, workers requested that additional advanced classes be offered, that there be a time when the computer classroom was open outside of class time, and that there be an instructor available in a lab setting so they could come outside of class or if they had to miss a class. Visitors from around the U.S. and the world frequently came to see the Schoolhouse in the Factory and the computer classes.

Decline, Resistance, and Shutdown

In Fall 1985 the conditions at the Schoolhouse in the Factory suddenly changed. At first, union and company officials wanted to know what was being taught in the computer classes. The Schoolhouse director showed them syllabi and the class text. Then the director told staff that they would not be allowed to distribute a brochure she had prepared announcing the computer classes, along with the other course offerings, throughout the Rouge plants. This brochure, called "It's Your Nickel", was only to be distributed inside the Dearborn Engine Plant. She was to create a different brochure to distribute Rouge-wide that could not mention the days and hours when computer classes were to be offered. Further, the union newspaper would include the computer listings at the Dearborn Engine Plant when its new issue came out, at an unspecified date. But the union newspaper appeared with only a vague notice of the computer classes, and several classes were cancelled as a result. From then on until classes ended in February 1987, there was a battle to continue the computer classes. On May 13, 1986, the following petition was sent to the UAW Local 600 office: Chairperson at the Dearborn Engine Plant: May 13, 1986 We, the students of the computer training classes at the Dearborn Engine Plant training facility, have been informed there will be no summer classes and possibly no fall classes. There are at least 29 people interested in summer computer classes. And as many interested in fall classes. We, the students of this computer class, would like to know why it is so hard to continue education in computers. We have been experiencing for the past two or three semesters frustration in continuing education and advancement in computer training. When polled about advanced classes, we desire them, but then they are not offered. We would like to know why they are not offered because we want to continue and advance.
(It was also printed in the union paper which led us to believe there were summer classes available to computer students.) We await your answer so that we may register for summer classes when they are offered. Concerned students of the computer classes, (signed by over 20 students) Also, computer students wrote, passed out, and posted a leaflet at the Ford Rouge Plant. The leaflet said: UAW members have been fighting for 1-1/2 years against attempts to cut out the classes in computer programming held at the DEP. UAW members contribute 17 cents an hour straight time and 50 cents an hour overtime to have these classes available. The most critical point for UAW members is to have training in high technology. How can UAW members be trained in high technology by cutting computer classes out? We contacted the Chairman in the Engine Plant, and he didn't give any result. We contacted the management officials in charge of training in the Engine Plant. We contacted the President of Local 600, and the officials in charge of the program at Ford Motor Co., and at the UAW. We sent letters everywhere. We are tired of being denied benefits we're entitled to. We're tired of being shuffled from one person to another so as to cover up who we're fighting. We don't know what classes are being offered from one course to the next. We ask for programming in BASIC and they offer PASCAL. We ask for PASCAL to be continued, they offer advanced BASIC. There are no rights to grievance how the monies are being spent. But the letter of Understanding (in the 1984 UAW-Ford Contract) says: 'In view of the Company's interest in affording maximum opportunity for employees to progress with advancing technology, the Company shall make available appropriate specialized training programs for employees.' But this is not being provided... Despite the efforts of workers to make the problems known to Ford management and union officials, and despite efforts to protest the ever-worsening conditions via student and staff letters, those contacted refused to investigate the problem. Instead, students and staff faced retaliation threats and job harassment. By February 1987, no further computer classes were scheduled at the Schoolhouse in the Factory and classes ended. Realizing that computer classes would no longer be available, several students and their teacher decided to work on a newsletter, the genesis of the Amateur Computerist. As our first issue in February 1988 explained: This newsletter is to inform people of developments in an effort to advance computer education. Workers at the Ford Rouge Plant in Dearborn, MI were denied computer programming classes. There was an effort by administrators of the UAW-Ford program at the Dearborn Engine Plant to kill interest in computers and computer programming. We want to keep interest alive because computers are the future. We want to disperse information to users about computers. Since the computer is still in the early stage of development, the ideas and experiences of the users need to be shared and built on if this technology is to advance. To this end, this newsletter is dedicated to all people interested in learning about computers. ---------------------------------------------------------------------- [5] The State of the Net in Hungary by John Horvath jhorv@helka.iif.hu As the seconds tick the time out for the second millennium, Hungary is still playing catch-up on the long and winding infobahn. 
High telephone charges coupled with metered rates for local calls make domestic access still a luxury for many. In addition, the country's digital infrastructure is still inadequate to handle large volumes of traffic and high bandwidth applications. Yet despite these and many other shortcomings, Hungary has made some progress over the past few years. The Internet has finally broken out from its isolation as a seedy and potentially dangerous place for youngsters and society at large. Indeed, even the extreme fringes of the political spectrum now have a presence on the Internet. The Internet as a source of mass media has gained ground in the past year, albeit still very slowly. Conventional media -- radio, television, and print -- have increasingly made references to the "new" media. In fact, many have their own online presence. Shows dealing specifically with the Internet have also been on the rise. As for e-commerce, although still in its embryonic stages, it has started to become more prominent. This year saw a big boost for the commercial Internet as the country's largest savings bank, OTP, launched an array of online services. This has taken place in conjunction with the rise of other business activities, like ordering a pizza online. As a result, advertising is beginning to spill over from "cyberspace". Many advertisements placed within traditional venues now include a web site or e-mail address. Coupled with all these advances, there has been an exponential rise in native language content. This is directly related to the growth in user demographics which, although still well below the European average, not to mention North America and Japan, has risen substantially. The latest demographic figures from IDC show that there are 650,000 Internet users in Hungary and this is expected to increase by almost 30 percent in the next three years. Much of this can be attributed to the government's effort at wiring the schools to the Internet. Known as Sulinet, the program has introduced many students, teachers, and administrators to the world of computers and networking, and has offered them an opportunity to go online that they otherwise would not have had. There have also been several private sector initiatives at broadening the user base. Cable access has made its appearance, providing more reliable service and higher bandwidth. Not only this, but with cable threatening the ISP position of the country's leading telecom provider, MATAV, the access market has become more competitive, to the benefit of consumers. In addition to this, the post office has been busy establishing "telepost" offices in various communities. In conjunction with the usual postal services, these offices enable people to use computers and the Internet, providing them with e-mail and a host of other services. A total of 17 such offices are presently scattered throughout the country, with plans to open another 30 offices next year. Although the progress the country has made over the year to bring the Internet to the average citizen is noteworthy, it is still far too early to proclaim that the Internet revolution has "taken off" in Hungary. On the contrary, the country still faces many challenges. Unless these are addressed, the potential of the Internet will stagnate. One of the major problems still faced, not only by Hungary but also by other countries of Central and Eastern Europe, is that the area is still being used as a dumping ground for redundant technology.
The new computer system at OTP, for example, which was purchased and implemented in the mid-nineties, is outdated by at least a decade. This impediment of redundant technology, due either to ignorance or economic considerations, is not limited merely to Hungarian enterprises, however. A Dutch bank operating in Hungary, which last year implemented a new retail card system, only found out at the beginning of this year that its new system was not Y2K compliant. On the commercial side of things, although the presence of the Internet is obvious in advertising and marketing strategies, Hungarian companies (especially SMEs) are still unable to see the advertising potential or to fully grasp the dynamics of online advertising. On the other hand, those that do are often behind the times, perpetually caught in a cycle of playing catch-up with western trends. For instance, although many companies have now begun to make a shift toward the Internet, the new trend in the U.S. is to actually "flee the dot-com". As Keith Dawson writes in his weekly log, Tasty Bits from the Technology Front (see: http://tbtf.com/blog/1999-11-07.html as well as http://interactive.wsj.com/articles/SB942276734846706339.htm and http://www.msnbc.com/news/333919.asp), "focus groups are beginning to show that average folks don't remember the companies, don't like the ads, and resent the ever-present image of the greedy twenty-something zillionaire." Meanwhile, telework remains a remote and wishful concept. Despite increased traffic congestion and pollution in most of Hungary's major cities, especially Budapest, it's not economically feasible to have people work from home, given the poor state of the telecommunications infrastructure -- not to mention the cost. Moreover, most Hungarians still work along the lines of an industrial and agrarian economy, as opposed to a knowledge-based one. As for e-commerce, while making a grandiose appearance, it's caught in an awkward predicament. To be sure, e-commerce in Hungary will grow but, if present trends continue, its influence will be limited. The main reason is that many are wary of initiating a system for serious online transactions. Even non-monetary transactions, such as booking and reservation services, are not widely available. This is because there lingers a fear and mistrust of online services. For instance, while the ability to order and pay by credit card over the telephone has a relatively long and established tradition elsewhere -- notably the U.S. and Canada -- it's still a concept very much alien to the Hungarian economy. A less than extensive user base is an additional problem. Hungary remains one of the most expensive places in Europe for Internet use. Although the increase in the number of users may look impressive, it still represents less than 7 percent of the population, with only 14 percent of all PCs in Hungary connected to the Internet. While efforts have been made to get more people online, access is still hindered by high telecommunication charges. This also goes for cable, which costs about a quarter of an average Hungarian's salary. A study commissioned by the OECD confirmed that high connection fees coupled with the high cost of local telephone calls are impeding the uptake of the Internet in Hungary. Unfortunately, this situation looks set to worsen, with a 20-40% rise in telephone charges expected in the new year.
Alternative efforts to entice more people online, such as the post office's telepost offices, are not only expensive but also suffer from inconsistent and lopsided development. In the Galga valley, for example, a region about 40 km east of Budapest, a small village has a telepost office while neighbouring towns and villages, which are larger and more strategically located, don't. As for the social aspect of computer networking, here, too, formidable challenges and obstacles exist. While the Sulinet program may have succeeded to a certain extent in introducing many to the medium, students and teachers are, nevertheless, not encouraged to understand the medium, but are taught to simply use it. Similarly, for the community of users as a whole, the concept of a "net community" is lacking somewhat. Most know nothing about ICANN, much less have any understanding of, or even interest in, the issues surrounding the future of the Internet. Another challenge faced by Hungarians embracing the Internet is coming to view computer-mediated communications as an alternative source of information. Unfortunately, the Internet is still regarded as a supplement to conventional media, a view that is being reinforced by radio, television, and print. Meanwhile, the old habit of regarding the Internet as a cesspool of anarchy and perversity dies hard. Earlier in the year, a report on hackers was aired on Hungarian television. Instead of presenting a comprehensive view into this sub-culture, with an additional follow-up into Hungary's unique hacker culture, the report turned out to be nothing more than a shoddy play on Eric Raymond's dichotomy of hackers and crackers (see "Homesteading the Noosphere"), the simplified conclusion being that one group (hackers) is benevolent (they are people who try to find weaknesses in systems) while the other (crackers) is nothing more than a malevolent bunch of people. Along these lines, a 3-5 person special group within the police will be established in the new year to deal with "illegal" activities on the Internet. According to media reports, the main purpose of this department is to scan Hungarian sites for pedophilia and bomb-making information which, according to authorities and the media, are the two most "dangerous" types of content to be had. However, as with all such seemingly noble efforts to protect the public from harm, the objectives are vague enough to be used as a means for silencing social discontent and political dissent. Despite these shortcomings, the future is not entirely hopeless; nor will it be entirely mundane. One thing to watch for is the possible rise of Linux in Hungary. The government had already squandered a chance when it had decided on Unix for the Sulinet program. Not that it mattered much, for Hungary still has a vibrant hacker underground. (Admittedly, the efforts of the Business Software Alliance have not gone unnoticed either, as many first-time users and administrators in public institutions take the threats of the software police seriously.) With the anticipated release of a Hungarian version of Star Office some time at the beginning of the new millennium, it remains to be seen how Linux will affect the digital landscape in Hungary. As Linux applications become more compatible with commercial (i.e. Microsoft) products, cash-strapped institutions and administrators may seize the opportunities offered by free software.
On the other hand, Microsoft's slick and subtle media campaigns over the past year (Bill Gates is regarded by many users in Hungary as one of the main forces behind the Internet) have done much to cement its level of support. At the same time, Linux's unfamiliar and relatively less user-friendly interface is an obstacle which still needs to be overcome. Other systems, meanwhile, such as BeOS or FreeBSD, are not only insignificant in number but also unavailable in the local language. For Hungary, the irony of the whole situation is that although the country boasts some of the best talent in the field of computer programming and mathematics, it's not reflected within the general population. Instead of bringing Hungary up to speed on the "infobahn", this level of talent has either added to the country's brain drain syndrome or has taken part in the construction of the multi-tier "information society" which has emerged. Only time will tell if this is a temporary enigma or will turn out to be a chronic handicap. ------------------------------------------------------------------- [6] A Loss for Netizens [Editor's note: The following e-mail message was posted to the Netizens mailing list at the end of Jan 2000. Kerry Miller contributed often to discussions and debates on the Internet and fought for the spread of the Net and its value.] Date: Sun, 30 Jan 2000 22:42:43 -0500 (EST) From: ronda@panix.com Subject: [netz] About a Loss for Netizens On Friday night, January 28, 2000, the Netizens mailing list administrator received a very sad message. The message asked him to take Kerry Miller's e-mail address off the mailing list because Kerry had died on January 18. After asking another mailing list administrator if he knew any further details, we were told that Kerry had indeed died on January 18, of a heart attack after shoveling snow. He was 75 years old. Kerry Miller has been an important contributor to the Netizens mailing list almost since it began. He posted regularly and encouraged others to post by commenting on their posts. I remember Kerry's first e-mail to me several years ago. I told him about the Netizens mailing list. He soon joined and participated actively and often. One time Kerry signed off the mailing list. I wrote him shortly afterwards asking how everything was and telling him about some of the new Internet problems that the Netizens mailing list was concerned with at the time. Kerry resubscribed and contributed again, helping to make it possible to have a Netizen challenge to that particular problem confronting the Internet. I didn't know anything about Kerry's life until after hearing he died. I then learned from the moderator of the other mailing list that though he had never met Kerry, he had hoped to meet him several times, and that Kerry had moved from Kansas in the U.S. to Canada to marry someone he had met on another mailing list. I will miss Kerry very much. The Netizens mailing list is the poorer for this loss. I hope others will share any thoughts they have about Kerry and that we will all make an effort to contribute a bit extra to make up for the fact that the Netizens mailing list and the Internet have lost one of their important contributors.
Ronda Reprinted from Netizens Association Discussion List Digest February 20, 2000, Volume 01 : Number 354 http://umcc.ais.org/~jrh/netizens/digest/Digest_1-354.txt ------------------------------------------------------------------- [7] A Moment of Silence for Michael Muuss [Editor's note: The following appeared on the IFWP and Netizens mailing lists.] Date: Wed, 22 Nov 2000 16:54:56 -0500 (EST) From: Joe Baptista Subject: [IFWP] a moment of silence for Mike Muuss - confirmation? a moment of silence as we honour a network great. I-95 Accident claims life Churchville, Md - (AP) A double accident Monday night on Interstate 95 in Harford County killed a Havre de Grace man. State police say 42-year-old Michael Muuss died when his car hit a vehicle left partially in the road after the first crash. Muuss' car then spun into the path of a tractor-trailer, which pushed him into a vehicle stopped on the right shoulder to help victims of the earlier crash. The truck driver was taken to Harford Memorial Hospital. The accidents occurred about 9:30 pm on the northbound side of the highway in Churchville. The first involved two cars and a tractor-trailer. A driver in that crash was treated at Harford Memorial and released. Police say it's not clear why either accident occurred. No one has been charged, but the investigation is continuing. Traffic was able to get by for most of the night, but it took until 2 am before all lanes were opened. > From: Sean Donelan Subject: The author of PING is reported dead > > Since many network operators consider PING as > one of their essential tools, I thought this would be > of interest to the list. > I haven't been able to confirm this, but I haven't been able to > reach Mike. > > Forwarded message: > > Subject: The Creator of Ping is dead... > > > > Mike Muuss, the author of the PING program used > > on networks everywhere, died last night in a traffic > > accident on U.S. route 95 in Maryland. He was an > > alumnus of Johns Hopkins (BS1978 or 1979 I think). > > Funeral arrangements have not been made yet, but > > I'll probably be going back to Maryland almost > > immediately to attend. > > http://ftp.arl.army.mil/~mike/ping.html ------------------------------- Date: Mon, 27 Nov 2000 15:20:45 -0500 (EST) From: ronda@panix.com Subject: Re: [netz] Fwd: A moment of silence for Mike Muuss It was with a real sense of loss that I read the notice that Joe Pistritto posted on a mailing list last Tuesday. > Subject: Re: IP: With great sadness: Mike Muuss has passed on > Cc: jcp@jcphome.com > > "Joseph C. Pistritto" wrote: > > Last night (Monday), Mike Muuss, famous for > creating the PING program as well as BRL-CAD, > died in a traffic accident at 11pm on U.S. highway 95 > near Aberdeen, Maryland. He was going home from > work at the time. Mike worked his entire career at > the Army Research Laboratories in Aberdeen > Maryland, and was a specialist in first networking, > then solid modeling. Many in the SIGGRAPH > community will know of him because of the > BRL-CAD package that he authored (with others > later) and his animation work which was shown at > several SIGGRAPH conferences. I wanted to add: It is indeed very sad to hear of this great loss to the networking community. There is another important contribution of Mike's to the development of the Internet. He created and moderated the ARPANET TCP/IP Digest which helped in making the cutover from NCP to TCP/IP on the ARPANET in January 1983. 
The TCP/IP Digest provided a forum in which to discuss the problems that those who were to do the cutover identified so they could be solved. The cutover set the basis for the creation of the Internet as a meta-network of diverse networks. After the cutover, the ARPANET was split into MILNET, an operational network for the DoD, and the ARPANET, a research network. These two different networks were able to communicate using TCP/IP. And that is some of the basis of the Internet as we know it today. A while ago, I wrote a paper, available online, about the role the TCP/IP Digest played in the cutover. The URL is http://umcc.ais.org/~ronda/new.papers/tcpdraft.txt In the research I have done about the early ARPANET mailing lists, Mike's role in contributing to the networking and UNIX communities stands out. His efforts helped to connect these two pioneering communities. He will indeed be missed. Ronda ronda@panix.com ------------------------------------------------------------------ [8] Culture Clash: The Google Purchase of the 1995-2001 Usenet Archive And the Online Community by Ronda Hauben ronda@panix.com

Google Takes Over Deja's Name and Usenet Archive

A Usenet user in Seattle, Washington was using the Usenet archive at Deja.com on February 12, 2001. He went out to get some coffee. When he returned, http://www.deja.com had changed to http://groups.google.com. This is symbolic of how the online community learned that Google, Inc. had purchased the Usenet archive from Deja.* A number of users expressed their dismay that the purchase resulted in Google taking the Usenet archive off line and substituting an archive of Usenet posts that Google had been collecting since August 2000. Google's beta version of a user interface for the archives was, many felt, quite inferior to what Deja had online. One of the noted problems was that the Google interface didn't have a means to view the discussions that a post was part of (known as discussion threads). Instead the posts were presented individually, in a manner similar to how one might present the results of a web search. An article published in The Register on Feb. 13, 2001 expresses the frustration of users with the fact that Google had not maintained access to Deja's user interface and online archive until they got their own software developed. Subsequent articles in The Register on Feb. 14, 2001 and Feb. 15, 2001 included comments by Google's CEO Larry Page about why Google had not maintained the Deja archives online. He promised some would be back online in a month and the rest in ninety days. Others in the Usenet community expressed their relief on hearing of the purchase of the 1995-2001 Usenet archive by Google. They felt Google had developed a good web search engine. This apparently gave them confidence that Google would be able to create a good user interface for a Usenet archive as well. They urged giving Google time to show what they would do.

Research Origins of Google Web Search Engine

A report at the National Science Foundation in 1999 explains that "the 'Google' search engine was developed by Hector Garcia-Molina's group at Stanford as an outgrowth of the Digital Libraries Initiative (DLI) project." The development of the Google search engine was carried out as part of a DLI research project at Stanford University in California. Several of those connected with this project are now working at Google either as technical advisors or as employees.
In a paper presented in 1998, Sergey Brin and Lawrence Page, at the time Stanford graduate students in the DLI project, describe the rationale for design decisions for the Google web search engine. Their paper The Anatomy of a Large-Scale Hypertextual Web Search Engine describes the recent commercialization of the Internet and the harmful effect this has had on the quality of web search engines (some of which were originally developed with NSF funding). "Up until now most search engine development," they write, "has gone on at companies with little publication of technical details. This causes search engine technology to remain largely a black art and to be advertising oriented....With Google, we have a strong goal to push more development and understanding into the academic realm." Later in the paper they describe another objective of their research. They write: Another goal we have is to set up a Spacelab-like environment where researchers or even students can propose and do interesting experiments on our large-scale web data. A design goal for Google, as a public research web search engine, was to provide a laboratory in which to pursue web search engine research. This 1998 paper also discusses how the proprietary activities of commercial enterprises do not facilitate the research and sharing needed to develop web search engine technology. The paper includes an acknowledgment of the funding of the Stanford Integrated Digital Library Project by the NSF, DARPA, NASA, and Interval Research, and the industrial partners of the Stanford Digital Libraries Project. What has happened to the goals expressed in this 1998 paper describing the design rationale for Google? Instead of a publicly developed search engine for research into web search engine design, the authors of the paper have formed a start-up company. They are now the President and the CEO of Google. Several of those in the Stanford University digital libraries research community are involved in the company. Stanford University is among the investors providing the funding for the company. Describing such developments in testimony before a House Appropriations Subcommittee, the director of the NSF, Dr. Rita Colwell, explains that the "transfer to the private sector of 'people' first supported by NSF at universities should be viewed as the ultimate success of technology transfer." She cites Google as the company which "is an excellent example of knowledge transfer from NSF investments in people." Formerly, U.S. law required that research done at government expense remain in the public domain. Has this requirement been changed? How is it that a publicly funded research project is the basis for a private corporate start-up venture by the researchers, their professors and their university? What is the effect on the nature of basic research funding when the fruits of its development are privatized by the researchers and their university along with the corporate partners to the venture? How does this transfer of researchers and their research from the academic sector to the private sector affect the goal of Google research to provide an open process to support research development of web search engines? Examining what has happened in the acquisition by Google of the Usenet archives and software from Deja will perhaps provide some insight.
Responding to a question about why Google bought a Usenet archive, Craig Silverstein, a former Stanford graduate student and now director of technology at Google, explained that the mission statement of the company is "to organize the world's information, making it universally accessible and useful." He describes how Google planned for a number of months to add Usenet data to its search engine databases and over the past 6 months this goal became more and more a topic of conversation at Google. According to Silverstein, Google started a conversation with Deja about the archives. However, after Deja sold off part of its company, the opportunity became available for Google to acquire the Usenet archive data rather than license it. No one at Google has revealed what Google paid Deja for the Usenet archive. Considering the goal of encouraging the sharing of research about web development that marked Google's early development, the decision to purchase a Usenet archive internally, rather than through any obvious discussion with the online community, suggests that the company's foray into the private sector has involved it in a black art similar to the one the founders observed as a problem of previous search engine development.

Online Petition to Deja about the Usenet Archives

Silverstein raised the question of why there was only one such Usenet archive. Also he said he wasn't aware of the online petition signed by more than 3850 users to urge Deja to maintain the Usenet archive or to transfer it to a reliable organization, preferably a public or nonprofit organization, if Deja could no longer maintain it. Many of those who signed the petition included comments with their names. This public online petition contrasts with the internal discussion and negotiations that Google carried out to acquire the archive from Deja. That those involved in the acquisition at Google were unaware of the concerns of the online community suggests there is a communication problem between Google and the online Usenet community. Whether there are other archives of Usenet posts during the 1995-2001 period is not at the moment known. Steve Bacher is one of those who signed the petition to Deja. Comments from users like him are included in the petition. These provide an understanding of why more people didn't archive Usenet during this period. Bacher describes how he used to maintain an archive of Usenet at his site but that he came to rely on the Deja archive and discontinued his own, telling those who had used his archive to use Deja. Another comment in the petition, by Ofer Even-Tour, notes that Alta Vista had an archive that was discontinued. Ofer writes: "I wish Alta Vista would bring their Usenet Archive back." Recognizing the problem of relying on one entity to archive Usenet, Paul Shaffer writes: "Who was sleeping when DejaNews became the choke point of Usenet history???" Reading the comments in the petition helps to provide an understanding of the importance to users of access to an archive of Usenet posts. Also several of those commenting propose the conditions they feel will be necessary to continue such access. Among those signing the petition is Theodor Holm Nelson, author of the book Computer Lib. He writes: "This archive is a public resource which has slipped into private hands. It must be kept available for the public benefit." In a similar tone, Kay Marquardt explains that "the content of the Usenet archive is public content."
Such concerns lead Lee Randolph to write: "where is the Andrew Carnegie who will endow 'free public search engines' for the new century?" Considering the problem of how to maintain such an archive responsibly, Calfin Ostrum writes, "If it had been known that you would remove forever access to the Usenet archives, some other more public-minded organization would have come into being to preserve them. Like it or not, you have implicitly assumed a responsibility to provide these archives and you are going back on it. If you don't want to continue to provide them, you should 'fess up' to it and then arrange to transfer them (for free) to whatever organization offers to take them. The Usenet archives are a major repository of a non-trivial part of contemporary culture." Others point out that since an archive provides a public benefit it needs government support and funding. Ray Normandeau writes, "Maybe Government grants should be requested for upkeep." Echoing this sentiment, Brian McNeil explains that the "USENET archive... should *never* have been in private /corporate hands... give it to an appropriate educational establishment." David McRitchie writes that if Deja could no longer continue the archive, it should be turned over to the U.S. Library of Congress as a working system. Robert L. Collins explains, "The Usenet power search power tool is invaluable to me. I use it more than any other link. If you can't find a way to make it financially viable, then perhaps you should spin it off as a non-profit and seek grants. It is a public good... government funding is appropriate." Describing the value of the archive, Kalle Valo comments: "Deja's news archive is essential part of Internet. Whenever there is a problem, news archive almost always has a solution. And even in many languages." Considering the future online community prompts Lee Coursey to write, "Future generations of Netizens will need this." Since feeds of Usenet posts are sent to news servers at participating sites, with new posts being added by users at the sites and older posts expired by sites, a Usenet archive can be compared to an ongoing accumulated global conversation. To determine how to archive such a conversation is a research problem that some in the Usenet community feel requires a community approach. A post on the website slashdot.org generated a heated discussion about whether it was desirable to have the code for the user interface to the Usenet archive released as open source. Also there was discussion about whether Google should make copies of the archive available to those who desired a copy. The slashdot.org discussion was a response to an article that appeared on the Wired website on Wednesday, February 21, proposing that Google provide a copy of the archive's data to be maintained as a distributed system on the computers of a number of different universities. The article also proposed that Google open source its user interface so those in the online community could explore how to improve it. This proposal echoed a proposal made in The Register on February 13, 2001. Andrew Orlowski wrote: But perhaps something as valuable as Usenet -- the words of ordinary Internet users -- is never going to be safe in private hands. Why not return it to its roots? The Library of Congress could administer the archive, and ensure it was a properly distributed system farmed out to the best Universities, who could produce ever more cunning hackish search tools? That's not as much fun as shooting lasers at rockets, of course, but a lot cheaper.
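To make concrete why archiving this ongoing conversation takes deliberate effort, the following minimal sketch illustrates the idea. It is not Deja's or Google's actual code; it assumes Python's standard nntplib module (present in older Python versions) and a hypothetical news server and newsgroup. Because each participating site keeps articles only until they expire, an archive has to fetch and store articles itself before they disappear.

# A minimal sketch, not Deja's or Google's software: pull recent articles
# from one newsgroup on a (hypothetical) NNTP server and store them locally,
# since news servers expire articles after a while.
import nntplib
import pathlib

def archive_group(server, group, dest, batch=100):
    """Fetch the most recent articles of one newsgroup and save each one
    to a file named after its article number."""
    out = pathlib.Path(dest)
    out.mkdir(parents=True, exist_ok=True)
    with nntplib.NNTP(server) as news:
        resp, count, first, last, name = news.group(group)
        start = max(first, last - batch + 1)
        for number in range(start, last + 1):
            try:
                resp, info = news.article(number)
            except nntplib.NNTPTemporaryError:
                continue  # the article has already expired or was cancelled
            # info.lines is a list of bytes objects, one per line of the article
            (out / ("%d.txt" % number)).write_bytes(b"\n".join(info.lines))

# Hypothetical usage:
# archive_group("news.example.com", "comp.misc", "archive/comp.misc")

A distributed archive of the kind proposed in the Wired and Register articles would, in principle, amount to many such collectors running at different universities, each holding a copy of the same expiring conversation.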
Users on Mailing Lists and Newsgroups Discuss the Problem

There has also been discussion on several mailing lists of what would be an appropriate way to maintain the Usenet archives. One such discussion took place on the Community Memory mailing list. Some on that list volunteered to try to find an appropriate academic or non-profit institution to maintain the archives. One such possibility proposed was the Metalab ibiblio.org project at the University of North Carolina in Chapel Hill, which was formerly known by the name sunsite. Sunsite was the name of the site, they explain, because they were originally funded by Sun Microsystems and still are, along with other corporate partners. But they wanted a vendor-neutral name to reflect the general nature of the information they archive. Another possible site proposed was the Computer Museum in California. A subscriber to the mailing list reported that he tried to contact Deja to inquire about the possibility of a copy of the archive going to the Computer Museum, but his inquiries did not get any response. The newsgroup alt.fan.dejanews provides a forum on Usenet for discussion of what is happening with the Usenet archive. Several users discussed the difference in culture between a corporation which has an obligation to view a Usenet archive as a way to earn revenue and the needs of Internet users for whom Usenet and the Internet are an important means of communication unrivaled elsewhere in the world. In a post, William S. Kossack describes his experience participating on Usenet and the implications of this experience toward understanding the nature of Usenet and the Internet. He writes: If all we did was read archives then the internet would die tomorrow. The internet is about communication. It's about the guy in the outback that knows something about the software your using that nobody else does. It's about the guy with a different native language that needs help on a research problem. It's about the guy down the street that needs help finding someone to really fix his car. It's about people and communication between people that don't know each other and will probably never meet each other. I've worked on problems where the best expert or at least the one willing to help lived in the outback. I've solved research problems where everyone working with me either lived in a non-English speaking country or at least in Chicago. I've gotten answers to car problems, camera problems, computer problems, health problems, and even met my wife via the net. The internet is not about archives it's about communication. It's about communication on a scale not possible by any other means. Kossack's post poignantly characterizes the nature of the discussion and human-to-human computer-facilitated interactions which are possible because of Usenet and the Internet.

Will this Culture Clash Affect Usenet?

What will be the effect of putting a Usenet archive again under the constraints of the income-producing requirements of a corporation? Will this affect the precious human-to-human communication that Usenet and the Internet make possible? If Google is willing to provide copies of the archive to university sites or other noncommercial institutions, would this be helpful in making it possible to establish a form of user interface and archive access that supports the continued growth and spread of such human communication?
While there has been broad-ranging discussion in the online community about what should happen if Deja could not maintain a Usenet archive, and much sentiment toward having the archive given a home with an academic or noncommercial institution, a decision to buy the Usenet archive was made internally at Google without any obvious input from the Usenet community. The lack of communication between the online community and Google on the considerations that are important to take into account in determining the future for a Usenet archive is an example of the culture clash that Google's purchase of this Usenet archive suggests. Another aspect of this culture clash between the online community and Google relates to any claim of Google to own the content of a Usenet archive. The postings on Usenet are different from much of the content of the web. While Google is indexing and providing means of searching the web, it does not claim to own the web pages or information it is indexing. With regard to a Usenet archive, however, the offer to license or purchase Usenet posts for a fee or to claim rights to ownership of the posts is contrary to the understanding of users and their intention with regard to their Usenet posts. In general, those who post on Usenet consider their posts to be contributed to facilitate communication in the online community. Any company's claim that it has a right to buy or sell a compilation of Usenet posts presents a serious challenge to this understanding which has made it possible for Usenet to function over the years. In his article Net Cultural Assumptions, first posted on Usenet in 1992, Gregory Woodbury stresses that people who post on Usenet are doing so recognizing that "folks on different machines *desire* to share information in an easy and timely manner, despite the spatial separation between them and the machines they are using. That is the persons using the Net to communicate *want to communicate* and are willing to cooperate in effecting that communication." That is the unwritten agreement. How Woodbury would feel about a company putting a copyright on those communications and calling them their property to be bought and sold is not the subject of his article. But what effect will it have on Usenet when posts the online community has contributed for the purpose of communication are claimed as the property of commercial entities? As Woodbury argues, those posting on Usenet in general consider that their posts are contributed to facilitate communication among Usenet users. Any company declaring that it has the right to the ownership of these posts, or to buy or sell a compilation of such posts, presents a serious problem for Usenet users and for Usenet's continued development. Such actions can have a chilling effect on those who make the contributions. In general, posts are covered by the Berne Convention, agreed to by many countries and joined by the U.S. on March 1, 1989, which protects the right of the creators of the posts to their copyright. The Berne Convention provides that once a work or idea is fixed in a tangible form, the creator holds the copyright to that form. No copyright symbol or other notice is required for this copyright status. Users do not need this protection when they are contributing to communicate. Nevertheless, this copyright is a protection against any other entity gathering their posts and claiming ownership or the right to financially benefit from the copyrighted work of others, without the explicit permission of the contributors.
Whether Google paid money for the Usenet archives is not known, since they have not made the details of the transfer from Deja to them public. However, a spokesperson for Google has said that the company will consider the request to make a copy of the archive available to a nonprofit or public entity and that proposals can be sent to bizdev@google.com. Tom Truscott, one of the co-originators of Usenet, provides a somewhat different perspective on the challenge the transfer of the Usenet archive presents to the community. He points out that those at Deja who developed the archives and the code for the user interface spent a long time thinking and working on them, and for most it must have been a labor of love. He suggests that creating a new user interface or search software for the archives will involve technical decisions which require an understanding of Usenet and its nature. For example, he writes:
1) citing a Usenet article -- When I reference a Usenet article, I use the magic URL that Deja supplies for it. I have found them to be valid indefinitely. At least, until about a week ago. Will Google continue to supply permanent URLs? I sure hope so.
2) Ranking Usenet articles -- I haven't tried the new Google/deja search yet, but I've heard it doesn't track "threads" any more. Technically, this is quite important, as Steve Bellovin pointed out in http://www.theregister.co.uk/content/6/16888.html A thread represents an interactive discussion, and so presenting the thread together and in order is good. But there is another way that Usenet searches can exploit threads. Usenet articles are more transitory than web pages. But "followups" to articles which create Subject threads, permit a limited variant of PageRank [Google's ranking scheme for web pages-ed]
Describing the differences between web technology and Usenet technology that are relevant toward how one will do a search, he writes:
3) Searching Usenet articles -- When doing a text search, google considers matches in the web page <title> to be more important than elsewhere, and text in a large font is more important than text in a smaller font. A Usenet article does not have a <title>, but it does have a Subject: field. Usenet articles often contain "included text" which should be considered less important than original text.
He summarizes, "So, there are significant differences in the ways that pages/articles should be cited, ranked, and searched," and he asks, "Does Google plan any improvements, for Usenet articles, in any of these areas?" Truscott's comments are helpful in conveying how the level of understanding of Usenet will impact the design decisions that Google or anyone else who designs software for a Usenet archive makes. There is another question, however, raised by the transfer to Google of the Usenet archive. This is the question of how important it is to maintain and develop the collaborative online community. How important is it to encourage cooperative contributions to a common pool of technical knowledge, software code, tools and other social forms that the new online community has developed? J. C. R. Licklider is recognized as the visionary who inspired the development of the worldwide network of networks. In articles he wrote in the 1960s and after, he explains why it is crucial to foster a collaborative online environment and contributions by users to a common pool of technical knowledge.
The research on time-sharing that Licklider supported when he first went to ARPA in 1962 set the foundation for such a cooperative community. The early collaboration between the different Centers of Excellence that Licklider set up at universities in the U.S. was the basis for the research to create the ARPANET. The creation of the ARPANET continued the development of this cooperative community. The ARPANET mailing lists begun in the 1970s supported the cooperative communication that continued to develop. Usenet grew up in the early 1980s by building on the experience gained by those who had participated in the ARPANET mailing lists and by linking up with the ARPANET mailing list community. Together Usenet and ARPANET technical pioneers formed a vibrant online cooperative community and created a common pool of technical knowledge. They have given the world contributions as varied as the Requests for Comment (RFCs) and Unix tools. Even more important perhaps has been the ability of the online community to work together to solve the difficult problems of scaling computer technology and computer networking. Usenet and the Internet are crucial supports in making it possible for researchers to collaborate to understand and then solve the problems these developments present. The problem that the online community is faced with is how to continue its collaborative communication and contributions. Do they need some broader support from academic institutions and governments toward this end? Isn't it a loss if research objectives are ended and the resources used to develop commercial enterprises, as happened with the 1998 design objectives for the Google web search engine? Isn't there a need to find a way to support and encourage the integrity of the research community so that they can resist efforts to turn them and their endeavors into products for investor speculation? Those who are technical employees of private corporations will especially need a vibrant online collaborative community to help them overcome the difficulties that functioning in a proprietary environment brings. Vibrant and functioning Usenet newsgroups and Internet mailing lists can help with these challenges. But what will it mean to the online community if these essential communication processes are curtailed or declared the private property of someone? This is one of the challenges now facing the online community. This is one of the questions raised by the sale by Deja of the contributed posts of the Usenet community, and one of the questions raised by Google's buying these posts and suggesting that they have a property right to own them and to trade them. How this dilemma will be resolved will be determined by how seriously the online community treats it. The petition to Deja and the various discussions both on Usenet and on mailing lists suggest that there are those in the Usenet community who recognize the importance of the situation. -------- This article originally appeared in Telepolis: http://www.heise.de/tp/english/inhalt/te/7013/1.html It is reprinted with permission. *The company DejaNews had collected the posts on Usenet from 1995 and had a search engine to search for them at their web site. Several months ago DejaNews changed its name to Deja.com and limited access to the Usenet archives it had collected to posts from the last year. ---------------------------------------------------------------------- [9] John Locke and the Privatization of the Internet by Jay Hauben jrh29@ais.org For historical reasons, the U.S.
government has overseen the technical development of the Internet from its beginning in the early 1980s. At least since 1997, there has been an effort by the executive branch of the U.S. federal government to privatize the essential, central functions of the Internet.(1) I want to investigate the Internet and its proposed privatization. I will be guided by the analysis of John Locke (1632-1704). Locke analyzes the questions of property and privatization in Chapter 5, "Of Property" of his The Second Treatise of Government (c.1680-1683).(2) Locke considers the original state of human society before the creation of political institutions. He assumes that the earth and its resources were available "to mankind in common". He wants to show "how men might come to have a property in several parts of that which God gave to mankind in common."(sec 25) This raises for me the question: what is the original state of the Internet, and what might privatization mean for and do to the Internet? By privatization of the Internet I mean the ending of the governmental fostering and oversight that have been in place from the beginning of the Internet(3) and replacing them with control of the crucial functions by non-governmental, private entities. The Internet is still young, less than 30 years old. There are disputes over what the Internet actually is and over its history and potential impact. It is considered by some to be wires and routers and perhaps protocols and the domain name system. For others it is the interconnection of all computers using a particular set of communication agreements called the Transmission Control Protocol/Internet Protocol suite (TCP/IP). It is important for my investigation that I state my understanding of what the Internet is. The Internet from its beginning and still today is an interconnection of diverse, independent, packet switching networks. A computer network is an interconnection of computers. The Internet is a meta-network, something in addition or "above" its component networks. There is a significant difference between interconnecting computers, no matter how different they are, and interconnecting networks. Computers may differ in operating systems, character sets, hardware, etc. These are all technical aspects. The networks that interconnect to make up the Internet each have their own purposes, system administrations and architectural principles. They have been set up by different political and economic administrations to serve different functions. The Internet was designed to solve the problem of sharing resources and communicating among such diverse networks while respecting their differences. Fundamental to the Internet is its principle of open architecture which requires respect for the autonomy, purpose and local sovereignty of the networks that become part of the Internet. The technology of the Internet is based on the successful development of computer time-sharing and packet switching technologies. The impetus behind the development of time-sharing and then packet switching was greater accessibility of computing. There was a fear among some scientists and engineers that the power of computer-aided decision making would otherwise be concentrated in too few hands.(4) There was also the expectation that a great benefit would arise when computer resources and user-created content would be sharable by a large community of users.
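As a small illustration of the packet switching idea just mentioned, the toy sketch below (in Python, and not any actual network implementation) cuts a message into individually numbered packets, delivers them out of order, and reassembles them at the destination. Real networks use IP datagrams, routers and the TCP/IP protocols; the sketch only shows the principle that a message can be broken into pieces, sent independently, and recovered intact.

# A toy illustration of packet switching, simplified for this article.
import random
from dataclasses import dataclass

@dataclass
class Packet:
    seq: int        # sequence number used to reassemble the message
    total: int      # how many packets make up the whole message
    payload: str    # this packet's slice of the message

def packetize(message, size=8):
    # Cut the message into fixed-size chunks, each wrapped in a Packet.
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [Packet(seq=i, total=len(chunks), payload=c) for i, c in enumerate(chunks)]

def reassemble(packets):
    # Packets may arrive in any order; the sequence numbers restore it.
    return "".join(p.payload for p in sorted(packets, key=lambda p: p.seq))

message = "Resource sharing among diverse, independent networks."
packets = packetize(message)
random.shuffle(packets)            # simulate out-of-order delivery
assert reassemble(packets) == message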
The guidance and investment that made the original networking and internetworking research possible and led to time-sharing, packet switching, protocol development, the original hardware and software, and the leasing of long telephone lines were all public. It was leadership from scientists under government contract, and public money -- mostly U.S., but also British, French, Norwegian and NATO public money, some supplied via military budgets. The original vision guiding these developments was to unite communities of human beings via computer communications into an Intergalactic Network, as J.C.R. Licklider called it, a vast human-computer symbiosis.(5) The developments that have made the Internet possible were achieved by an international collaboration among scientists fostered by a public administration of research projects encouraging openness and resource sharing. Just as Locke sees the original state of people as sharing the resources of the earth in common, I take this public funding, original social purpose, and cooperative origins to have created an electronic, public commons.(6) For Locke, things in common are only valuable if they can be used for "the support and comfort" of people. And all such things should be available for the needs of all people. But does not the use of some of those things to satisfy the needs of one individual require the making of common things into private things (i.e., privatization)? Locke resolves this apparent difficulty by arguing that in the original conditions of human society the things of the earth were plentiful so that "he that leaves as much as another can make use of, does as good as take nothing."(Sec 33) Personal use of things in this stage still leaves the whole as a commons. "Nobody could think himself injured by the drinking of another man, though he took a good draught, who had a whole river of the same water left him to quench his thirst."(sec 32) Such use leaves the commons intact. Use of the Internet does not require a taking from it either. There is no less of the Internet after I use it than before. That is because digital resources are not diminished when they are copied. And the technology of the Internet is such that no simple use of the Internet adds any but an insignificant cost to anyone. Each network provides its resources for its own users but at the same time those resources become available to the whole Internet at no necessary new cost. There are enough resources on the Internet, and many people who use it, far from taking, are actually contributing something by their use, like posting an opinion or answering a question. For the moment the only result of many other people using it at the same time as you is that you may experience a slightly longer delay than if the others were not using it. But such delay is a result of the early stage of technological development. There is every reason to believe that significantly greater capacity and interconnection are technically possible. So mere use, or increasing the universality of access to the Internet, does not diminish its common or public nature or utility to those already using it. Also, the Internet has the character of a common good. For Locke, in the original state of human society the resources of the earth that are needed for food, clothing or shelter are common goods. Already to some extent but potentially to a much greater degree, the well-being of each person will also depend upon the quality of his or her access to the Internet.
What Licklider and Robert Taylor wrote in 1968 is closer to reality now: ... Life will be happier for the on-line individual because the people with whom one interacts most strongly will be selected more by commonality of interests and goals than by accidents of proximity. ...For the society, the impact will be good or bad depending mainly on the question: Will 'to be on line' be a privilege or a right? If only a favored segment of the population gets a chance to enjoy the advantage of 'intelligence amplification,' the network may exaggerate the discontinuity in the spectrum of intellectual opportunity.(7) Licklider and Taylor are arguing that to be online will be essential to a full life. If something like the Internet is crucial to human "support or comfort" it is a common good and in the view of Locke all people have a common right to be included in its use. As long as people gather from the commons what they need for their own use, Locke says they have a right to what they gather. "Whatever is beyond this, is more than his share, and belongs to others."(sec 30) He argues that small populations and gathering for use gave rise to "little room for quarrels or contentions about property so established."(sec 31) Locke considers property not in the modern sense of private property and ownership but in the sense of having the right of use. Even improvement of a part of the commons through the application of the labor of an industrious person leaves enough for others. Therefore, as long as his product is for the use of his family and not allowed to spoil, the industrious person has a right to his cultivated area. Likewise, the component networks of the Internet can be developed and improved as much as their local owners and administrators want based on whatever principles they choose without encroaching on the commons of the Internet. What constitutes the commons of the Internet does not get used up or diminished by such improvement but in most cases gets augmented by it. The original goal of packet switching networks was resource sharing. The Internet carries resource sharing beyond the individual network to the whole community of Internet users. The increase in resources on any one network is an increase in general of the resources for all users. The Internet has been designed and developed based on the principle of open architecture. That means the Internet makes the most minimal requirements possible on the networks that it interconnects. What is required, i.e., what is in common, is the agreed upon protocols (the TCP/IP protocol suite) that allow for the sharing of resources without intruding on the local sovereignty of the component networks. The Internet protocol (IP) creates a pool of unique numerical addresses, currently 4.3 billion. Each network administration which adopts the TCP/IP protocol suite and arranges for Internet connectivity needs to receive a range of these unique addresses for use by computers within its network. As long as a mechanism is in place that insures equitable distribution of these numerical addresses the Internet can continue to function and grow. The TCP/IP protocol suite and these IP numerical addresses are the main technical aspects of the Internet commons. But what makes the Internet attractive and crucial in the lives of people are the other people. The Internet makes more people than ever before in essence shared resources for each other. And these people make available or point to still other resources available via the Internet or off line. 
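As an illustrative aside (not part of the original article), the following short Python sketch makes the scale of this common pool concrete: the IPv4 address space described above contains 2**32, roughly 4.3 billion, unique addresses, and each participating network receives its own range of them. The 192.0.2.0/24 block used here is a hypothetical example allocation drawn from the reserved documentation range.

    # A minimal sketch of the shared IPv4 number space discussed above:
    # one common pool of 2**32 unique addresses, from which each member
    # network is assigned its own range for the computers it administers.
    # (Illustrative only; 192.0.2.0/24 is a reserved documentation prefix,
    # standing in for a hypothetical member network's allocation.)
    import ipaddress

    pool = ipaddress.ip_network("0.0.0.0/0")      # the entire IPv4 address space
    print(pool.num_addresses)                     # 4294967296, i.e. about 4.3 billion

    block = ipaddress.ip_network("192.0.2.0/24")  # one network's assigned range
    print(block.num_addresses)                    # 256 addresses for its hosts

    host = ipaddress.ip_address("192.0.2.17")     # a computer inside that network
    print(host in block)                          # True: the address falls within its range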
This communication and the transfer of files and information never leaves anything less for other users. The protocols and numbers and what they require are what the Information Processing Techniques Office (IPTO 1962-1986) of the U.S. Advanced Research Projects Agency (ARPA) originally fostered and funded. These protocols and numbers along with all the people and other shared resources are the commons of the Internet. There are also parameters known as port numbers that are common to users of the Internet. And there is at present a domain name system that is a common way to map names to the numerical IP addresses. From the beginning, the U.S. government has overseen the distribution of the pool of IP numbers and the protocol development process. These along with the oversight of the Domain Name System (DNS) and its central root server system are precisely what the executive branch of the U.S. government is trying to privatize by creating the Internet Corporation for Assigned Names and Numbers (ICANN, currently a so called non-profit, private corporation registered in the U.S. state of California). Locke has taken us as far as to see that, in his analysis, all people at the origins of human society had a right not to be excluded from the things of life. The use by one person of the things in common did not exclude another's use of as many things as he or she needed. The right of use was not a right of abuse or alienation. When one's labor improved something, that for Locke gave the laborer the right to a property in what was improved, as long as there remained in the commons other resources for other people's use. That is, the person who added his labor to something had a right to exclude others from taking or using it. For Locke the question of privatization is the question of the right to exclude. But privatization is not unlimited. It is only permissible if there remains enough of the things of life that no one is without what is needed. At a first glance it is not clear that the privatization of the Internet's essential or common functions is also a question of exclusion and loss to some people. But putting these functions in private hands means that the oversight and administration of the IP number distribution and protocol development process can no longer be at public expense. So who will pay for these and the other added costs that come with the payment of, e.g., a private board of directors? The costs in private situations always get passed onto the end users. Thus use of the privatized Internet will inevitably have a higher economic barrier to scale than if the commons of the Internet remained publicly overseen and administered. Also, by its very nature, private control of the development of the Internet will be focused on different objectives than has been the public direction. That can be seen already where the private sector sees e-commerce as more attractive as a goal than universal access to a global communication system. Both higher cost and different purpose will tend to exclude many people from Internet usage or so change the content that the universal value of the Internet will be lost. Locke argues that, "there is land enough in the world to suffice double the inhabitants, had not the invention of money, and the tacit agreement of men to put a value on it, introduced (by consent) larger possessions and a right to them."(sec 36) After the introduction of money there was eventually no longer the same plenty for all, and then exclusion led to quarrels and contentions. 
The gathering into cities and the introduction of money made efforts to exclude more attractive to some and required a response by society. That response was the introduction of political institutions or civil society to replace the natural state of society. "The several communities settled the bounds of their distinct territories, and, by laws, within themselves, regulated the properties of the private men of their society, and so, by compact and agreement, settled the property which labour and industry began."(sec 45) What is the fate of the commons in political society? Locke points out that "in England or any other country, where there are plenty of people under government who have money and commerce, no one can enclose or appropriate any part [of the commons] without the consent of all his fellow-commoners; because this is left common by compact, i.e., by the law of the land, which is not to be violated." The consent of all the commoners is necessary because "after such enclosure, [what is left] would not be as good to the rest of the commoners as the whole was, when they could all make use of the whole."(sec 35) And political society and laws are necessary so as to enforce the receipt of that consent before, not after, the proposed enclosure that will deny many some previous benefit. Of course the Internet has developed in an era long after political society took deep root, and especially at a time when money and commercial considerations play a dominant role. That is why the need for a public role in the Internet is so great. As with the English Commons, if the general public interest is not protected, the particular private interests will significantly diminish what is available to the rest of society. The tension between common purpose and private exclusion exists and will cause contentions and quarrels, which require procedures in accordance with law for their resolution. By Locke's reasoning, for the common aspects of the Internet to be privatized, the consent of all the Internet commoners is necessary. But who are the Internet commoners? It would seem that the Internet users should be considered the Internet commoners. The development and expansion of the Internet to all parts of the world originally had been for public use and not for private profit. The Internet is composed of diverse, independent and sovereign networks, each embedded in a political society determined by geographic location. The question of whose consent is needed to make decisions like whether or not to privatize the essential functions of the Internet needs to be studied, debated and acknowledged as important. But at a minimum, the system administrators, political representatives and the users themselves seem to require central roles in such decisions. And the Internet itself seems to be giving rise to a new political and social phenomenon, the netizens, those Internet users who take responsibility to spread and safeguard the development of the Internet. Enclosure of the commons, according to Locke, begins the process that leads to the need to establish national borders. How does this apply to the proposed privatization of the Internet? Locke reminds us that questions that affect people across national borders are solved by "leagues that have been made between several states and kingdoms, either expressly or tacitly disowning all claim and right to the land in the other's possession, ...
and so have, by positive agreement, settled a property amongst themselves, in distinct parts of the world."(sec 45) The Internet is a commons that reaches across national boundaries. Its spread, too, was facilitated by agreements, but these took the form of Acceptable Use Policies (AUPs). For example, the U.S. National Science Foundation prohibited commercial use of the NSFNET and only allowed interconnection with other networks that had a similar policy. It seems likely, following the analysis of Locke, that the continuation of the growth (scaling) of the Internet will require agreements or treaties among leagues of the nations involved. Such agreements are the opposite of privatization. They are in a sense a public internationalization in recognition of the global reach and importance of the Internet. When Locke analyzes why privatization is a problem for society, he envisions an isolated island where there is nothing "fit to supply the place of money..." He asks "what reason could anyone [there] have to enlarge his possessions beyond the use of his family ...?"(sec 48) Unless there were hopes of commerce with other parts of the world to draw money to the enclosure by the sale of products, it would not be worth the enclosing. The Internet is a wonderful global electronic commons. For its own sake, it does not appear that anyone or any organization would want it as a private possession. So the questions need to be raised: To whom or to what will the benefit of privatizing the Internet accrue? And what role are such forces playing in bringing about, for example, the creation of the Internet Corporation for Assigned Names and Numbers (ICANN)? This private corporation, under a board chosen in secret, has been working since November 1998 to take over the functions of the Internet still under the oversight and management of the U.S. government or its contractors. In over two years of ICANN activity, the secret of how this board was chosen, by whom and for what purpose, is still impenetrable. Secrecy is a clue that these are important questions. The other clue is that the supporters of privatization have never been willing to discuss or debate the question of why or how privatization might serve the general welfare. They say commercialization will spread the Internet, but they are not willing to allow debate over this question in the media they control. Guided by Locke's theory of property, my conclusion about the privatization of the essential functions of the Internet now being attempted is that the commons of the Internet should be protected, not privatized. Locke suggests that this protection is the responsibility and obligation of governments in league with each other. He writes, "For in government the laws regulate the right of property, and the protection of the land is determined by positive constitution."(sec 50) The history so far of the Internet suggests that acceptable use policies and voluntary gatherings of network administrators with online forums might also play a crucial role. However, at present it is especially the U.S. government that has the obligation and responsibility to protect the Internet commons, since the central functions of the Internet are for historical reasons under its supervision. The U.S. Constitution has in the Preamble six purposes for which the U.S.
government is established: 1) to form a more perfect union, 2) to establish justice, 3) to insure domestic tranquility, 4) to provide for the common defense, 5) to promote the general welfare, and 6) to secure the blessings of liberty to ourselves and our posterity. The development of the Internet falls under at least 1 and 5. The privatization does not seem to fit any of these six purposes allowed to the U.S. government by its own Constitution. The fundamental purpose of the U.S. government would appear to be to promote the general welfare. Locke agrees that all that governments do must be "only for the public good."(sec 3) My analysis indicates that the privatization of a commons is in general not for the public good. It follows that the efforts of the executive branch of the U.S. government to privatize the essential functions of the Internet are inappropriate. The U.S. General Accounting Office has also warned that if privatizing the crucial functions of the Internet involved the transfer of any public property, it would be contrary to the U.S. Constitution. For Locke, the legislative is the supreme power (sec 132), not the executive. In the U.S. there is a law, the Government Corporation Control Act of 1945, which prohibits the transfer of government functions to corporate entities like ICANN without specific authorizing legislation. Presently there is no such legislation, and there have been questions from the U.S. Congress concerning the privatization and the lack of appropriate authorization to do it. Locke points out that should ICANN get "into the exercise of any part of the power, by other ways, than what the laws of the community have prescribed, [it would have] ... no right to be obeyed"(sec 198). That is because it would not then be the body the laws have appointed, and consequently not the body the people have consented to. In such a case, even if some network administrations obey ICANN, there will be others that, for local reasons, will not. Then fragmentation of the Internet is a likely result. Private property in the analysis of Locke is not a natural right but a conventional right based on civil law. A commons necessary for the well-being of a people needs to be protected from becoming private property. Locke has an answer if the U.S. Congress and executive branch and the other governments of the world allow the privatization of the Internet commons: "Whenever the legislators endeavor to take away, and destroy the property of the people, ... they put themselves into a state of war with the people, who are thereupon absolved from any further obedience."(sec 222) So at least in Locke's analysis, the result of privatizing the Internet, on which people's lives are coming more and more to depend, will be greater instability in society. We may not have come that far yet, but my reading of Locke puts the privatization of the Internet commons in line with the other violations of the public purpose of government that more and more characterize our time today at the beginning of the 21st Century.
------------
Notes
1. See, for example, "Management Of Internet Names and Addresses" (63 Fed. Reg. 31,741-42, 1998), the White Paper issued by the U.S. Department of Commerce, June 5, 1998. To achieve this privatization the U.S. Executive Branch set up in the fall of 1998 a private corporation, the Internet Corporation for Assigned Names and Numbers (ICANN). But as of 2001 the central functions of the Internet are still under the oversight of the U.S. Department of Commerce.
ICANN is acting, in a way, as IANA did before it, as a government contractor.
2. My quotes from Locke are from the Everyman edition of Two Treatises of Government, edited by Mark Goldie, reprinted in 1998. I indicate after each quote the section in "The Second Treatise of Government: An Essay Concerning the True Origin, Extent, and End of Civil Government" from which it is taken. My analysis has benefitted from a reading of A Discourse On Property: John Locke and His Adversaries by James Tully, Cambridge University Press, Cambridge, 1980. I take from this reading the understanding that Locke used the unqualified word 'property' to mean the right of use, not ownership in the modern sense of property. Then the things of the commons can be the property of someone in the sense that he or she has the right to use them to the exclusion of other people's use, as long as there are still in the commons resources to meet the needs of the other people.
3. The role of the U.S. government in the chain of events that led to the Internet started with J. C. R. Licklider's vision and his creation of the IPTO in 1962. Some of Licklider's vision can be understood from the papers noted below in notes 5 and 7. See also Ronda Hauben's work on IPTO, the Advanced Research Projects Agency office created by Licklider (work in progress).
4. For the documentation of these concerns among the scientific and engineering community, see Computers and the World of the Future, edited by Martin Greenberger, MIT Press, Cambridge, Ma., 1962. This book of lectures and discussions from MIT's Centennial Celebration in 1961 contains the keynote address by C.P. Snow and contributions from many of those who went on to play significant roles in the development of time-sharing, packet switching and networking. For an analysis of the impetus for these developments, see Chapter 6, "Cybernetics, Time-sharing, Human-Computer Symbiosis and Online Communities: Creating a Supercommunity of Online Communities," in Netizens: On the History and Impact of Usenet and the Internet by Michael Hauben and Ronda Hauben, Los Alamitos, Ca.: IEEE Computer Society Press, May 1997. Also this theme was explored in "The Internet: History, Technical Principles, Social Impact," a Horizons mini-course, Columbia University, Spring 1999 and Spring 2000.
5. See "Man-Computer Symbiosis," in IRE Transactions on Human Factors in Electronics, HFE-1, March 1960, pages 4 to 11. Also reprinted in In Memoriam: J. C. R. Licklider 1915-1990, Aug. 7, 1990, p. 40, Digital Research Center.
6. See Michael Hauben, "Preface," in Netizens: On the History and Impact of Usenet and the Internet by Michael Hauben and Ronda Hauben, Los Alamitos, Ca.: IEEE Computer Society Press, May 1997.
7. In Memoriam: J. C. R. Licklider 1915-1990, Aug. 7, 1990, p. 40, reprinted by Digital Research Center; originally published as "The Computer as a Communication Device," in Science and Technology, April 1968.
----------------------------------------------------------------------
[10] MsgGroup
Part V
Questioning What Should Be Discussed
by Ronda Hauben
ronda@panix.com
[Editor's Note: The following is the last installment of this article. The whole article can be accessed at: http://www.ais.org/~ronda/new.papers/msghist.txt ]
Not surprisingly, there were managers at Xerox who were not happy about the kind of frank discussion ongoing on the ARPANET mailing lists.
A post by David Liddle, Vice President of the Office Products Division at Xerox, explained his reluctance to have Xerox products discussed by Xerox employees on the ARPANET (63): Many of you in Xerox are aware of a newly created ARPANET distribution list named Apollo. It was established to promote discussion of personal workstation computers. As you might expect, much of the recent discussion has involved the Xerox 8010 Star information system. Because many of the messages ask for information about this product and its associated development software, you may feel tempted to reply to some of them. It is ARPA policy that the ARPANET be used only for government supported research and development. It is against Xerox policy to use the ARPANET to discuss products.... Xerox employees use the ARPANET for ARPA related research purposes only, not for answering questions or distributing information about our products. Questions from potential customers about the Xerox 8010 and other OPD products should be referred to Arnold Palmer, Field Sales Manager, Xerox Corporation, 1341 West Mockingbird Lane, Dallas, Texas 75247, phone (214) 689-6689.
David E. Liddle
Vice President
Office Products Division
A response to Liddle's post challenged the reasons he had given for limiting discussion. Lars Ericson at CMU wrote (64): The use of the ARPANET for informal discussion of computer science-related issues is a primary win. It is clear that such discussion is beneficial to ongoing government research projects; DARCOM and Office Automation, for example, are well represented on the Work Station. Ericson continued: Mr Liddle also seems to forget that the reason PARC efforts are so immensely saleable these days is precisely BECAUSE of their participation and openness (as opposed to IBM, say) in the ARPA/university research community, and not in spite of it. "Mr. Liddle's Xerox policy announcement," Ericson wrote, "represents the sort of irrelevant (to ARPANET interests) administrative miserlyness that we may come to expect from Xerox now that the 13-piece suits have brought PARC to market." Also responding to Liddle's post, Joe Newcomer emphasized ARPA's policy forbidding commercial use of the ARPANET (65). Joining the controversy, Crocker explained (66): It is my understanding that the purpose of this discussion is to consider the technical aspects of personal work-stations. Arpa and the rest of the military are investing quite a bit of money in this area, so that this discussion would seem to be extremely appropriate to the ARPANET mission. He added: I do not believe that conformance with the ARPANET proscriptions necessarily requires commercial participants to be prohibited from voicing opinions about the technology in general or from answering specific questions about their product. Touting their product is another matter. Crocker's proposal was: "I suggest that each company assign one technical (not marketing) person to respond to queries. This will permit direct information, while making 'tone-control' easier."
Part VI
Limited Distribution?
Not only was there reluctance on the part of representatives of some commercial entities to have open conversation of all issues on ARPANET mailing lists ported to Usenet, but there was also a sense among ARPANET participants that their contributions should be considered privileged private publications and their distribution strictly limited. A conversation describing this issue developed on FA.digest-p, carried on the ARPANET and on Usenet.
In January 1982, a post noted that Computer World magazine had gotten copies of the TCP digest from someone and published verbatim quotes from the digest (67). Though the source of the leak acknowledged what had been done and agreed to stop, "it gave everybody a real scare," the post noted. "My temporary solution to this issue," the poster proposed, "is to add the following notice to the Masthead:
TCP/IP Digest            Thursday, 8 Oct 1981            Volume 1: Issue 1
------------------------------------------------------
LIMITED DISTRIBUTION
For Research Use Only --- Not for Public Distribution
------------------------------------------------------
At least this ensures that anybody who gets fed a copy knows that it is not supposed to be shouted to the treetops. Comments?" Christopher C. Stacy at MIT disagreed with such a publication identifier. He wrote (68): I think that the explicit banner on the masthead of the Digest is a bad idea, because this will cause many people to think that if such a banner is NOT present (i.e., on any other Digests or on future TCP Digests) that it is alright to redistribute the material. In another post, Stacy described his understanding of why ARPANET mailing lists had to have limited distribution (69). He pointed to an incident that had occurred when MIT had to fight for its continued existence on the ARPANET after an article in the journal Datamation about the WINE-TASTERS mailing list appeared. He also cautioned about the possible liability problems when evaluating and discussing various commercial products, as with the INFO-TERMS mailing list, which evaluated terminals. "But laying down the law," he wrote, "is a fairly useless way of solving this sort of problem. The problem is one of awareness, cooperation and trust. Only if people understand and care, will they take steps to protect a fragile institution like the ARPANET." Another post noted that the mailing list digests "do not exist as authorized publications" (70). Its author felt that they should be considered "internal communications between research project members authorized to use the net." A post by Mike Muuss asking about the implications of the Daniel Ellsberg case for this issue was answered by Paul Karger. Karger wrote (71): While putting a restricted distribution statement on a digest may be a psychological limitation on distribution, there are a couple of problems. First, since ARPA and DCA are part of the DoD, there are specific regulations on what may or may not be marked as FOR OFFICIAL USE ONLY. The regulations are in part designed to not let people invent other kinds of markings. This dates back to the Ellsberg case and the desire to limit the ability of government people to conceal information from the "public" (whoever that is). Though Karger said his familiarity with the regulations was a little stale, "I would be very careful about developing new ways to restrict distribution of government information," he cautioned. Through this discussion, concerns for limiting the ARPANET discussions were raised and answered within the limits that the current state of relevant law allowed U.S. government officials to impose on the ARPANET mailing list discussions. Thus the way was cleared for broader distribution of the posts on ARPANET mailing lists, making the transition from the limited circulation available on the ARPANET to the broader participation Usenet made possible.
Part VII
Usenet Welcomes All
While access to the ARPANET was limited, Usenet welcomed all who were willing to connect in a public way (72). "Usenet is a public network," wrote Mark Horton, "and those on it should announce themselves." "It seems to be a common thing," he wrote, "for a new site to come upon Usenet without telling anyone they exist." "What happens," he explained, "is that someone hears about Usenet from someone already on the net, who sends them their copy of whatever code they are running." He asked, "When you start getting network news, you should announce your existence to the net by filling out the enclosed form and posting it to the newsgroup net.general.... This form will be used as your entry in the Usenet directory. Note," he continued, "that it is the policy of Usenet that all sites receiving public newsgroups (such as net.all and fa.all) are public in the sense that the fact they are on Usenet is public. The name and phone number of a contact person, as well as the name and location of the site, is important. If you are doing some kind of secret work there is certainly no need to divulge the nature of your work. If you feel that you must keep your existence a secret, you should not be joining Usenet," Horton clarified. A form was provided for a new site to fill in. Horton asked that those joining Usenet post their announcement and basic configuration information in NET.general. NET.general was the one newsgroup that all who were on Usenet during this period were encouraged to read (73). "net.general," wrote Horton, "is for stuff that everybody is supposed to at least consider reading. It's useful for INITIAL QUERIES and ANNOUNCEMENTS." However, he noted that "It is NOT there for discussions." He explained, "If you see something in net.general you want to comment on, you should almost always just REPLY to the author, not follow up to the world. If a continuing discussion is needed, start a new newsgroup." He also suggested replying to initial queries from NET.general in NET.misc. "NET.misc," he wrote, "is a good way to keep net.general free of trivia without starting new newsgroups for short lived topics." He urged those on Usenet to realize that not all might be interested in a particular topic but "feel obligated to read things in net.general because of their possible importance." Matt Glickman, who with Mark Horton wrote the code for B News, supported Horton's request for maintaining NET.general as a newsgroup that would concern all. He wrote (74): Just reminding everybody (I feel it is my duty...) that net.general is no run-of-the-mill newsgroup. No sir. It's not net.misc and it's not net.news. net.general should only contain GENERAL interest information of interest to the ENTIRE network. Especially, no dreaded newsgroup discussions whatsoever! Please behave yourselves. Therefore, while the posts on NET.general don't document the interesting discussions carried on on early Usenet, they do convey some of the general concerns and views of the pioneering Usenet participants. Many of those on early Usenet were programmers or system administrators. As such, they were particularly sensitive to misspellings and other textual and writing errors. In a post on NET.general, one user gathered comments from all who were interested concerning what they considered poor writing that appeared on Usenet. In response, Rob Glaser from Yale wrote (75): It is true that many technical people use the English language sloppily.
In an informal setting such as Usenet, however, content ought to be valued over form, timeliness over lengthy deliberation. I'd rather see a timely article with a few grammatical mistakes (as long as it is basically coherent) than the same piece, impeccably written but appearing days later. He also observed that the software for posting (inews) was sometimes problematic and helped create the grammatical or other errors one saw online. He wrote: Another factor to keep in mind is that, judging from some of the submissions we receive over the net, the inews submission interface is not always conducive to perfection (not a slap at the news designers, just the incompetents, myself included, who make dumb mistakes.) He then went on to describe how he had had to redo even this post twice before getting it right. "For instance," he wrote, "I messed up two earlier versions of this flame (one of which may have been sent, my apologies if it was) before (*pray*) finally getting things right." Other posts on NET.general included requests for recommendations for buying something worthwhile, or complaints about problems users were having with commercial entities, to see if others had similar problems or could help. For example, a post by Larry Piovano (76) described how he was planning to buy a color TV with a 13" screen. He asked for the recommendations and experiences of others to help him decide which brand to get. "I wish to buy one," he wrote, "that will not die in short order." Bill Shannon from Digital Equipment answered (77), "My 12 inch Sony has been going strong for 10 years with no repairs, no adjustments, no problems! And I'm sure they've gotten better (and more expensive)." Another response on Usenet suggested (78): "Suggest SONY. I have two Trinitrons and they work wonderfully...." The post continued: "I have had my SONY for a couple of years now and have had no problem. I suggest you get one with an electronic tuner (no moving parts to wear out). Try the wireless remote control. It's a great toy if your lazy." A similar question about recommendations regarding the Hayes Smart modem was posed by John L. McAlpine in Canada at the Saskatchewan Linear Accelerator. He wrote (79): Use of HAYES Smart Modem
1) I would appreciate receiving comments on the reliability of above modem.
2) If anyone has available the appropriate patches to use this modem with uucp for auto-dialing I would appreciate receiving same.
A post by Ron Gordon at Bell Labs (Murray Hill) warned other Volkswagen Rabbit owners of a potential radiator tank leak. He wrote (80): Attention VW Rabbit owners, you may have a problem! The radiator overflow tank on my vehicle developed several cracks which permitted coolant to escape. Because the overflow tank is directly connected to the radiator system without a valve, a leak in the overflow tank is just as bad as a leak in the radiator! He went on to ask if other VW Rabbit owners were having a similar problem. "My tank failed after 16 months at 15,000 miles," he wrote. "Should enough evidence become available, a formal complaint may be filed with VW and the Consumer Protection Agency." Another Usenet poster asked if there could be a consumer forum newsgroup to monitor companies that rip off consumers. In his post, Randy King wrote (81): What provisions, if any, have been made to provide a sharing of gripes about national "ripoff" companies and the like? I would like to hear some comment on this, as well as see the establishment of a newsgroup (as if any more were needed).
It might be very interesting to see what goes [on-ed] out there and to provide readers with some insight to companies so that they may not be smitten by these "invisible stalkers!" "What's the feeling out there?" his post asked. Responding to an answer by Andy Tanenbaum from Bell Labs about the intent of his post, King wrote that he had had in mind an insurance company, but that the forum could discuss both problematic and beneficial companies (82). Several of the posts on NET.general suggested creating new newsgroups, such as a post by Linda Seltzer at Bell Labs (83): I would like to start a newsgroup called net.music for communication among composers, news of concerts and conferences, news about computer music, news of good new records, etc. Anyone interested in subscribing to this group please send mail to research.lin or alice.seltzer
Linda Seltzer
Another post noted that her e-mail address was inaccurate in her post, and that it should be alice!seltzer (84). Other posts concerned general questions or problems. For example, Andy Tanenbaum posted about a piece of junk mail he had received from a head hunter who seemed to have gotten his name from the list of attendees at the previous USENIX winter conference (85). "I DON'T want junk mail from employment agencies," he wrote. "If you want to put up a recruiting note at a USENIX, fine. But as long as I have a means here to express my dissatisfaction, I want it to be known that I look with bad feelings toward companies that badger me by abusing a valuable resource." His post ended, "I wouldn't want a future list of conferees to not have addresses just because some losers bother some of the good folks on the list with junk mail. Don't call us, we'll call you." A post by Jay Lepreau asked if there was any archive of the software bug reports that had been posted on Usenet, so he wouldn't have to do work others had already done. He wrote (86): Has anyone out there been archiving any of the "net.*bugs" newsgroups or just have old stuff still kicking around? We just joined Usenet around the beginning of November; if anyone has stuff from before that I'd appreciate hearing from you. I'm TIRED of fixing bugs I know have been found and fixed before. I can send you a shell script to pull stuff out of your .nindex if you've got A news; I don't know how B news works. A post from Scott Baden announced (87) that he was in the process of creating an annotated bibliography on two topics: "Functional Programming languages" and "Applicative architectures." He asked those with any references or comments to e-mail them to him, promising, "I'll make a copy of the bibliography available to all interested parties. If you have already started a bibliography I'd be interested in collaborating with you." News items were posted as well, such as one on the AT&T settlement with the U.S. government, posted on January 8, 1982 by Steve Bellovin. The post explained (88): AT&T and the U.S. government have settled their seven-year-old anti-trust suit out of court. Under the terms of the settlement, AT&T will divest itself of the local operating companies; it will retain AT&T Long Lines (the long distance service), Western Electric, and Bell Labs. The reorganization will be completed within 18 months. Questions about Usenet were posted, as in a post by Randy King asking how long it took a post to get to the majority on Usenet. He wrote (89): This may have been answered long before my emergence onto netnews, but I will ask it anyway!
Does anybody have a feel for how long it takes a posted article to reach the majority of the netnews community? I realize that there are N! variables here, but a general ordinary run-of-the mill answer would suffice. Two Days? Three Days? A month? Fifteen minutes? (HA). How 'bout it. A response from Horton described the process of distribution of Netnews during this period. He wrote (90): It depends on the newsgroup and where you are. If you are somewhere inside Bell Labs or on a key machine with a dialer (decvax, duke) it will probably get out to 70 - 80% of the net within a few hours. If not, you probably have to wait for an overnight poll, but it will get most places (>90%) overnight. There are some far reaches that won't get it for 2-3 days (more if something is down) and it may take another 2-3 days for a reply or follow-up to get back to you. Horton went on to describe how distribution of the Mailing Lists carried on Usenet occurred. He wrote: The fa newsgroups are different. They are fed in at Berkeley which then waits for ihnss [at Bell Labs-ed] and decvax [at Digital Equipment Corp-ed] to poll. ihnss only polls once a day (in the early morning). decvax calls often. So Bell Labs (which gets most stuff from ihnss) tends to have fa stuff each morning from the previous day. Those getting news from duke or decvax get it randomly, faster depending on when decvax happens to call ucbvax (at Berkeley-ed) usually several time a day. Horton also described other delays affecting how users got news from Usenet. He wrote: And of course there are the delays from the time the news shows up on a system to when any given person actually reads it - often once a day, but some people log in on neighboring machines to get news and don't get it that often. I have gotten replies to queries as much as 3 weeks later, not counting the famous unix-wizards drought where it took 2 months to reach the masses before it even got into Usenet! In summary, he wrote, "But a rough rule of thumb is that by overnight, most of the net will have at least had the chance to read your article." Along with the advantages of being on Netnews were the problems that users were confronted with. One such problem concerned discussion over what was appropriate discussion or in bad taste. Others claimed it was censorship to bar certain discussions. Describing this problem, Horton wrote: Also, PLEASE restrict your "questionable taste" stuff to net.jokes.q for the time being until this whole thing is settled. I am seeing stuff in net.general about dead babies that certainly offends me (and no, I'm neither dead nor a baby) and probably half the rest of the net. I'm still seeing poor taste jokes in net.jokes. There are people out there that are trying not to get this stuff, and they are being barraged with it anyway! This includes limericks - most of them belong in net.jokes.q. If you would be willing to get on your local TV station and recite what you're posting (with your mother and your boss in the audience) you shouldn't be broadcasting it to an equally wide audience of random people. Remember, also, that a record is kept on every machine of everything you say. He also asked for input from those who found such posts offensive toward trying to determine an appropriate policy with regard to such posts. He wrote (91): I haven't been hearing from many people who actually ARE OFFENDED by the net.jokes.q stuff. 
I'd like to get input from them (either privately by electronic mail or publicly in net.news) in regards to the policy that needs to be formed. How you feel about various proposed solutions is important. Anyone who further understands the Affirmative Action issues should speak up -- I don't claim to understand them very well. Another concern involved what were appropriate posts on Usenet. J. C. Winterton asked that users not post articles from the wire services, but instead subscribe to newspapers for such information rather than trying to send it around on Usenet. He wrote (92): Notwithstanding the fact that some persons do work for Bell, it STILL costs a bundle to send this stuff around the continent on this network when it is being shipped by the wire services anyway. Why not just subscribe to a large daily newspaper or two. If you really are interested in the entertainment world you can subscribe to Variety. The New York Times and the Times of London probably carry everything else. And where these are unavailable, there are other major papers. I don't believe that Usenet should become an arm of AP, Reuters, etc. I am reasonably sure that they would be somewhat upset with the infringing of their copyright as well. That a thing can be done is not a reason to do it! Besides, by distributing wire service stuff this way (with or without authorization) is probably helping to un-employ some poor newspaper carrier, etc. etc. Commenting on the proliferation of new newsgroups and newsgroup names, Horton promised to issue a list of the newsgroups "officially blessed" to help resolve the problems of multiple names for similar groups. But he also encouraged those with various views on the issue to speak up. He wrote (93): I am coming to realize that people are waiting for me to say something. We are discussing what to do about the proliferation of newsgroups - if you want to be involved in this discussion please send me mail. (We might even, ahem, start a newsgroup.) I hope to have a list of active newsgroups, "officially blessed" (whatever that means), in a few days. Chain letters also posed a problem on early Usenet. Henry Spencer from the University of Toronto posted asking users to recognize the problem and keep it from harming the Net. He wrote (94): Some turkeys evidently have decided it's funny, funny, funny to start sending chain letters around Usenet. With all the mail headers on them, these messages are many Kbytes. For some strange reason, when we're paying phone bills for 300-baud long distance calls, this does not seem amusing. This is EXACTLY the sort of thing that could lead to humorless administrators closing down people's network connections on the grounds that the money is being wasted. For heaven's sake people, STOP IT!!! Your thoughtless empty-headed practical joke is endangering the network that many people worked long and hard to set up! Noting the kinds of problems those on Usenet had to deal with, Horton observed that those on Usenet had an obligation to consider its best interests. He urged that those with different views of the issues involved be active and participate in the discussions over what to do (95): "I propose that anyone with opinions on this issue discuss it on net.news. I want to hear from both sides. This is YOUR NETWORK, remember!" Others on Usenet had hoped that it would make it possible to form a new form of media or to influence the political process in a way not formerly available.
"Not to belittle any new newsgroup," George Otto wrote (96), "but it strikes me that we are developing a real electronic newspaper here." In a similar way, rdg at allegra wrote (97), "Wouldn't it be great to use this electronic medium to send notes to our government officials. I never seem to write postal letters or telegrams, but we all seem to find these electric notes enough to use often. Can you image net.reagan with a few authentic replies." Scott Baden added (98), "Or what if we could lobby our favorite senator? (net.lobby, net.senator?)" The dilemma of funding Usenet posed a problem to some sites as described in the post by Chris Kent at the University of Cincinnati. He wrote (99): We at the University of Cincinnati are on a budget crunch. Therefore, I have been told to cut down on outgoing calls or lose the ability to place them. I ask you all to cooperate, please; try to avoid routing program sources through us whenever possible. We will continue to transship news, so that won't be a problem, but will probably poll only every other day....I am sorry it has to come to this -- but some people higher up seem to see this as just wasted money. I will keep you all posted as to our situation. Chris Kent (cincy!chris) Others like Mel Haas at Bell Labs (houxm) reported that the funding of various sites could be jeopardized by an irresponsible activity on the net and that all users should be aware of the problems that might be caused. He wrote (100): This is a plea to clean up the net. Please! There are whole sections of the net that are being watched by the payers of the bills, and what shows now is not good. The flame and flash content of the past few weeks has far outweighed the useful. Don't revive the "db" stuff in net.cooks! Don't send everything to net.general! and, certainly, don't send anything to both net.general and another! Put net.news stuff in net.news, net.records stuff in net.records, etc. Show some consideration for others in the wording and content of your submittals. He pointed out that responsible use would help establish how the Net was a money saver for the sites participating and thus support continued use. He wrote: Try to make the net a useful exchange of useful information and ideas, that will pay for the service and help people. In other words, make the net a useful tool, not a place to expose yourself, your ego and your bad manners. Thank you. Mel Haas, houxm!mel Explaining his need for access to Usenet even though he was would no longer have Net access through the University of California Berkeley, Michael Shilol wrote (101): I recently graduated from Berkeley where I enjoyed this network very much, both for entertainment and for receiving the latest news on many subjects. I am now starting a job and will soon be losing my account on the Berkeley Vax. My question is: Is it possible for me to get access to this network in any way? Can my company get access to it? Is there a way to pay for this privilege? He noted that the useful technical information available on Usenet was so valuable that a company could benefit financially from being connected, "This network has been so useful to me for finding information that I think it is worth money and/or equipment to get it." And he concluded his post: "Any answers, comments, suggestions appreciated." He then had a form of signature giving both UUCP and ARPANET address forms. 
Michael Shiloh
CSVAX.shiloh@berkeley
UCBVAX!shiloh
In another post, George Otto at Indian Hill Bell Labs noted the technical superiority of Usenet newsgroups to mailing lists. He described the problem of keeping mailing lists on different computers in sync. He wrote (102): Is anyone working on making mailing lists just as efficient as newsgroups? One problem with using mailing lists for maintaining communications among those in a small group of people is the difficulty of keeping the lists on many machines in sync. I tried looking into setting up a program under my ID that would allow others to mail to me for automatic redistribution to a list I maintained, but never found a good way to do it. He noted that Usenet solved the problem in a superior way by making it possible for people connected to different computers to participate in a common newsgroup. He wrote: The beauty of using Usenet is that members of affected groups can be on different machines and need do one or two simple things to be attached to the common group.
Conclusion
These posts on NET.general show how those using different computers at a wide variety of academic and research sites, many of which were not officially sponsored by any funding agency, were able to participate in the kind of collaborative communication, and in some of the mailing lists, formerly only available to those with access to the ARPANET. More importantly, Usenet made the process of posing a problem and collaborating with others to try to determine how to solve it more widely available. Such a process is needed to solve the difficult technical and social problems which computers and networking technology present for our times. Habermas writes that there is a need to understand such a scientific approach to technical issues and challenges. What he doesn't recognize is that the technology itself is needed to help in the process. The early ARPANET, as demonstrated through posts on the MsgGroup mailing list, and early Usenet provide a beginning insight into how people using and directing technology can be part of the important scientific and regenerative process that contributing to the online community makes possible.
-----
*Note: The notes corresponding to the numbers in the above article are available from the author or at: http://www.ais.org/~ronda/new.papers/msghist.txt
---------------------------------------------------------------------------
_________________________________________________________________
The opinions expressed in articles are those of their authors and not the opinions of The Amateur Computerist newsletter. The Editors welcome submissions from a spectrum of viewpoints.
-----------------------------------------------------------------
EDITORIAL STAFF
Ronda Hauben
William Rohler
Norman O. Thompson
Michael Hauben
Jay Hauben
The Amateur Computerist invites submissions. Articles can be submitted via e-mail: jrh@ais.org
A one-year subscription (two issues) costs $10.00 (U.S.). Send e-mail to jrh@ais.org for details. Permission is given to reprint articles from this issue in a non-profit publication provided credit is given, with name of author and source of article cited.
ELECTRONIC EDITION AVAILABLE
Starting with vol. 4, no. 2-3, The Amateur Computerist has been available via electronic mail.
To obtain a copy, send e-mail to: ronda@panix.com or jrh@ais.org The Amateur Computerist is also available via anonymous FTP and on the World Wide Web at: ftp://wuarchive.wustl.edu/doc/misc/acn/ http://www.columbia.edu/~hauben/acn/ http://www.ais.org/~jrh/acn/ _________________________________________________________________ -----------------------------------------------------------------