Netizens-Digest        Sunday, April 29 2001        Volume 01 : Number 387

Netizens Association Discussion List Digest

In this issue:

    Re: [netz] Government and Science Was:FC: Ftc action
    Re: [netz] Government and Science Was:FC: Ftc action (part 2)
    Re: [netz] Government and Science Was:FC: Ftc action - part 3

----------------------------------------------------------------------

Date: Sun, 29 Apr 2001 12:15:19 -0400 (EDT)
From: ronda@panix.com
Subject: Re: [netz] Government and Science Was:FC: Ftc action

"Howard C. Berkowitz" wrote:

>In much of this discussion, I feel that Ronda and I may be talking at
>different levels. I split the Internet into layers, with the lower
>layers (my area of specialization) dealing with the internal movement
>of packets, not the user visible applications where directory
>services come into play.

>At the lower layers, I simply don't see the corporate demons that she
>seems to suggest. I definitely do worry at the more user-visible
>parts.

Well, you may find it of interest to take a look at the May 2001 issue
of Wired. I don't often read it, but this issue has an article by Larry
Roberts (one of the ARPANET pioneers) in which he explains how he is
creating a new kind of intelligent router that will significantly
change the routing done on the Net, so that there will be priority
service for those who pay more and lower-class service for those who
can't afford the higher prices.

The changes that are being proposed would seem to affect the lower
levels as well as the upper levels of the Internet.

And what is interesting is that these changes are not being discussed
and debated publicly. Instead there is an effort to install them by
fiat on the folks in the US and around the world.

>>Interesting. As your paradigm leaves out the issue of research
>>to find how to do the scaling.
>>
>>Bell Labs at AT&T is the example of what I mean in some ways.
>>
>>To have a world class telephone system in the US it was understood
>>that there was a need to have a means to support the future development
>>and the scientific research needed for that future development.
>>
>>That required supporting Bell Labs. The way to do that was to regulate
>>AT&T and to make sure that it provided the needed support for
>>Bell Labs.

>I'm not disagreeing that research is necessary. I'm questioning how
>the researchers and their laboratories are going to get funded. In my
>own case, I am paid by Nortel both to contribute to competitive
>product technology, but also to work in the IETF, NANOG, and other
>forums. The latter is a business necessity in much of the "lower
>layers" of networking -- no one company can dominate the technology
>for moving bits around. The business necessity of participating in
>cooperative design/standards is less clear at the
>application/directory/user interface level.

Well, depending on who does the funding, that will help set the
priorities of what research gets done.

For example, I have been looking at some of the technical articles
written during the early development of the Internet, when the research
was funded by government. Then the aims of the research were to create
a "resource sharing network", and "fairness" of treatment of all
packets was an objective.

Now the article in Wired reports that the investment community is eager
to fund research on charging more for service, research that will raise
the cost for all to send packets.

It's interesting also that Wired hypes what Roberts is doing, rather
than offering any critical or social perspective on it.
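To make the two-tier idea concrete: this is not Roberts' design or
anything from the Wired article, just a rough sketch (in Python, with
made-up names) of the kind of paid-priority packet scheduling being
described, where premium traffic is always served before best-effort
traffic:

# Illustrative sketch only -- not Larry Roberts' router or any real design.
# It models "priority service for those who pay more": premium packets are
# always dequeued ahead of best-effort packets.
import collections

class TwoClassScheduler:
    def __init__(self):
        self.premium = collections.deque()      # packets from customers who pay more
        self.best_effort = collections.deque()  # everyone else

    def enqueue(self, packet, paid_priority=False):
        (self.premium if paid_priority else self.best_effort).append(packet)

    def dequeue(self):
        # Strict priority: best-effort traffic waits whenever premium
        # traffic is present -- the two-tier outcome discussed above.
        if self.premium:
            return self.premium.popleft()
        if self.best_effort:
            return self.best_effort.popleft()
        return None

sched = TwoClassScheduler()
sched.enqueue("pkt-A")                      # best effort, arrives first
sched.enqueue("pkt-B")                      # best effort
sched.enqueue("pkt-C", paid_priority=True)  # premium, arrives last
print(sched.dequeue())   # pkt-C -- the paying customer's packet jumps the queue
print(sched.dequeue())   # pkt-A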
The social goals mean extending the resource sharing capability of the
Internet and making very low cost access available to all. The aim
being promoted by Wired is to end the Internet and create a new network
that will cost everyone more, will give those who can pay access to the
best service, and will make everyone else second or third class
netizens.

>>It turns out that the Internet is also a public utility and it also
>>needs a research arm for its development.
>>
>>The telephone infrastructure in the US was built by a regulated
>>process where users pay for their service, and that paid for the
>>infrastructure.
>>
>>Internet development in the US needs to sort out what is the way
>>to support infrastructure development, *not* just assume it
>>will be a private corporate process.
>>
>>In fact the private corporate process really can't handle the
>>development of the needed infrastructure.

>Which infrastructure are you speaking of? That which directly
>supports research, or the production networks that underlie what
>users see as the Internet (and converging services such as telephony)?

Aren't these related? Doesn't the research arm have to have as a goal
the scaling of the production network?

>>I think the question has to start with "What is the needed institutional
>>form for scaling the Internet? What are the elements of that form?"

>But very early in that process comes the question, "how are the
>people in that form going to be paid?"

Is that the first or the second question?

The Bell Labs researchers who worked at AT&T got paid. There wasn't any
question of them being paid. The question was what was the goal of
their research? Was it a public or a private goal?

>Ronda, I consider myself a legitimate researcher in the scaling of
>the Internet. I still have to make sure my feline research associate,
>Clifford, gets cat food.

Are you saying that you can't get a job where the funding is public,
and so researchers like you have a problem?

>>Without the scientific process to try to determine what is needed
>>for the scaling, it doesn't matter how much money is poured in.
>>
>>It will be wasted.
>>
>>Once the scientific research is done, then there should be a similar
>>scientific approach to determining what form the infrastructure's
>>development should take.

>We may have some different definitions here of scientific versus
>engineering paradigms. Oh -- and there is definitely such a thing as
>engineering research.

There is engineering research, and that is what I am referring to as
scientific research. I was just reading such a paper yesterday. The
paper is what I am referring to as a scientific paper and what you are
probably referring to as an engineering paper.

It is part of a wonderful volume of Internet research, "Proceedings of
the IEEE", vol. 66, no. 11, November 1978. The paper is "Modeling and
Measurement Techniques in Packet Communication Networks". It describes
the process of designing networks and designing ways to test the
protocols. This is what I am referring to as science :-)

>>The paradigm of the Internet was to have a way to interconnect
>>dissimilar networks. It seems that that has gotten changed to
>>having a backbone that some company(s) create.

>Unless it is a government funded utility, what is the alternative?
>And how are international backbones funded?

This is a problem to be explored.
I think in Austria the backbone was built by the government to connect
the universities and then the public schools, and the private networks
were connected to it and offered service to the private sector. So
there was a mixed infrastructure.

In the Netherlands there was a debate whether funding should go into
extending the train system so it crossed the whole country or into
building a backbone for the Internet. Eventually, I thought, they did
something like build the national train system and somehow use that
system to carry the backbone for the Internet infrastructure, on or
connected to it rather than as something separate.

By having a public discussion of the issues and the different points of
view, new alternatives become possible, ones that do more to provide
for the needed public purpose.

>>The original Internet architecture was designed so that it could
>>interconnect dissimilar networks under dissimilar forms of administrative
>>or political control.

>I'm puzzled why you don't seem to think this remains the case.

There didn't seem to be any effort with the creation of ICANN to
recognize the need to continue to support the diverse administrative
and political units and networks.

One of the means of supporting such development with regard to the DNS
was a country code administrators mailing list that Jon Postel of IANA
maintained. Now it seems instead that ICANN is trying to assess the
country code administrators for what they have to pay to ICANN. It
seems the tail is wagging the dog.

I'll answer further in another email message as this one is already
getting too long.

To be continued.

Ronda
http://www.ais.org/~ronda

------------------------------

Date: Sun, 29 Apr 2001 12:38:55 -0400 (EDT)
From: ronda@panix.com
Subject: Re: [netz] Government and Science Was:FC: Ftc action (part 2)

"Howard C. Berkowitz" wrote:

>I agree there is a definite concern in open access to broadband
>access networks (e.g., IP over cable, DSL) and third generation
>wireless access. But access networks and backbone networks are
>different things. They perhaps may need different models.

One of the people I spoke with at the National Academy last week told
me that they are doing a broadband study there and they have decided
that it will cost $1000 to connect each household in the US. The study
is being done by a small committee in closed sessions and with no
public input. Yet it is a public question.

In the article on this in Wired this month, it seems there is basically
a plan to end the Internet and substitute an all-purpose network that
will put TV, telephone, radio, and computer data onto the same network.
But radio and TV in the US and telephone are different kinds of
entities under different forms of public oversight. To put them all
into one, and to subordinate the online data activity of computer folk
to the princes of content ownership, introduces a serious problem and
basically seriously jeopardizes the continued existence of what has
been the goal and dream for the Internet since Licklider's early
writings.

>>To the contrary it seemed that WWII demonstrated the need for
>>governments to support scientific research. And so after the war
>>there was the recognition that this was now an important
>>need. For example Vannevar Bush and the important report he
>>and others did, "Science: the Endless Frontier".

>WWII rather than Cold War--you are right. I would point out, however,
>that massive investment in basic research was more post World War II.
>There's no accident, for example, that the discipline of operations
>research is named what it is -- it's the use of quantitative methods
>to improve military operations. Many of the early OR problems dealt
>with antisubmarine warfare. Norbert Wiener's cybernetic research was
>given a push because it was useful in antiaircraft fire control. The
>first primitive computers generated artillery ballistic tables
>(Harvard Mark I), broke enemy cryptosystems (bombe/Colossus), or did
>hydrodynamic calculations for atomic bomb design (IBM).

But Licklider's background was in brain research, in how complex
systems, both natural and artificial, function. He studied
servomechanisms, whether in the human brain or in the technology that
was being developed.

>>But a reason Sputnik was developed was because the Russian people
>>turned to science to try to prevent another war.

>WHAAAT! Unless you are saying they were trying to prevent a war by
>preempting or offering a credible deterrent, this statement is
>totally at odds with what we know of Soviet policy. Read Marshal
>Sokolovsky's "Soviet Military Strategy," or look at the declassified
>IRONBARK papers from Penkovsky, showing the Soviets' aggressive
>pursuit of technological intelligence for military purposes. The
>IRONBARK papers also contain many translations of classified Soviet
>military journals.

I was referring to the Soviet people and their desire not to have
another devastating war, and their pressure on the government to
support science in hopes of preventing such another war.

>>So it is not only the cold war, but the effort to prevent another
>>world war that has motivated the events that have led to the
>>public support for science that helped to make possible ARPA (1958)
>>and the Information Processing Techniques Office that Licklider
>>started at ARPA in 1962.

>ARPA. Later DARPA. Always in the Department of Defense. To say that
>this agency's motivation was public support of science, and not the
>recognition that advanced research supports military development, is
>ridiculous.

After WWII there was the realization among a lot of people that the US
had won the war because of science.

You should perhaps look at my papers on this, starting with
http://www.columbia.edu/~rh120/other/arpa_ipto.txt

Licklider was brought to ARPA in 1962 to start an office of information
processing techniques. His goal was to catalyze the development of an
information processing science.

Look at what happened when Licklider was brought back to IPTO in 1974.
By then the pressure from industry on the US Congress had had its
effect, and Licklider could not go on and develop the scientific work
he wanted to develop.

Putting scientific research under the pressure of applications meant
that the research was harmed. This happened in the Air Force Office of
Scientific Research (AFOSR), in the Office of Naval Research (ONR), and
at ARPA when it was changed to DARPA.

The history of this is all very important to understand because it
helps to understand the constraints on US research institutions now.

The second chapter of the book I am working on describes this problem.
See:

Basic Research for the National Defense and the U.S. Department of
Defense: A Paradox?
http://www.columbia.edu/~rh120/other/basicresearch.txt

>>There would be the ability to look further ahead if there were
>>the kind of institutional form and support for research that
>>IPTO pioneered.

>This is questionable.
>As the total information available increases,
>the number of potential interrelationships grows, and the ability to
>predict becomes more limited. In the 1570s, Sir Francis Walsingham
>pioneered the development of a national intelligence service. William
>Friedman's work in the early 1900's is the theoretical foundation of
>modern cryptography. The brilliance of these men, however, still
>doesn't make their work a reasonable foundation for CIA/SIS or
>NSA/GCHQ today. Sir Isaac Newton did observe, correctly, that if he
>had seen farther than other men, it was because he stood on the
>shoulders of giants. But he had to go beyond those giants.

My study suggests the opposite: that as science and technology develop,
it becomes more and more important to have long-range institutional
forms to support basic research.

Basic research gives you the developments, the pipeline to the future
10 or 20 years later. If you kill the basic research, you don't realize
the problem immediately. But in 10 or 20 years you don't have the new
developments that were seedlings needing nourishment 10 or 20 years
earlier.

Let me continue to answer in a new email message as this one is already
quite lengthy.

Ronda
ronda@panix.com

------------------------------

Date: Sun, 29 Apr 2001 12:58:21 -0400 (EDT)
From: ronda@panix.com
Subject: Re: [netz] Government and Science Was:FC: Ftc action - part 3

"Howard C. Berkowitz" wrote:

>>There is a need to learn from this development and build on it,
>>as the experience from IPTO shows that such an institutional form
>>is needed to support the Internet's continued development and scaling.

>I think you need to quantify this. The rate of growth of the Internet
>has been increasing in a non-linear way since the demise of IPTO.
>Admittedly, certain mechanisms, never designed for this load, are
>faltering.

But the seeds were nourished by IPTO. Now that one doesn't have IPTO,
it is much harder to have the long term research that is needed for
scaling the Internet in a way that is in the public interest.

>>To me, netizens have to operate in such a politicized environment.
>>
>>Yes that is true. The politics of science has to be taken on.
>>
>>But that means recognizing that there are "vested interests" and in
>>the past there have been ways of inhibiting the damage they can do :-)

>But there's also the issue of not demonizing them, and understanding
>what is and is not broken. I freely admit that the directory and DNS
>situation is in terrible shape.

I agree the point isn't to demonize them. And I agree that we want to
understand what is and isn't broken.

But after studying the incredible capability and achievements of the
researchers who created IPTO and the IPT community, and the important
computer science and networking developments of our time under its
protection and leadership, it is hard to watch the kind of pressure
that companies like MCI/Worldcom and others exert on folks in the US
Congress and other government officials.

I went to a congressional hearing and saw that the congressman from
Mississippi was there to take care of MCI/Worldcom, not the public. And
a staffer said that the Congress folk only hear from industry, and that
in such technical matters they had no ability to evaluate what they
were being told to do by the industry folks pressuring them.

Also I went to a town meeting at Columbia about the Internet and its
future. The people holding it, it turned out, were CEOs of companies.
They claimed they were there to hear what people had to say so they
could tell the new President.
But mainly what people had to say was that they didn't like the
commercial activity and taking over of the Internet. Clearly the CEOs
at the meeting aren't going to tell the new President that.

So I agree that one doesn't want to demonize anyone. But the corporate
folk that I have been around until recently didn't think that there was
anything else in the world but the needs of their corporations.

>The routing and transmission layers of the Internet have known
>scalability problems. But the many organizations that run them are
>very actively working to solve these problems. I spent the morning on
>an international conference call with a mixture of commercial and
>academic researchers planning experiments in refining our
>understanding of current scaling problems. There is every intention
>to publish the results and gain consensus on solutions.

There are folks from companies like Digital and the regulated AT&T that
played important roles in creating Unix and Usenet, etc. So it is
important that researchers collaborate, whether they be from industry
or academia.

However, with IPTO there was a general-purpose objective. With
Microsoft, there is a narrow, self-serving objective.

And it seems that the general plans for scaling the Internet need to be
subjected to public discussion and consideration so that the research
can be done in a general way, rather than to increase some sector's
profits. That is why I feel we need a research institution like IPTO
again.

>>Well I didn't think that the regulated AT&T that developed the
>>world class telephone infrastructure in the US was any
>>"socialist system", though I am sure that MCI/Worldcom might say
>>it was, as that was part of their effort to end the regulation of
>>AT&T, from which MCI benefited.

>To quote back, "regulated". Modified socialism. The 1913 Kingsbury
>Commitment gave AT&T effective control of long-distance
>communications, which provided the revenue stream for building their
>backbone and funding Bell Labs.

Somehow to equate good government regulation with "socialism", even
"modified socialism", seems to be a lack of recognition of the
importance that regulation plays in making science possible.

The US has excelled in scientific activity when it has protected and
supported scientific research. This took good regulations.

>We think of Bill Gates as a modern networking robber baron, but I
>rather revere Theodore Vail, early CEO of AT&T, for some incredibly
>manipulative business practices that created a world-class telephone
>system.

AT&T was regulated. This was good regulation. Bill Gates fights
regulation tooth and nail, and thus the world is stuck with a Windows
operating system that perpetually crashes.

Research is needed not only in the technology but also in the social
forms and institutions needed to nourish and provide the soil for the
technological development.

Cheers

Ronda
ronda@panix.com

------------------------------

End of Netizens-Digest V1 #387
******************************