Draft for Comment

Computer Science and the Role of Government in Creating the Internet: ARPA/IPTO (1962-1986)

Creating the Needed Interface

by Ronda Hauben
rh120@columbia.edu

Section III
Centers of Excellence and Creating Resource Sharing Networks

     We kept ARPA low profile to keep it protected, because there were an awful lot of big guns going off over our heads: Congress versus Defense. And different wings in Defense having different ideas. And the Vietnam war...and I wanted to make sure that ARPA, which really wasn't an outfit with all that much clout, didn't get clonked in the process. I used to use the Russian proverb: `If you're a clay pot, don't get caught on the stove with iron kettles....' This was sure different from the early ARPA which was very high profile...reporting in at Presidential levels for all practical purposes. And it was working on Presidentially important things. That was the way it got set up. And Herb York was calling the President's office a fair amount of time....That was the original ARPA.

     Barber, viii-14

I - IPTO and Creating a Computer Science Community

Reviewing the experience of research agencies such as AFOSR in the Air Force and ONR in the Navy helps to set a framework for considering the experience of the Information Processing Techniques Office (IPTO), created inside ARPA in 1962. Both AFOSR and ONR had some experience supporting basic research in communication theory, and ONR was supporting the development of a new form of computer organization called time-sharing through its contract for the development of the Compatible Time Sharing System (CTSS) at MIT.

In 1962 J.C.R. Licklider was invited to set up an office inside ARPA to explore new uses of the computer beyond arithmetic processing. Licklider had experience with these research organizations and had been on the Air Force Scientific Advisory Committee for several years before joining ARPA. (54) He was familiar with the type of scientific problems facing the Department of Defense and was interested in exploring them. Also he believed that at a general level the interests of the university research community and the interests of the military were the same, as "what the businessman needs is what the scientist needs." However, he also recognized that at the application level the interests would diverge and problems could then develop. During his first turn at ARPA, he reports that "I did not feel much pressure to make a military case for anything...I tried to convince people of the philosophy that in general the same thing is needed." (55)

Licklider had written a seminal paper exploring the question raised by Norbert Wiener about what role the human and the computer should each play in the relationship between the two. (56) Licklider's paper "Man-Computer Symbiosis" established that the relationship between the human and the general purpose computer should be one of rapport, with each partner doing what they do best, rather than a relationship where the human was enslaved by the machine, or the machine was treated as a servant of the human. (57) Licklider outlined the need for the human to be able to interact with the computer, as opposed to the batch processing form of computing that the computer industry at the time saw as the future of computer development. He proposed a program of computer research that would create the change from batch processing to the interactive form of computing that he envisioned. (58)

During this period of the early 1960s, research in computer science in the U.S.
was limited to a few US Department of Defense related institutions like Rand, MITRE, IDA, System Development Corporation or MIT's Lincoln Labs. (59) When there was any study of computers in universities, it was usually as part of some other department, like an electrical engineering department, and often had to do with the hardware of the computer. Licklider recognized that he would have to develop a new field of scientific study, a field which would come to be known as the field of computer science or the science of information processing. (60) To do so, he set out to build on his experience as a research scientist working with AFOSR and ONR and attending DoD sponsored research programs such as Project Lexington (1948) and Project Charles (1951). Licklider had also attended study circles organized by Norbert Wiener in Cambridge after WWII. (61)

1. Creating Centers of Excellence

With help from AFOSR and ONR, Licklider was determined to create centers of excellence in information science. During this period, for Licklider, the military meant the research offices like ONR and AFOSR. And he would be building a research infrastructure in collaboration with them. Licklider explains (62):

     In my area we were cheating a little bit, because when we talked about the military we talked first about ONR and AFOSR and the Army Research Office. Well, Marv Denicoff and ONR were close personal friends...I knew if I shook hands with him about something there was no question.

A similar close relationship existed with Charles Hutchinson of AFOSR. Though there were certain obligations, as Licklider explains, such as visiting Fort Knox and the interior of a mountain in Colorado to keep current about applications that the Department of Defense was using, "The military development people weren't really in our circuit." (63)

In a memorandum to G. Fubini, Assistant Secretary of Defense, Licklider describes his concept of centers of excellence (64):

     We are here particularly concerned with advancement and exploitation of digital information processing and communication, and there is a great need to incorporate those advances into the working knowledge of the coming generation.

One center was to "lead the effort to achieve balance in information technology, to harness the logical power of computers and make it truly available and useful to men." (65) Another was to "lead the effort to achieve fundamental understanding to develop the theoretical bases of information processing." (66)

The first two centers of excellence were at MIT and Carnegie Institute of Technology (later known as Carnegie Mellon University). Licklider notes that these two sites were "selected with great care." (67) By the end of 1967 there were eight university campuses included in the IPTO centers of excellence program. These included Carnegie Mellon University (CMU was then Carnegie Tech), MIT, University of California at Los Angeles (UCLA), Stanford University, University of Michigan, University of Illinois and the University of California at Berkeley. (68)

A memo by a subsequent director of IPTO, Robert Taylor, lists four objectives for the centers of excellence. They were (69):

1. To bring researchers from different disciplines together for the purpose of solving common problems.

2. To accrue the advantages of applying research and problem solving techniques from one field to another.
3. Through the economies of scale to increase the amount and value of research without a concomitant increase in cost (e.g., by taking advantage of central facilities and common resources).

4. To increase the production of advanced degree graduates with interdisciplinary training.

MIT's Project MAC was the first center of excellence set up by Licklider. Its research program included time-sharing systems, programming languages and programming systems, computer-aided design, computer directed instruction, heuristic programming, information processing and communications, input/output systems including graphics and intelligent systems. (70)

CMU requested funding to become an "IPTO center of excellence" in 1964. They proposed research "to understand the nature of information processing, by which they meant systems that process and transform information and the ways information processing systems are used to control, integrate, and coordinate other systems." These included "the fields of control systems, information theory, documentation and information retrieval, logistics, dynamic programming, modern logic, and statistical decision theory." The focus of their proposal was "oriented toward software and programs." (71) They wrote (72):

     We are empirical, in that we believe in constructing programs that do things, and in learning about information processing from the difficulties of construction and from the behavior of the resulting programs. We are theoretical, although not so much by a dependency on formal models (such as automata theory) as by trying to formulate the essential nature of information processing. Thus many, although by no means all, of the tasks for which we build programs are selected for the understanding they yield and not for their usefulness in applied work. This particular combination of theory and empiricism stems from the view that the key problems today in the science of information processing are those of discovery, formulation, representation, and immediate generalization -- and that we are not yet at the place of building very elaborate or formal mathematical structures that are significant.

Describing the program at CMU, Norberg and O'Neill write (73):

     CMU categorized their interests as falling into several problem areas. Problems of discovery included, for example, investigation into algebraic compilers, formula manipulations, monitors and chess languages. Problems of integration involved a dual process. One approach to integration was the development of a formal theory of information processing systems. Such formal theories consisted of a postulated basic scheme for the representation of all information and a set of primitive information processes out of which all more complex systems could be fashioned. The other approach to integration was to incorporate all of the important features of the separate strands into a single unified programming structure. The two approaches were under investigation using the languages ALGOL and COMIT.

Norberg and O'Neill elaborate:

     One category concerned problems of proof, that is, proof of assertions about programs. Another category was called problems of efficiency, that is, efficiency of a program in processing information. There were problems of representing information and problems of advancing problem-solving power. The question of treating very large files was to be addressed in problems of mass information. Lastly, there were the problems of communication between man and machine.
Crucial to Licklider was the quality of the researchers he was funding. When asked how he chose the universities to become centers of excellence, Licklider noted that he was interested both in the reputation of the people and of the universities (74):

     I had been going to computer meetings for quite a while - I'd heard many of these people talk....There is a kind of networking. You learn to trust certain people, and they expand your acquaintance. I did a lot of traveling, and in a job like that when people know you have some money it's awful easy to meet people; you get to hear what they are doing.

Licklider describes how he would encourage the people at the different centers to meet and interact. Describing the requirements for the contract at MIT, Licklider explains (75):

     I wanted interactive computing; I wanted time-sharing. I wanted: `Computers are as much for communication as they are for calculation'....Then I wanted assurance there were going to be good people working on it....I wanted a summer study that would bring people from all over the industry, that was going to try to shape up this field and make it clear what we were doing.

Also Licklider indicated he wanted input from the researchers into how the program was developing. Toward this end he describes how he would get "an MIT person to visit SDC, or...people to take time off from their research to have meetings to think how all this was going to go." (76)

Allen Newell at CMU describes how the funding from IPTO helped to create a center of excellence at CMU. Asked what they did with the DARPA funding, Newell responds (77):

     Newell: I don't know...support people...just spend it. We didn't decide to do anything with it. One of the features of this environment was that it was decidedly un-entrepreneurial. That seems, in one respect, like a contradiction in terms, but we never took these funds and decided we were going to go out and do big things with these funds....I mean, look, we're fully engaged in research. We're doing just what we want to do. We have graduate students to support....The symbol of this is that we had an agreement with DARPA in the '60s....There was a computation center budget...DARPA would pay 55%; the school would pay 45%. That's how we spent some of it. No questions asked....Every member of the computer science faculty...was supported by DARPA automatically....There just was the total faculty. So the theorists, the linguists, whoever was there - I mean not totally supported, there was the GO [general operations]...in fact the school never gave us a dime fundamentally....(T)he ARPA funds were fueling it...I didn't believe in projects. Didn't have projects. All you had were people doing science. Not little fiefdoms. It was all community. All the students were funded out of a common pot, so the students just worked with whomever they wanted....we ran everything jointly; people are all treated alike. There are no boundaries; there are no laboratories in this department. All of this comes out of the ARPA tradition, as it was evidenced in the school here and out of this communication sciences program, this interdisciplinary program that preceded computer science, which was genuinely interdisciplinary without the kind of constraints that force all human beings into living the way you live in Minnesota, the way everybody lives everywhere. So, we ended up with a place where graduate students picked people to work with without a concern for where they get funded, totally and completely.
     In the early days, everyone was funded no matter who you were. There were no field differences....Anyone for whom there was a belief by people in the system that someone else in the university ought to be on our machines and share our resources, we would simply add without question.

2. Disseminating Research Results Widely

Along with the creation of centers of excellence, Licklider and future directors of IPTO sought to provide wide dissemination of the research done by the researchers they funded. Toward this end, IPTO supported domestic and international conferences and publication of the research it supported in conference proceedings and journals. Along with papers presented at conferences like those of AFIPS (the American Federation of Information Processing Societies), IPTO supported the dissemination of research, even sending researchers abroad when invited.

For example, Donald Davies was a computer science researcher at the Computer Science division of the National Physical Laboratory in Great Britain. He learned of the research going on in the US and invited IPTO to send researchers to present their work. In November 1965, IPTO sent ten people, including Jack B. Dennis, Fernando J. Corbato, Larry Roberts, Richard Mills, and Ivan Sutherland, who was then head of IPTO, (78) to speak at a meeting organized by the British Computer Society. Davies reports that though most of the discussions were about operating systems aspects of time-sharing, the research done to show the mismatch between time-sharing and the telephone network was described. (79) He writes that (80):

     It was that which sort of triggered off my thoughts and it was in the evenings during that meeting that I first began to think about packet switching.

"The basic ideas," he continues, "were produced really just in a few evenings of thought, during or after the seminar." (81)

The ideas Davies is describing are ideas about how to create a new form of communication transport using the concept the MIT researchers had used to create time-sharing. Computers produced data in bursty groups rather than in a steady stream. In the telephone system, a circuit is opened up and maintained as long as the caller is using the phone for the call. When the call ends, the channel is closed. Computer data, however, comes in spurts and doesn't require the reserved channel that voice needs. Roberts and Tom Marill had done research demonstrating the misfit between the transport of computer data and the telephone system. Their experience was described in the paper "Toward a Cooperative Network of Time-Shared Computers" (Proceedings-Fall Joint Computer Conference, AFIPS 29, 425-431, Washington, DC, Spartan Books, 1966).

Davies recognized the nature of this problem and conceived of a new way of multiplexing data, using packet switching, as a way to solve it. Multiplexing meant that data from different sources would be interleaved into a single stream for transport and then separated out and reassembled at the end point into the particular data applications.
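To make this idea concrete, the following sketch in Python shows a message being broken into packets that carry header and control information, interleaved with the packets of another message on a single shared line, and reassembled at the destination. The packet format, field names, host names, and sizes here are invented for illustration only and differ from the actual ARPANET formats.

    # A toy illustration of packet switching, not any actual ARPANET format:
    # messages are split into packets carrying header and control information,
    # interleaved (multiplexed) on one shared line, and reassembled on arrival.
    from dataclasses import dataclass
    from itertools import zip_longest

    PACKET_SIZE = 8  # bytes of text per packet; deliberately tiny

    @dataclass
    class Packet:
        src: str       # identification of the source
        dst: str       # identification of the destination
        seq: int       # position of this packet within its message
        total: int     # number of packets in the whole message
        checksum: int  # simple error check over the text
        text: bytes    # the fragment of the message carried by this packet

    def packetize(src: str, dst: str, message: bytes) -> list[Packet]:
        """Break one message into packets, each with header and checksum."""
        chunks = [message[i:i + PACKET_SIZE]
                  for i in range(0, len(message), PACKET_SIZE)]
        return [Packet(src, dst, seq, len(chunks), sum(chunk) % 256, chunk)
                for seq, chunk in enumerate(chunks)]

    def reassemble(packets: list[Packet]) -> bytes:
        """Check and reorder packets at the destination, rebuilding the message."""
        for p in packets:
            # A failed checksum would trigger retransmission in a real network.
            assert sum(p.text) % 256 == p.checksum, "error detected in packet"
        return b"".join(p.text for p in sorted(packets, key=lambda p: p.seq))

    # Two conversations share one line: their packets are interleaved rather
    # than each conversation holding a dedicated circuit open.
    a = packetize("ucla", "sri", b"log in to the remote time-sharing system")
    b = packetize("mit", "bbn", b"send the file when the line is free")
    shared_line = [p for pair in zip_longest(a, b) for p in pair if p]

    # Each destination picks out its own packets and reassembles its message.
    print(reassemble([p for p in shared_line if p.dst == "sri"]))
    print(reassemble([p for p in shared_line if p.dst == "bbn"]))

Because each packet carries its own source and destination, the line is never reserved for a single conversation; it is shared statistically among whoever has data ready to send, which is exactly what suits bursty computer traffic.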
In an article about the nature of packet switching networks, Robert Kahn describes why packet switching is a particularly appropriate means of transport for computer data (82):

     Due to the bursty nature of computer traffic and the extremely low utilization of a typical voice grade circuit by a terminal, a substantial portion of the data-communication capacity in a circuit-switched system is simply not used. Message switching employs a generalized form of multiplexing for a network environment that allows all circuits to be shared among all users in a statistical fashion without being allocated in advance. This has been the motivation for the development of new communications systems as well as combined computer communication networks.

Hence the wide dissemination of research results by IPTO researchers not only helped spread the results broadly, it also provided a means for collaborative work to solve the problems raised by the research.

When Davies thought out his ideas for message switching, he decided on the name "packet switching" to describe how a message is broken up into parts that he called packets, then transported across this new form of computer communications network, and then reassembled at the other end. Davies describes how he scheduled a talk to see if there was interest in the subject. He announced that he would give a lecture in March 1966 at NPL. Describing the surprising response he received to his announcement of the lecture, he writes (83):

     I don't recall that I actually invited anybody by name. It's very likely that they came as a result of a general invitation going out on the notice board...(I)t gave us a tremendous surprise when about a hundred and twenty people turned up and overfilled the lecture theater. It was something really very unusual to have people standing at the back and so on, so this immediately told me that the subject was one of general interest....

Several people from the British Post Office (which included the telephone company) attended Davies' lecture, and he was surprised that they didn't dismiss the idea as he would have expected (84):

     In fact, the interest of the Post Office rather surprised me. The fact that so many people turned up and that they came from a fairly high level.

Davies describes how a member of the UK Ministry of Defence attending the lecture asked if he knew of the work "On Distributed Communications" by Paul Baran of Rand in the U.S. This introduced Davies to Baran's research, which proposed a similar kind of distributed networking communication, but for voice rather than data. (85)

Davies also wrote up his ideas in a report, "Proposal for a Digital Communication Network," in June 1966. The report was distributed in the UK and abroad. Davies remembers that a copy went to Larry Roberts at IPTO. (86)

Time-sharing development had spread in the late 1960s and there were subsequent industry efforts to offer a form of commercially available computing power to people. By 1966, the need to connect time-sharing systems became a goal of IPTO research. And research by Len Kleinrock in queuing theory, along with Davies' research and Baran's, helped to convince Roberts that creating a packet switching network would be a fruitful line of research. (87)

Trying to understand the influence of his ideas, Davies asked Larry Roberts what influence his report had had on Roberts. Davies explains that (88):

     The impression I got, from the things he (Roberts) said, was that it was not so much the technical ideas in my report, but the fact that we were enthusiastic and believed it would work and that it could be made to operate quite easily. That is, I think, the main criticism that we heard at the time. It would work, but how difficult would it be first to convince people to adopt an entirely new way of communication. And secondly, all experience in message-switching showed that the software problems were very difficult.
     Well this is absolutely true, they are.

Davies circulated his early packet switching proposal report "with the aim of reaching all the people who might be informed....Now one copy of it certainly went to Larry Roberts," he remembers, "and when I visited him in the Pentagon on one occasion, it was lying on his desk in tatters. It had obviously been very heavily thumbed and turned over, and he grilled me on a number of aspects of it." (89)

Thus the marriage of computer and communications research, which Davies describes as the challenge represented by the effort to create a packet switching network, became the next focus for IPTO. The wide dissemination of IPTO research and the interaction with researchers like Davies in other countries helped to set the agenda for future research goals at IPTO.

3. The New Research Focus at IPTO

In 1966, Robert Taylor, the head of IPTO, brought Larry Roberts to work at IPTO to develop a packet switching network. IPTO funded BBN to develop a packet switching subnetwork of minicomputers called IMPs (or Interface Message Processors). (90)

Dave Walden, who worked on the project, describes the advance that packet switching represented. Even during this early period, there was a realization that research on packet switching would have a significant effect on the future. Following is an excerpt from an interview with Walden conducted in 1990, where he describes the thoughts among those working on developing the ARPANET about the future social impact of these developments. Walden recalls (91):

     We believed, I think, that we were going to change the way communication was done; we had no doubt. However superficial our analysis or however profound our thinking might have been, we were convinced that packet switching was the technology that was going to change the world. It was going to go everywhere. And it has proven to be correct. We knew that very early. It is such a natural...tradeoff is the wrong word...combination of the good aspects of circuit switching and the good aspects of message switching. It so dominates either in all but a fraction of the cases. And the technology trends were just going to make it be better and better.

     What really happened, I don't know...what I think really happened is that computers got fast enough so you could do switching with software rather than switching with hardware. Up until that time, the computers were too slow and therefore to make the switches go at microsecond and millisecond times you had to do your switches with hardware. Well, switching with hardware can't be very complicated. It's that simple. Suddenly, we could do our switching with software. I always knew, I think we always knew, although we didn't always say this out loud, that the part of packet switching which is buffering the individual packets was an engineering tradeoff that was right for the time. The fact that we were doing switching with software was the bigger key. And what we are moving now with the software switching, without the local buffering, things go flying through external buffers and there's fiber cables switched across the country on microsecond bases, this whole gigabit networks technology. I think that from very early on we understood, probably Kahn understood it before that, I understood it, let's put it that way, from very early that the key thing here was that we were doing the switching with software. Because the computers now went fast enough.
     As it happened, the memories were cheap enough so you could do the local buffering, but it was only a matter of years - whether it was ten or twenty - before the hardware would become cheap enough so you wouldn't necessarily have to buffer the stuff locally. But you still had the software based switching. Computers got to that point, you do the switching with software. Didn't have to take it in and store it for an hour and look it up in some big table somewhere. Well, the line costs were coming down. So the fact was that you were using them a little inefficiently. The bandwidths were going up and the relative costs were going down so the fact that you only used them at 60% of capacity wasn't a disaster. Message switching was optimized for let's fill the lines up, every hour of every day.

     I think there is another thing that I believe Roberts understood, Kahn, and those people who are real thinkers understood early on, certainly, the rest of us realized quite quickly, which is that putting in this infrastructure would change the way the world worked, not the communication world, but the way people worked. From the first time we sent a message across the network or wrote a paper across the network, none of us had any doubt that what you are seeing today with thousands of distribution lists and virtual networks and worldwide queries was going to happen. So we thought we were changing the world, I think.

Walden describes how the ARPANET research work involved the marriage of communications research and real-time computer systems research. He notes that Bob Kahn had come to BBN to do research in communications. Kahn had a Ph.D. in applied mathematics and communications theory from Princeton and had been on the faculty at MIT. Others like Frank Heart, William Crowther and Walden had worked on real-time computer systems at Lincoln Labs. And Severo Ornstein was a computer hardware expert who had also worked with Heart at Lincoln Labs. Along with other skilled software engineers like Bernie Cosell, Alex McKenzie, Ben Barker, Jim Geisman, Martin Thorpe and Truett Thach, they formed the "IMP guys" at BBN.

The BBN group won the IPTO contract to build a prototype packet switching network to connect the different ARPA/IPTO contractors so that they could share their computer resources and collaborate. By 1971, they had several nodes connected across the US. And their research was being presented at conferences and published in journals, raising interest in the US and around the world in the important new development that packet switching networks promised for the future. (92)

Stephen Crocker, a graduate student at UCLA doing research developing the ARPANET, went to work at IPTO in July 1971. Reflecting on the widespread interest domestically and internationally in computer networking during this period, he reports:

     The first year I was at ARPA, I wound up going to Europe three times...And a lot of it had to do with networking; I was invited to talk.

Even as a graduate student at UCLA doing networking research, he had received many invitations to talk. He writes (93):

     When I got involved originally in networking, I got invitations to talk nearly everywhere. So in addition to traveling to different sites, there were different meetings. It was a hot topic; everybody wanted to hear about it. This continued at ARPA, and it threatened to continue internationally.
4. Utilizing the ARPANET at IPTO

Also Crocker explains that those working at IPTO explored how they could utilize the developing ARPANET to help with the work at IPTO. And this in turn changed the way the work was done at IPTO. He observes (94):

     Networking really changed the character of the office, because now there were all these cross-currents to connect people together, and networking became a pervasive technology.

For example, IPTO used a new conferencing system called Forum to interview someone who was considering a job offer at IPTO. Crocker observed how the online terminals made it possible for several different people from IPTO, including Bob Kahn, then a program manager; Stephen Lukasik, then director of ARPA; Tachmindji, deputy director of ARPA; Larry Roberts, then director of IPTO; and Crocker, then an IPTO program manager, to talk with a job applicant from their different homes. Crocker remembers (95):

     It was really a neat experience because several threads were going on in the conversation, and you could keep track of them, and the bandwidth was actually higher than if we were all in one room where only one person at a time could talk. Here everybody could type whatever they wanted and the paragraphs would just come trooping out. You can read faster than you can type, so it was more exciting than you would think.

5. IPTO Demonstrates the ARPANET

While the ARPANET was connecting computers during this early period in its development, there was still little that could be done with it. To change this situation, Bob Kahn and Larry Roberts considered proposing a demonstration of the ARPANET and of packet switching at an upcoming conference. The first International Conference on Computer Communication was planned to take place at the Hilton Hotel in Washington, DC in October 1972. Kahn used the event to encourage the different ARPANET sites to develop ways to utilize the network. He devoted a year to planning and working with the grassroots IPTO community to create interesting programs and demonstrations. Also Kahn got different vendors to cooperate by contributing equipment. Describing the demonstration, Kahn remembers (96):

     Actually, that demonstration was what made the ARPANET real to others, because there was a lot of skepticism before people could see that packet switching would really work.... It just did not quite seem like you could communicate by breaking messages into packets and shipping them; circuits seemed to be much more reliable to the untrained eye. Well the demonstration was a major success. We had many different vendors contributing terminals. We had just about everybody involved in networking at the time there. Over a thousand attendees came through the conference. It was a hands-on, live demonstration. Many of the people who were involved are well-known leaders in the field today. It was a major event. It was a happening.

(to be continued)

Footnotes

54. An Interview with J.C.R. Licklider, conducted by William Aspray and Arthur Norberg, October 28, 1988, Charles Babbage Institute, pg. 25.

55. Licklider interview, pg. 26. In the interview, Norberg asked Licklider: Is it possible to separate out military interests from the interests of this community around Cambridge in the use of computers and meeting objectives? Licklider responded: I think of it this way: you can't make any clear cuts if you look at the thing in terms of big block diagrams, because what the military needs is what the businessman needs is what the scientist needs.
But look more sharply -- look ahead a block. Take speech understanding, for instance. Here, the scientist wants continuous discourse, wants not to transfer the individual person. The military person, or the intelligence person, wants to recognize a few critical words -- "We'd like to be able to pick out Secretary of Defense" -- but is less interested in the dictating machine. So when you get down to the specific task they're really quite different. So take a project done around here: making a computer simulation of a Morse Code operator, so that you can hook the computer in the net with people. That's an artificial intelligence problem, and academics get tremendously interested in it. They are simulating in it the planning capabilities of the person, as well as the reception of Morse Code. Military people want something that will work, and not something that will advance the theory of how to do AI. But they will both be happy with exactly the same project if it has both facets. (Interview, pg. 26)

56. Wiener's book was "God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion", Cambridge, MA, The MIT Press, 1964.

57. Licklider's paper "Man-Computer Symbiosis", IRE Transactions on Human Factors in Electronics, vol. HFE-1, March 1960, pg. 4-11. In the Babbage Institute interview with Licklider, he explains that "people like Minsky and McCarthy were primarily interested in artificial intelligence, and tended to view man-computer interaction as a neat and convenient thing to make it possible to write AI programs, while I thought there was going to be this interval between man's thinking about himself and machines taking over. I do not know how long the interval was, but it looked like a considerable interval, when working with the computer was of the essence. So, in short, I really believed it, and quite a few people in the area here thought that something really great was going to happen." (pg. 21)

58. See the program proposed in Licklider's paper "Man-Computer Symbiosis". See also Norberg, pg. 133.

59. Norberg, pg. 133.

60. The term used during this period to describe the science of computer science was information processing.

61. Licklider interview, pg. 9. See also Chapter 6 in Netizens: On the History and Impact of Usenet and the Internet, IEEE Computer Society Press, 1997, pg. 79-82.

62. Licklider interview, pg. 31. Also Licklider describes his intention to create a research infrastructure with ONR and AFOSR in Licklider interview, pg. 23.

63. Ibid., pg. 32.

64. Norberg, pg. 135.

65. Ibid.

66. Ibid., pg. 136.

67. Ibid.

68. Ibid.

69. Ibid.

70. Ibid.

71. Ibid., pg. 136-137.

72. Ibid., pg. 137.

73. Ibid.

74. Licklider interview, pg. 27.

75. Licklider interview, pg. 34.

76. Licklider interview (page?)

77. Newell interview, pg. 23-25.

78. An Interview with Donald W. Davies, conducted by Martin Campbell-Kelly, 17 March 1986, National Physical Laboratory, pg. 6.

79. See for example Marill, Thomas, and Lawrence G. Roberts, "Toward a Cooperative Network of Time-Shared Computers," Proceedings-Fall Joint Computer Conference, AFIPS 29, 425-431, Washington, DC, Spartan Books, 1966, and the interview with Davies: "Actually, most of the discussions tended to be about the operating system aspects, but certainly the mismatch between time-sharing and the telephone network was mentioned. It was that which sort of triggered off my thoughts, and it was in the evenings during that meeting that I first began to think about packet-switching." (pg. 6)

80. Davies interview, pg. 6.

81. Ibid.

82. "Resource Sharing Computer Communications Networks," in Proceedings of the IEEE, Nov.
1972, pg. 1398.

83. Davies interview, pg. 8.

84. Ibid.

85. Ibid. See also Norberg and O'Neill, pg. 235-236.

86. There's a helpful description of packet switching in Norberg and O'Neill, pg. 234. They explain:

     A new approach to connecting computers was needed because the existing communication systems and terminal-oriented computer networks were too limited for the requirements of time-sharing systems. Existing store-and-forward message systems did not provide interactive response. Connecting each time-sharing system to all of the others was too expensive. For example, to fully interconnect eighteen computers would involve over 150 leased lines. The sporadic nature of interactive computer use, coupled with the way telephone service was charged, made dedicated circuits uneconomical. A new scheme was needed. Packet-switching was a solution to these problems. Packet-switching, as applied to the problem of connecting interactive computers, was an innovative application of existing communication techniques to a new problem. Roberts described it this way: "packet-switching technology was not really an invention, but a reapplication of the basic dynamic-allocation techniques used for over a century by the mail, telegraph and torn paper tape switching systems." (Larry Roberts, "The Evolution of Packet Switching", Proceedings of the IEEE, 66 (November 1978), 1307-1313.)

     In packet-switching networks, messages were broken into discrete parts instead of sending the entire message intact through the store-and-forward system, as was done in message switching systems. Each discrete part (or packet) contained header and control information along with the text. Header information allowed routing of the packets by specifying such items as identification of the source and destination of the packet. Control information, such as checksums, was used for error checking. Each packet was put into the correct format and sent out into the network. A packet was temporarily stored at the next location, then sent out from that location, and continued to be stored and then forwarded until it reached its final destination. By using packets the data travelled over lines that were shared intermittently by others; packet-switching did not require a dedicated end-to-end circuit for the duration of the message. The packets from many different messages could be sent on a line, so more than one user could send information to the same location at approximately the same time on the same line. The network handled the decomposition of messages into packets at the source and their reassembly into messages at the destination, including all checking, retransmission, and error control. Because packets were routed through the system to locations that were not directly connected by a circuit, fewer nodes needed to be directly connected than would be required in a totally connected network. See Vinton G. Cerf, "Packet Communication Technology," in Franklin F. Kuo, ed., Protocols and Techniques for Data Communication Networks, Englewood Cliffs, NJ: Prentice-Hall, 1981, pg. 1-34. For example, if node A was connected to nodes B and C, nodes B and C could communicate through A without being directly connected to each other. There were many different paths that a packet could take when routed through a network. One way of organizing the network was to choose the path in advance, based on the destination of the packet.
     Alternatively, a packet could be dynamically routed through the network, so that at each node a routing decision was made based on the destination and the status of the network. Since in dynamic routing the path was not preassigned, each leg in the path was chosen only when the packet needed to be sent on to the next leg. This arrangement avoided unavailable nodes. (Norberg and O'Neill, pg. 234-235)
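As a concrete illustration of the routing just described, here is a minimal Python sketch. The four-node topology is made up for the example, and this is not the ARPANET's actual routing algorithm; it simply finds a path over whichever nodes are currently up, so that B and C communicate through A, and traffic reroutes around A when A is unavailable.

    # A toy sketch of routing in a store-and-forward network; hypothetical
    # topology, not the actual IMP routing algorithm. Nodes B and C are not
    # directly connected but can communicate through A, and when a node is
    # unavailable the path is chosen over the nodes that remain up.
    from collections import deque

    # Adjacency list: which nodes each node has a direct line to.
    links = {
        "A": ["B", "C"],
        "B": ["A", "D"],
        "C": ["A", "D"],
        "D": ["B", "C"],
    }

    def route(src: str, dst: str, down: set[str]) -> list[str]:
        """Find a path from src to dst using only nodes that are up
        (breadth-first search, so the path found has the fewest hops)."""
        frontier = deque([src])
        paths = {src: [src]}            # best known path to each node
        while frontier:
            node = frontier.popleft()
            if node == dst:
                return paths[node]      # the packet can be delivered
            for neighbor in links[node]:
                if neighbor not in paths and neighbor not in down:
                    paths[neighbor] = paths[node] + [neighbor]
                    frontier.append(neighbor)
        raise RuntimeError("destination unreachable")

    print(route("B", "C", down=set()))   # ['B', 'A', 'C']: through A
    print(route("B", "C", down={"A"}))   # ['B', 'D', 'C']: A is down, reroute

In fully dynamic routing, each node would make the next-hop choice independently for each packet as it arrived, based on its current picture of the network, rather than computing the whole path in advance as this sketch does.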
87. Davies interview, pg. 9. Davies explains why he didn't formally publish the paper, though the paper was widely circulated: "I suspect that it would have been quite difficult to get it published in a learned journal because it doesn't have much in the way of original ideas in it. Maybe if I'd gone to a lot of trouble to rewrite it I could have got it published in a prestigious journal. At that time it was certainly much easier to get papers published in conference proceedings and I took the easy way out. I think that's the trouble - I mean conference proceedings you can easily get published, but they don't make such good references for the future. Of course ARPA was very similar. ARPA papers were mainly published in conference proceedings, so we were both doing the same thing. And that particular paper would have needed complete rewriting. In fact I published half a dozen papers of similar kind in various conference proceedings around that time. But you're quite right I think - a bit more care in publishing that much earlier would have been valuable. Of course the right types of journals didn't exist. The ACM I think would have been very snooty about it. The only place I can think of that would have taken it in those days would have been the Transactions of the IEEE. You could get papers published within six months there, so I might have done better to have done that." (Davies interview, pg. 11-12) Also Davies explains that though he saw the report on Roberts' desk in the Pentagon when he visited him there on one occasion, and Roberts questioned him about it, Davies felt that for Roberts the report was more important as an encouragement than for the technical ideas it contained.

88. Davies interview, pg. 10. Also Davies explains that the reason they could be enthusiastic was that their conception of how to do so was a simple model, different from the human model: "the human system...was really quite complex and involved a certain amount of subtlety in dealing with particular kinds of failures. The idea of sending messages - packets - and relying on lack of any response to send them again was unlike the way human systems work. I believed that by doing it in a number of layers in this way - having an end to end protocol, and protocol over each link and so on - one would make software problems a lot easier, and I think it did turn out to be so. The general impression was that software problems would prove to be quite intractable....That was at the stage where we already had some experience of the software problems at NPL, so we were quite convinced we could actually make it work."

89. Ibid., pg. 9.

90. In his interview, Licklider describes how he tried to start a networking project at UCLA during his first turn at ARPA. However, it didn't succeed. Norberg and O'Neill report that ARPA head Bob Sproull described ARPA's intention to set up a computer network in Congressional hearings in spring of 1965. They quote Sproull:

     a computer network is now being defined which will distribute computation equipment for passive defense much as modern military communications systems distribute circuits and equipment for reliability in the face of enemy action. This network will serve as a test bed for various computer networking concepts. We can foresee the day when such computer networks will automatically distribute computation and information to diverse users.

(From House Subcommittee on Appropriations, Department of Defense Appropriations for Fiscal Year 1966, Hearings, 89th Cong., 1st sess., March 30, 31, April 5, 7, 9, 13, 1965, pg. 535, quoted in Norberg and O'Neill, pg. 57.)

91. An Interview with David Walden, conducted by Judy O'Neill on 6 February 1990, Cambridge, MA, Charles Babbage Institute.

92. See for example the ICCC '72 Proceedings.

93. Steve Crocker interview, Charles Babbage Institute, pg. 21-22.

94. Ibid., pg. 24.

95. Ibid., pg. 30.

96. Interview with Robert Kahn by William Aspray, Charles Babbage Institute, pg. 5-6.

Last updated January 23, 2000

part I   http://www.columbia.edu/~rh120/other/arpa_ipto.txt
part II  http://www.columbia.edu/~rh120/other/basicresearch.txt
part III http://www.columbia.edu/~rh120/other/centers-excellence.txt
part IV  http://www.columbia.edu/~rh120/other/computer-communications.txt
part V   http://www.columbia.edu/~rh120/other/birth_internet.txt