Thank you, Patrick Collison
Patrick Collison spoke about this book on a podcast I listened to (I forget which one). He talked about how the tech industry doesn’t look back much on its history; it mostly looks forward, which is both a positive and a negative.
Patrick was recommended the book by Alan Kay, who said it was one of the rare books that got the history of Xerox PARC right. Patrick bought it on Amazon and loved it so much that he started gifting it to a bunch of people, eventually exhausting Amazon’s supply. The book was out of print, so Stripe Press republished it, and thanks to them we have this gem in a beautiful hardcover format.
So, thanks to Stripe Press for reprinting the book, to the Sloan Foundation for funding it, and to Mitchell Waldrop for researching and writing it over nine years! Without it, I think we could have lost key stories about the creation of many precursors to the Internet.
I think Licklider has more claim than any other single person to being the individual causally responsible for its [the Internet’s] creation.
– Patrick Collison (source)
Summary
The book is about the life and impact of JCR Licklider. Who’s that? From the Internet Hall of Fame:
In 1962, Dr. Joseph Carl Robnett Licklider formulated the earliest ideas of global networking in a series of memos discussing an “Intergalactic Computer Network.” Both well-liked and well-respected, he demonstrated an amazing prescience many times over. His original and far-sighted ideas outlined many of the features the Internet offers today: graphical computing, user-friendly interfaces, digital libraries, e-commerce, online banking, and cloud computing.
In 1963, while he was serving as director at the U.S. Department of Defense Advanced Research Projects Agency (ARPA), it was Dr. Licklider’s persuasive and detailed description of the challenges to establishing a time-sharing network of computers that ultimately led to the creation of the ARPAnet. His 1968 paper called “The Computer as a Communication Device” illustrated his vision of network applications and predicted the use of computer networks for communications. Until then, computers had generally been thought of as mathematical devices for speeding up computations.
The book beautifully weaves together the stories, people, beliefs, politics, and breakthroughs he was associated with.
From this Quora answer by Alan Kay (emphasis mine):
A good book (pretty much the only good book) to read about the research community that Parc was a part of is “The Dream Machine” by Mitchell Waldrop. There you will find out about the ARPA (before the “D”) IPTO (Information Processing Techniques Office) set up in 1962 by the visionary JCR Licklider, who created a research community of 15 or 16 “projects”, mostly at universities, but also a few at places like RAND Corp, Lincoln Labs, Mitre, BBN, SDC, etc.
There was a vision: “The destiny of computers is to become interactive intellectual amplifiers for everyone in the world pervasively networked worldwide”.
A few principles:
- Visions not goals
- Fund people not projects — the scientists find the problems not the funders. So, for many reasons, you have to have the best researchers.
- Problem Finding — not just Problem Solving
- Milestones not deadlines
- It’s “baseball” not “golf” — batting .350 is very good in a high aspiration high risk area. Not getting a hit is not failure but the overhead for getting hits. (As in baseball, an “error” is failing to pull off something that is technically feasible.)
- It’s about shaping “computer stuff” to human ends per the vision. Much of the time this required the researchers to design and build pretty much everything, including much of the hardware — including a variety of mainframes — and virtually all of the software needed (including OSs and programming languages, etc.). Many of the ARPA researchers were quite fluent in both HW and SW (though usually better at one than the other). This made for a pretty homogeneous computing culture and great synergy in most projects.
- The above goes against the commonsense idea that “computer people should not try to make their own tools (because of the infinite Turing Tarpit that results)”. The ARPA idea was a second order notion: “if you can make your own tools, HW and SW, then you must!”. The idea was that if you are going to take on big important and new problems then you just have to develop the chops to pull off all needed tools, partly because of what “new” really means, and partly because trying to do workarounds of vendor stuff that is in the wrong paradigm will kill the research thinking.
- An important part of the research results are researchers. This extends the “baseball” idea to human development. The grad schools, especially, generally admitted people who “seemed interesting” and judgements weren’t made until a few years down the road. Many of the researchers who ultimately solved most of the many problems of personal computing and networking were created by the ARPA community.
Behind the “mother of all demos”
One example of the projects Lick funded was Douglas Engelbart’s “mother of all demos” (which included hypertext, email, and the mouse):
once [Lick] had given him his first real funding—and vigorously defended him to his higher-ups—Engelbart, with his group, would go on to invent the mouse, on-screen windows, hypertext, full-screen word processing, and a host of other innovations. Engelbart’s December 1968 presentation at a computer meeting in San Francisco would blow about a thousand minds at once—and later be remembered as one of the turning points in computer history, the moment when the rising generation of computer professionals at last began to understand what interactive computing could do. By no coincidence, this was also the rising generation whose members had had their graduate educations supported by Lick and his successors at the Pentagon—and a talented portion of which would soon be gathering at PARC, the Xerox Corporation’s legendary Palo Alto Research Center.
(Fun fact: Stewart Brand, of whom I am also a huge fan, was helping Engelbart behind the scenes during it!)
The Legendary Norbert Wiener
Yet when he went back to public school two years later, he was seven years ahead of his age group—thus his entry into nearby Tufts University at eleven. In 1913 he received his Ph.D. in mathematics from Harvard, at age eighteen. From then until 1917, when the United States entered World War I, he did postdoctoral work at Cornell, Columbia, Cambridge, Göttingen, and Copenhagen universities, studying with such figures as the philosopher-mathematician Bertrand Russell and the great German mathematician David Hilbert. In 1918, as a clumsy, hopelessly nearsighted, but intensely patriotic twenty-four-year-old, he put his mathematical skills to use as an army private at the Aberdeen Proving Ground in Maryland, where he calculated artillery trajectories by hand. And in 1920, after briefly working as a journalist for the Boston Herald to tide himself over between jobs, he joined the mathematics faculty at MIT. It was not a prestigious appointment. MIT’s transformation still lay in the future, and the mathematics department existed mainly to teach math to the engineering students. The school wasn’t oriented toward research at all. However, no one seems to have informed Wiener of that fact, and his mathematical output soon became legendary. The Wiener measure, the Wiener process, the Wiener-Hopf equations, the Paley-Wiener theorems, the Wiener extrapolation of linear time series, generalized harmonic analysis—he saw mathematics everywhere he looked. He also made significant contributions to quantum theory as it developed in the 1920s and 1930s. Moreover, he did all this in a style that left his more conventional colleagues shaking their heads. Instead of treating mathematics as a formal exercise in the manipulation of symbols, Wiener worked by intuition, often groping his way toward a solution by trying to envision some physical model of the problem. He considered mathematical notation and language to be necessary evils at best—things that tended to get in the way of the real ideas.
[…]
Legend has it that he was once wandering along a hallway, distractedly running his finger along one wall, when he encountered the open doorway of a classroom. Lost in thought, he followed his finger around the doorjamb, around all four sides of the room, and out the door again without ever once realizing that a lecture was in progress.
Claude Shannon’s Impact
And that’s why “A Symbolic Analysis of Relay and Switching Circuits” is arguably the most influential master’s thesis of the twentieth century: in it Claude Shannon laid the theoretical foundation for all of modern computer design, nearly a decade before such computers even existed.
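(A quick aside before the next excerpt, since the book doesn’t go into the mechanics: the core observation of the thesis was that Boolean algebra maps directly onto relay circuits, with switches in series acting as AND and switches in parallel as OR. A minimal Python sketch of that correspondence; the function names are mine, for illustration only:)

```python
# Shannon's insight: Boolean algebra describes switching circuits.
# Switches wired in series implement AND; in parallel, OR.

def series(a: bool, b: bool) -> bool:
    """Current flows only if both switches are closed: AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Current flows if either switch is closed: OR."""
    return a or b

def relay_not(a: bool) -> bool:
    """A relay's normally-closed contact inverts its input: NOT."""
    return not a

# From AND, OR, and NOT any logic function can be wired up,
# e.g. XOR, the heart of a binary adder:
def xor(a: bool, b: bool) -> bool:
    return parallel(series(a, relay_not(b)), series(relay_not(a), b))

for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(xor(a, b)))
```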
[…]
Nonetheless, as Shannon would point out in his 1948 paper, the separation of information and meaning did have the virtue of putting the interpretation of meaning where it belonged: in the brains of the people sending and receiving the message. The engineers’ job was merely to get the message from here to there with a minimum of distortion, whatever it might say. And for that purpose, the digital definition of information was ideal because it allowed for a precise mathematical analysis via questions such as, What are the fundamental limits of a given communication channel’s carrying capacity? How much of that capacity can be used in practice? How much is it degraded by the inevitable presence of noise in the line? What are the best and most efficient ways to encode the information for transmittal in the presence of noise?
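(The excerpt raises the capacity question without stating the answer. Shannon’s 1948 paper gave it; for a noisy analog channel it’s usually written as the Shannon–Hartley law. The formula is my addition here, not quoted from the book:)

$$ C = B \log_2\!\left(1 + \frac{S}{N}\right) $$

where $C$ is the channel capacity in bits per second, $B$ the bandwidth in hertz, and $S/N$ the signal-to-noise power ratio. Below rate $C$, arbitrarily reliable communication is possible in principle; above it, it is not.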
[…]
Just a few years earlier, with his master’s thesis on logic and switching, Shannon had laid the foundation for all of modern computer circuitry; now, working on his own in the middle of the war, he had quietly done the same for most of the rest of the modern digital world. Ultimately his fundamental theorem explains how, for example, we can casually toss around compact discs in a way that no one would have ever dared do with long-playing vinyl records: error-correcting codes inspired by Shannon’s work allow the CD player to eliminate noise due to scratches and fingerprints.
Shannon’s theorem likewise explains how error-correcting computer modems can transmit data at the rate of tens of thousands of bits per second over ordinary (and relatively noisy) telephone lines. It explains how NASA scientists were able to get the Voyager spacecraft’s imagery of the planet Neptune back to Earth across two billion miles of interplanetary space. And in general, it goes a long way toward explaining why today we live in an increasingly digital world—and why the very word digital has become synonymous with the highest possible standard in data quality and reliability.
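(As a sanity check on that modem claim, plugging rough analog phone-line numbers into the Shannon–Hartley formula lands right at “tens of thousands of bits per second.” A minimal sketch; the ~3 kHz bandwidth and ~30 dB SNR figures are my assumptions, not the book’s:)

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley channel capacity in bits per second."""
    snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr)

# Rough figures for an ordinary analog phone line (assumed):
# ~3 kHz of usable bandwidth, ~30 dB signal-to-noise ratio.
print(f"{shannon_capacity(3_000, 30):,.0f} bits/s")  # ~29,900 bits/s
```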
The Legendary von Neumann
In the single year of 1927, for example, while still a mere instructor at the University of Berlin, von Neumann had put the newly emerging theory of quantum mechanics on a rigorous mathematical footing; established new links between formal logical systems and the foundations of mathematics; and created a whole new branch of mathematics known as game theory, a way of analyzing how people make decisions when they are competing with each other (among other things, this field gave us the term “zero-sum game”). Indeed, Janos Neumann, as he was known in his native Budapest, had been just as remarkable a prodigy as Norbert Wiener. The oldest of three sons born to a wealthy Jewish banker and his wife, he was enthralled by the beauty of mathematics from the beginning. He would discuss set theory and number theory on long walks with his childhood friend Eugene Wigner, himself destined to become a Nobel Prize–winning physicist.
Once, when he noticed his mother pause in her crocheting and gaze contemplatively into space, Janos asked her, “What are you calculating?” His own powers of mental calculation were legendary. A certain renowned mathematician once struggled all night to solve a problem with the help of a desk calculator; the next morning von Neumann solved the same problem in his head—in six minutes. His memory was equally prodigious. Asked to quote the opening chapter of A Tale of Two Cities, which he had read many years before, von Neumann continued to recite without a pause or a mistake for more than ten minutes—until his audience begged him to stop. He was likewise gifted in languages: by the age of six he was making jokes with his father in classical Greek. And he had a knack for fitting in with the culture around him. In Budapest his family had been prosperous enough to earn the Hungarian honorific “Margittai” from the Austro-Hungarian emperor, so in Göttingen and Berlin, he transmuted both name and honorific into an aristocratic-sounding “Johannes von Neumann.” And then in 1930, when the lack of tenured professorships in Europe led him to immigrate to Princeton University in the United States, he quickly became Johnny von Neumann, a cheerful, party-going lover of funny hats, jokes, puns, and limericks, the racier the better. The Americans responded in kind. Von Neumann’s brilliance was obvious to everyone; it took him precisely one year to achieve tenure at Princeton.
Edward Teller:
For all of that, however, von Neumann remained a member of the AEC until the last. Indeed, the dying man was so central to the nation’s nuclear-weapons program that he could be attended only by air force orderlies with top-secret security clearance, as there was considerable concern that his pain and mental distraction might lead him to babble classified information. Years later AEC chairman Lewis Strauss told of one final meeting at Walter Reed Hospital near the end: “Gathered around his bedside and attentive to his last words of advice and wisdom were the Secretary of Defense and his Deputies, the Secretaries of the Army, Navy and Air Force, and all the military Chiefs of Staff. … I have never witnessed a more dramatic scene or a more moving tribute to a great intelligence.”
What to ask your kids when you come home from work
“What have you done today that was altruistic, creative, or educational?”
The Bit
But in the meantime he had considerably better luck with another word. Over lunch one day in late 1946, a group of Bell Labs researchers were grousing about the awkwardness of the term “binary digit” and deploring the lack of any good substitute (existing proposals included hybrids such as binit and bigit, both considered loathsome). But then the statistician John Tukey joined the discussion. “Well,” he asked with a grin, “isn’t the word obviously bit?” And it was. Shannon liked the new word so much that he started using it himself and gave Tukey the credit in his 1948 paper—which, according to the Oxford English Dictionary, marked the first time the word appeared in print.
The Vision
When Lick later spoke about the power of computers to transform human society on an epic scale, about their potential to create a future that would be “intellectually the most creative and exciting [period] in the history of mankind,” he sounded very much like Norbert Wiener in prophet mode. Lick likewise echoed Wiener as he worried about technology’s potential for harm: “If all the Industrial Revolution accomplished was to turn people into drones in a factory,” he was sometimes heard to say, “then what was the point?” Indeed, Lick’s entire later career in computers can be seen as a thirty-year exercise in the human use of human beings, an effort to eliminate mind-numbing drudgery so that we could be free to use our full creative powers.
Steering People
Indeed, Lick was already honing the leadership style that he would use to such effect a decade later with the nationwide computer community. Call it rigorous laissez-faire. On the one hand, like his mentor Smitty Stevens, Lick expected his students to work very, very hard; he had nothing but contempt for laziness and no time to waste on sloppy work or sloppy thinking. Moreover, he insisted that his students master the tools of their craft, whether they be experimental technique or mathematical analysis. On the other hand, Lick almost never told his students what to do in the lab, figuring that it was far better to let them make their own mistakes and find their own way. And imagination, of course, was always welcome; the point here was to have fun.
[…]
But what if he got hit by a bus? No, Lick realized, if this vision was ever going to outlast his tenure at ARPA, he would somehow have to forge all these groups into a self-reinforcing, self-sustaining community. Putting MIT to work on the summer study had been one big step in that direction. And by the spring of 1963, he had taken another step by arranging to meet periodically with the leaders of all the groups he was underwriting—people such as Fano, McCarthy, Uncapher, Engelbart, Feigenbaum, and Perlis, who were known as the principal investigators, or PIs. “[The idea was that] we would get our gang together frequently and have special Computer Society meetings,” Lick said. “There would be lots of discussion, and we would stay up late at night, and maybe drink a little alcohol and such. So we would have one place interact with another place that way.” But Lick also had the kernel of an even better idea, if they could ever get it working. Early in his tenure—on Thursday, April 25, 1963—he laid it out for the PIs in a long, rambling memorandum that he dictated just before he rushed off to catch an airplane. “[To the] Members and Affiliates of the Intergalactic Computer Network,” he began.
[…]
Then, too, the faculty members were being their usual helpful selves. “In a place like MIT, nobody, but nobody, will tell you that you’re doing a good job,” says Dertouzos. “Lick was the only one who understood the loneliness, the only one who would come to me, regularly, after every meeting, and say, ‘Mike, you’re doing a great job.’ So he gave me a kind of paternal reinforcement, a sense of someone’s having faith in me.”
Predominantly women writing air-defense software
Indeed, it soon became all too clear that the air-defense software was going to need something like two thousand programmers, which presented a problem, to put it mildly. First, MIT had no desire to put so many people on the Lincoln Lab payroll for a single project. One day the air-defense programming would be finished, and then what would they do? IBM had much the same reaction, as did Bell Labs, another subcontractor. The upshot was that the programming responsibility, along with many of the original Lincoln Lab programmers, was eventually transferred to Santa Monica and the RAND Corporation’s system development division, which in December 1956 would break away and become the independent Systems Development Corporation. Second, in the early 1950s there probably were no more than a few thousand programmers in the whole country.
So the SAGE project soon found itself in the business of mass education. Special programming courses were set up at MIT, IBM, and RAND, and people from every walk of life were invited to enroll. The trainers quickly discovered that it was impossible to predict who their best pupils would be—not even professional mathematicians were a sure bet; they often lost patience with the details—but it was very easy to spot the talented ones once they got started. As a general rule of thumb, for example, music teachers proved to be particularly adept. And much to the project leaders’ astonishment (this being the 1950s) women often turned out to be more proficient than men at worrying about the details while simultaneously keeping the big picture in mind. One of the project’s best programming groups was 80 percent female.
Taking a risk
Eighty-five percent!? That figure did more than give Lick pause. It seems to have hit him with the force of a religious epiphany: our minds were slaves to mundane detail, and computers would be our salvation. We and they were destined to unite in an almost mystical partnership: thinking together, sharing, dividing the load. Each half would be preeminent in its own sphere—rote algorithms for computers, creative heuristics for humans. But together we would become a greater whole, a symbiosis, a man-machine partnership unique in the history of the world: “The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.” However he got there, this was the watershed of Lick’s life, the Great Idea that was the culmination of all that had gone before and the vision that would guide him from then on. Granted, it would be quite a while before he could articulate this notion as clearly as he could feel it. But what he did know, and very quickly, was that the TX-2 was it. Once he’d had a few sessions at the console with Wes Clark to learn how the machine worked, he said, “I saw that I had really got to do that.” But sadly, he added, “I also saw that I was not going to do that trying to build a psychology department at MIT.” This is the part that still has Lick’s colleagues from the psychology years shaking their heads in bewilderment. Frustration with MIT? Sure—who wouldn’t be frustrated with Dean Burchard? But to give up a career he’d spent twenty years building, right at the moment when cognitive psychology was taking off? To give up tenure? For computers? It was beyond belief.
Alan Kay
Vinton Cerf, Patrick Winston, John Warnock, Danny Cohen, Bob Balzer—the roster of that meeting would eventually read like a who’s who of modern computing, and many of the friendships they forged in the cornfields would endure down to the present day. But for those three days in the Illinois summer, they were young, wild, and crazy. And by all accounts the wildest of the bunch was Utah’s Alan Kay, a guy who was so far out in the future that not even this crowd could take him seriously—yet so funny, so glib, and so irrepressible that they listened anyway. When it was his turn to give a presentation, Kay told them about his idea for a “Dynabook,” a little computer that you could carry around in one hand like a notebook. One face would be a screen that would display formatted text and graphics and that you could draw or write on as if it were a pad of paper. The Dynabook would communicate with other computers via radio and infrared. It would be so simple to program that even a kid could do it. And it would have lots and lots of storage inside—maybe even a hard disk!
[…]
Thus, wrote Kay, “in early 1967 he introduced me to Ed Cheadle, a friendly hardware genius at a local aerospace company who was working on a ‘little machine.’ It was not the first personal computer—that was the LINC of Wes Clark—but Ed wanted it for non-computer professionals.” In retrospect, of course, Cheadle was about ten years ahead of his time. But why let a little thing like technical infeasibility stop you? Kay and Cheadle promptly entered into a very pleasant collaboration on what they called the FLEX machine, a prototype desktop computer that would feature a (tiny) graphics screen as well as Kay’s first attempt at an easy-to-use, object-oriented programming language. Shortly after that, Kay recounted, even as he and Cheadle were pondering how to achieve their goals, given the “little machine’s” severe lack of horsepower, the Utah group received a visit from Doug Engelbart, “a prophet of Biblical dimensions.” Engelbart gave them a progress report on NLS, and once again, Kay said, it was a revelation. Hypertext, graphics, multiple panes, efficient navigation and command input, interactive collaborative work, Engelbart’s whole notion of augmenting human intellect—taken together, they made for an electrifying vision of what computing ought to be. But that just led to an even more electrifying thought.
Two years earlier, said Kay, he had read (and promptly forgotten) Gordon Moore’s original article about the exponentially falling price of integrated circuits. But now Moore’s insight came rushing back, and “for the first time,” Kay wrote, “I made the leap of putting the room-sized interactive TX-2 … on a desk. I was almost frightened by the implications; computing as we knew it couldn’t possibly survive—the actual meaning of the word changed.” Instead of a world in which there existed at most a few thousand mainframe computers, all controlled by large institutions, Kay could now envision a world boasting computers by the millions. Computers under the control of no one but their owners. Personal computers. “It must have been the same kind of disorientation people had after reading Copernicus,” Kay wrote, “[when they] first looked up from a different Earth to a different Heaven.”
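(An aside on the arithmetic behind that leap: Moore’s 1965 article projected components per chip doubling roughly every year, reaching about 65,000 by 1975. Compounding it is a one-liner; the ~64-components-in-1965 starting point is my approximation of his trend line, not a figure from the book:)

```python
# Moore's 1965 projection: components per chip doubling every year.
# Starting point ~64 components in 1965 (approximate trend-line figure).
components = 64
for year in range(1965, 1976):
    print(year, f"{components:,}")
    components *= 2  # one doubling per year -> ~65,000 by 1975
```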
[…]
“In the history of art,” says Alan Kay, “the most powerful work in any genre is done not by the adults who invent it, but by the first generation of kids who grow up in it. Think of perspective painting during the Renaissance. Well, we were that generation. We were the kids who’d had our Ph.D.s paid for by ARPA.”
Al Gore & the National Science Foundation
In the midst of all this, meanwhile, NSF was getting penciled in for an even wider role, courtesy of the junior senator from Tennessee. As chairman of the Senate Subcommittee on Science, Technology, and Space, Albert Gore, Jr., was widely acknowledged as one of the very few American politicians who took the time to understand technology; he had even been known to show up at meetings of the National Academy of Sciences from time to time, just to listen to the technical sessions. He had been an enthusiastic supporter of NSF’s supercomputer centers, and now he was even more intrigued by this notion of digital networks as a continent-spanning infrastructure. (He’d been mulling over that notion for some time, apparently: earlier in the decade he had coined the phrase “Information Superhighway,” as an analogy to the Interstate Highway System that his father had helped create back in the 1950s, when he was in the Senate.) So in 1986 Gore wrote legislation asking the administration to study the possibility of networking the supercomputer centers with fiber optics.
[…]
In due course, with NSF’s providing the seed money, the regional networks took shape, all operating as not-for-profit Internet access providers to the research community. However, he says, “we told them, ‘You guys will eventually have to go out and find other customers. We don’t have enough money to support the regionals forever.’ So they did—commercial customers. We tried to implement an NSF Acceptable Use Policy to ensure that the regionals kept their books straight and to make sure that the taxpayers weren’t directly subsidizing commercial activities. But out of necessity, we forced the regionals to become general-purpose network providers.” Or as they would soon be called, Internet Service Providers. Vint Cerf, for one, is still lost in admiration for this little gambit. “Brilliant,” he calls it. “The creation of those regional nets and the requirement that they become self-funding was the key to the evolution of the current Internet.”
Impact
Technology isn’t destiny, no matter how inexorable its evolution may seem; the way its capabilities are used is as much a matter of cultural choice and historical accident as politics is, or fashion. And in the early 1960s history still seemed to be on the side of batch processing, centralization, and regimentation. In the commercial world, for example, DEC was still a tiny niche player, a minor exception that proved the rule. Almost every other company in the computer industry was following Big Blue’s lead—and IBM had just made an unshakable commitment to batch processing and mainframes, a.k.a. System/360. In the telecommunications world, meanwhile, AT&T was equally committed to telephone-style circuit switching; its engineers would scoff at the idea of packet switching when Paul Baran suggested it a few years later, and they would keep on scoffing well into the 1980s. And in the academic world, no other agency was pushing computer research in the directions Lick would, or funding it at anything like the ARPA levels.
Remember, says Fernando Corbató, “this was at a time when the National Science Foundation was handing out money with eye droppers—and then only after excruciating peer review.
Compared to that, Lick had a lot of money. Furthermore, he was initially giving umbrella grants, which allowed us to fund the whole program. So there was this tremendous pump priming, which freed us from having to think small. The contrast was so dramatic that most places gravitated to ARPA. So that opening allowed a huge amount of research to get done.”
Without that pump priming—or more precisely, without an ARPA animated by J. C. R. Licklider’s vision—there would have been no ARPA community, no Arpanet, no TCP/IP, and no Internet. There would have been no Project MAC–style experiments in time-sharing, and no computer-utilities boom to inflame the imagination of hobbyists with wild speculations about “home information centers.” There would have been no life-giving river of cash flowing into DEC from the PDP-10 time-sharing machines it sold to the ARPA community. There would have been no windows-icons-mouse interface à la Doug Engelbart. And there would have been no creative explosion at Xerox PARC.
About “Highlights”
I’m trying to write up raw notes/highlights on (good) books I’ve recently finished, for lack of time to write proper reviews. This pushes me to reflect a bit on what I’ve learned and leaves me with notes to go back to. It may also be of use to you, Dear Reader, if you are curious about the book!

