George Dyson. 2012. Turing’s Cathedral: The Origins of the Digital Universe. New York: Pantheon Books, 401 pp.
A rich and detailed account of the building of the IAS computer at Princeton, successor to ENIAC, the Electronic Numerical Integrator and Computer. But also a history of the early years of the Institute for Advanced Study at Princeton, and of John von Neumann and his colleagues, who, because they invented digital computing, also had to invent programming, or at least were largely independent developers of it, depending on where one places Babbage, Lovelace and Turing. They were surely the first hackers, and Dyson’s book is detailed enough that one can enjoy reading about the first steps towards operating systems and interpreters, as von Neumann and others sought to replace the laborious rewiring originally needed to set up each program.
There are many enduring conclusions from this wonderful book. For example, when Nazi Germany purged its universities of their best academics it ensured the military, technological and economic dominance of the USA henceforth (or at least until now; a cabal of right-wing conservatives is doing its best to send the next scientific revolution elsewhere). The mathematicians and physicists who were quick enough to escape Europe (Albert Einstein, John von Neumann, Kurt Gödel, Edward Teller, Stanislaw Ulam, Eugene Wigner) were welcomed as refugees at Princeton and built the first digital computers in order to perform the calculations verifying the design of the atomic bombs. These people, many of them Hungarian or Polish, had seen the Holocaust destroy their families, friends, universities and cities, and were not short of motivation, in some cases (Teller) even after the bombs were used on Hiroshima and Nagasaki. In so doing, they created the USA’s dominance in nuclear weapons, and also led directly to the technological and economic dominance of IBM and Silicon Valley. (If one were to set up an equivalent of the Institute for Advanced Study now, would it be so easy to choose the key researchers? Geneticists, physicists and researchers studying the brain, consciousness and artificial intelligence would be vital. But who exactly? The Francis Crick Institute seems one attempt at a modern equivalent, and good luck to them.)
Refugee academics are not the only principals in this book; Abraham Flexner and Oswald Veblen had arguably the most formative role of all, founding and funding the Institute for Advanced Study and recruiting this academic dream team. Thanks to visionary leadership and philanthropy there seemed no problem recruiting others who rapidly became famous as well: Freeman Dyson, Richard Feynman, Claude Shannon. And Alan Turing himself, who also visited. And who better than Freeman Dyson’s son George Dyson to write the history? He grew up at Princeton among these remarkable people.
Another enduring message of this book is that the scientists who conceived of the first digital computers and implemented digital computing knew very well the wider and longer-term implications of what they were doing. Documents unearthed by George Dyson show that John von Neumann and others foresaw many of the places that digital computing would lead: simulation, virtualisation, modelling, computer viruses, AI.
Although the computer was sold to the military as a means of completing the calculations for making atomic bombs, it was also used for other projects, especially after hours: notably numerical meteorology (a field initiated by Lewis Fry Richardson), the Monte Carlo method of statistical simulation by random sampling (by Nicholas Metropolis) and artificial-life simulations (by Nils Aall Barricelli). There is no doubt that the universal applications of their ideas, and the impact of their technology on the future of humanity, were very much in the minds of John von Neumann and his colleagues.
They even wondered “are time, space and the universe fundamentally digital?” (as asserted by Anaximander, Gottfried Wilhelm Leibniz and loop quantum gravity theoreticians such as Carlo Rovelli). Kurt Gödel, at least, apparently thought so. Gödel was perhaps the one among all these thinkers who best deserves the adjective original, and he spent much of his final years researching the writings of Leibniz, who pioneered binary arithmetic and the idea that the universe itself is composed of simple fundamental units (“monadology”). Few shared Gödel’s views, then or now, but Leibniz did have a medallion struck in 1697 illustrating the creation of all things by means of binary arithmetic. It seems the “we are in a simulation” argument goes way back.
A great book, about a small group of remarkable scientists and engineers who changed the course of the history of science and of the West. It is richly documented with endnotes, drawn mostly from unpublished sources in the archives of the Institute for Advanced Study and from the personal papers of the protagonists.
(These notes make no mention of the engineers such as Julian Bigelow, nor of the mathematician turned administrator Herman Goldstine, who built the machine to von Neumann’s design. Nor do these notes focus on John von Neumann nearly as much as the book does. George Dyson emphasises how important von Neumann was in gathering and leading the team; after his death in 1957 the team rapidly dispersed, and other groups, and IBM, led further developments in computing. Subsequent technical advances greatly sped up processing, but serial processing, memory addressing and the rest remain largely unchanged from von Neumann’s original design.)