
Turing's Cathedral: The Origins of the Digital Universe

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
 
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
 
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
 
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

505 pages, ebook

First published January 1, 2012


About the author

George Dyson

55 books · 125 followers
George Dyson is a scientific historian, the son of Freeman Dyson, brother of Esther Dyson, and the grandson of Sir George Dyson. When he was sixteen he went to live in British Columbia in Canada to pursue his interest in kayaking and escape his father's shadow. While there he lived in a treehouse at a height of 30 metres. He is the author of Project Orion: The Atomic Spaceship 1957-1965 and Darwin Among the Machines: The Evolution of Global Intelligence, in which he expanded upon the premise of Samuel Butler's 1863 article of the same name and suggested coherently that the internet is a living, sentient being. He is the subject of Kenneth Brower's book The Starship and the Canoe. Dyson was the founder of Dyson, Baidarka & Company, a designer of Aleut-style skin kayaks, and he is credited with the revival of the baidarka style of kayak. (from Wikipedia)

Ratings & Reviews



Community Reviews

5 stars: 1,151 (25%)
4 stars: 1,338 (29%)
3 stars: 1,315 (28%)
2 stars: 527 (11%)
1 star: 215 (4%)
Jenny Brown · August 14, 2012
This book is fatally marred by Dyson's failure to understand computer architecture. I note many reviewers assuming that they are confused because they are math-phobic. But I was a programmer in the late 1970s and 1980s. I wrote in assembly language and have read machine language (in hex) when debugging, so when I read Dyson's long passages of gibberish purporting to describe what is going on in a computer, I knew they were just plain gibberish.

The stories about the people involved in the project were very interesting, as was the description of the environment--the IAS--and its politics, but Dyson's failure to explain the most important technical concepts to the reader (or to understand them himself) limits the book's usefulness. These concepts are not all that complex. They were explained to a class of recent high school graduates in Tennessee back when I took my first computer class in 1979, and we all wrote a simplified machine language program for our final exam. So there isn't any reason that the intelligent people this book was designed for couldn't have had the architecture of a von Neumann machine explained in more depth--in a way that would have made sense to them.

I found this book saddening because it made me really want to understand the technology whose description it butchers. I'd really love to know more about how the engineers built these early computers and how they differ from the ones in use today. I gleaned many facts from the mishmash presented here, for example that CRTs were used in place of what was later called "core" memory in slightly later computers.

But Dyson clearly doesn't understand what a register is, nor does he understand why shifting bits performs arithmetic, or how the same bits, read back by the CPU, provide the machine with its program. For that matter, he doesn't seem to understand what a program is. A simple explanation of what a computer algorithm is, and of why 17 instructions can lead to thousands of operations being performed, would make long tracts of this book make sense, which they currently don't.
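Here is a minimal sketch of that missing explanation (my own, in Python; it is emphatically not the IAS machine's actual order code, which the book never spells out): a machine just fetches, decodes, and executes instructions from memory, and one conditional jump lets an eight-instruction program perform tens of thousands of operations.

```python
# A toy stored-program machine, written to illustrate one point only: a handful
# of instructions, plus a conditional jump, can drive tens of thousands of
# operations. This is an illustrative sketch, not the IAS machine's order code.

def run(program, memory, max_steps=100_000):
    acc, pc, steps = 0, 0, 0          # accumulator, program counter, step count
    while steps < max_steps:
        op, arg = program[pc]         # fetch and decode the next instruction
        pc += 1
        steps += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JUMP_IF_NONZERO":
            if acc != 0:
                pc = arg
        elif op == "HALT":
            break
    return steps

# memory[0] = running total, memory[1] = loop counter, memory[2] = constant -1
memory = {0: 0, 1: 5000, 2: -1}
program = [
    ("LOAD", 0), ("ADD", 1), ("STORE", 0),   # total = total + counter
    ("LOAD", 1), ("ADD", 2), ("STORE", 1),   # counter = counter - 1
    ("JUMP_IF_NONZERO", 0),                  # loop back while counter != 0
    ("HALT", None),
]
steps = run(program, memory)
print(steps, "instructions executed by an 8-instruction program")  # 35001
print("sum of 1..5000 =", memory[0])                               # 12502500
```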

But statements like the one claiming that the CRT screen on a PC represented a CPU buffer were sheer imbecility. (The CRT on that computer functions as an output device, similar to the tape or punched paper on the early computers.) And there were dozens more of these uninformed statements.

Bottom line: If you read this book and are confused, you are getting the author's message, since he, too, was terminally confused by the technology he was attempting to describe.
Lori · April 29, 2019
I loved the history. Dyson’s enthusiasm and love for the subject and scientists comes through loud and clear. It’s rich in detail on researchers with emphasis on John von Neumann.

As for speculation on the future of a living machine, I think the then ninety-one-year-old Edward Teller gave sound advice.
”That seems reasonable,” I agreed. “My own personal theory is that extraterrestrial life could be here already … and how would we necessarily know? If there is life in the universe, the form of life that will prove to be most successful at propagating itself will be digital life; it will adopt a form that is independent of the local chemistry, and migrate from one place to another as an electromagnetic signal, as long as there’s a digital world—a civilization that has discovered the Universal Turing Machine—for it to colonize when it gets there. And that’s why von Neumann and you other Martians (the Hungarian “Martians”: John von Neumann, Theodore von Kármán, Leo Szilard, Eugene Wigner, and Teller himself) got us to build all these computers, to create a home for this kind of life.”

There was a long, drawn-out pause. “Look,” Teller finally said, lowering his voice to a raspy whisper, “may I suggest that instead of explaining this which would be hard … you write a science-fiction book about it.”

“Probably someone has,” I said.

“Probably,” answered Teller, “someone has not.”


***************************************

https://www.nytimes.com/2012/05/06/bo...
BlackOxford · July 3, 2020
Knowledge To Kill For

This is not your average paean to the pioneers of the high-tech industry. Who knew, for example, that Turing’s insight had to overcome two centuries of mathematical obsession with Newton’s (but not Leibniz's) infinitesimal calculus? And who knew that the development of the first digital computers was triggered by the military drive to create the hydrogen bomb? And who knew that the victory of binary arithmetic would be ensured by molecular biology? Certainly not me, and I suspect a number of other ignorant sods who presumed that this industry ‘just happened’, like milk suddenly appearing on the supermarket shelves with no clue about its origins in muck and mud.

Dyson, a son of the manse so to speak (son of Freeman Dyson, brother of Esther Dyson, and the grandson of Sir George Dyson), can be as concise as he is illuminating: “Three technological revolutions dawned in 1953: thermonuclear weapons, stored-program computers, and the elucidation of how life stores its own instructions as strings of DNA.” When these events are considered together rather than as independent strands of modern science, it becomes clear that nothing in our lives almost 70 years later is unconnected to war and the organisation for war provoked directly by the Second World War (and indirectly by the First). The American President Eisenhower’s concerns about the ‘military-industrial complex’ were proven justified not just about the defence industry but also about a new global society built upon inherently lethal knowledge.

The sources of this lethal knowledge were places like the Aberdeen Proving Ground in Maryland, the Los Alamos compound in New Mexico, and the Institute for Advanced Study in Princeton, New Jersey. These were modern monastic establishments whose existence was justified not by prayer but by thought, largely mathematical, and not by the construction of physical edifices but by the creation of weapons of destruction. These were the forerunners of what would later be known as ‘think tanks’ and ‘skunk works,’ organisational entities devoted neither to economic success nor to industrial productivity but to technological innovation that would facilitate mass killing.

These new centres of thought were not isolated academic enclaves. They did assemble and concentrate the best intellects and coordinated their collective efforts in highly abstruse areas. But they also set agendas for university (and even high school) scientific education, successfully lobbied government about the priorities for military research spending, and shaped the interests of the most important private foundations that funded research from medicine to astrophysics.

Because they had no factories, no significant labour force, and no immediately commercial products, these establishments engaged in a sort of parallel politics. Although they were the driving force of the new military industrial complex, they were functionally invisible, in part because their work was confidential, but mostly because no one outside them could really understand what they were up to. They effectively constituted an independent empire of the mind, a Platonic haven of pure rationality, or at least what military requirements implied as rationality.

Most of the men (and they were almost all men) recruited into these establishments as thinkers or administrators were undoubtedly exceptionally clever in their respective fields. However, it is clear from the personal and institutional biographical detail which Dyson provides that very few of them would have achieved their ‘potential’ without this new form of scientific organisation. It is likely that they would have spent their lives in interesting but inconclusive research in dispersed academic institutions, or teaching Latin to high school seniors. The legendary names - Shannon, von Neumann, Ashby, Wiener, Mandelbrot, etc - would probably have been known but not with anything like the cultural force that they now have. These new organisations were intellectual king-makers.

So these military/intellectual enterprises, dedicated to refining the efficiency of human conflict, have transformed scientific culture. The concentration of intellectual talent, money and professional dominance means that there is only one path to scientific innovation - national defence, however widely that might be defined. The subsequent commercial ventures, organised on similar lines in the Silicon Valleys and University Science Parks of the world, are functional subsidiaries of an invisible network, which few of us know anything about except when some ‘breakthrough’ (or breakdown) is announced in Wired or featured in Fast Company.

My lifetime is almost exactly contemporaneous with the digital epoch (Von Neumann died on my 10th birthday; Steve Jobs had just turned 2; Gates had just begun to walk). The presumptions, intentions, and fallacies of this epoch are things I share intellectually and emotionally with my generational cohort. This is Turing’s Cathedral, a cultural state of mind rather than a physical edifice. It took substantially less time to build than its medieval version. But its cultural influence is at least as great. Whether it will maintain itself as durably or with continued centrality is an open question, the answer to which seems to depend upon our fundamental but repressed attitude toward the god of war.
Bradley · August 17, 2018
I had an issue with this non-fiction, but also a whole lot of love.

So this is about the mathematicians who heralded the whole computer movement. You know, the OTHER, more disreputable and crazy smart people like von Neumann, Gödel, and all the other nutters like Turing, who took the computer age from a thought experiment to a hand-built machine in a lab, and then went on to become co-authors of the nuclear age.

Yeah. THOSE crazy nutters. The ones that ran enough physics programs on their automatic machines to model nuclear explosions and bring about the bomb. Computers, and not the poor women (and a few men) who got paid to crunch math by hand for years, are the real reason we have the nuclear age. And also why we have genetic sciences.

Pretty obvious, I know, but still, these guys are some unsung heroes. Just programmers. Sheesh. Whatever.

The book is full of love. I love the people. And then there was a wholly appropriate section expounding on science fiction and the future of AIs and I LOVED that, too, especially the form a realistic alien might take.

So what issues did I have?

WAY too much time was spent on the schools. Early schools, history, blah blah blah. Sure. Colleges are important and such, but I lost my caring factor until a while after we were introduced to Von Neumann. And what an interesting guy he was! :)

A side issue I should have more problem with is the role of women in this non-fiction, but like real history, too much idiocy prevents half our population from having more active roles. I'm not too fond of how the women here were relegated to being facilitators, suicidal wives, or footnotes to Crick and Watson. But let's be real here. We have a horrible track record at pushing these people aside in reality, not just in history.

I can appreciate the minds SHOWN HERE while still wishing the other minds had a chance. It didn't diminish my fascination. I can have MORE fascination to spare elsewhere. :)

So. Maybe not the best non-fiction I've ever read, but I did learn a hell of a lot about the people who ushered in the computer age and it's quite a story. And honestly, it makes for a more realistic story than the others I've read that treated WWII encryption engines as the real impetus for computers. Making nukes is pretty damn huge. And obvious. :)
Justin · February 18, 2015
I might have easily given this book four stars if Dyson could have stuck to history instead of indulging himself in inane speculations, and commentaries that are sadly meant to sound profound. The connections he draws between completely unrelated aspects of technology and biology are so strained that whenever I read a particularly grievous one, I'm forced to put the book down and walk around the room until the waves of stupidity subside a bit. For example, at one point Dyson asks us to consider whether digital computers might be "optimizing our genetic code ... so we can better assist them." At another he explains the reason we can't predict the evolution of the digital universe is because algorithms that predicted airplane movements in WW2 had to be normalized to the reference frame of the target... or something? Throughout the entire book there's a complete disconnect between the technical nature of the things he describes and the vague abstractions that he twists into obscenely trite metaphors.
Dyson seems to live in some sort of science-fiction wonderland where every computer program is a kind of non-organic organism. He calls code "symbiotic associations of self-reproducing numbers" that "evolved into collector societies, bringing memory allocations and other resources back to the collective nest." They are active, autonomous entities which "learned how to divide into packets, traverse the network, correct any errors suffered along the way, and reassemble themselves at the other end." By the end of the book I'm not even sure if Dyson means this as a metaphor - he appears to genuinely believe that it's merely a matter of perspective.
The truth is, if every human died tomorrow and the internet was left to run from now to infinity, not a single advance would be made in the state of computing. The viruses would quickly burn themselves away, the servers would grind monotonously at their maintenance routines, and the Google webcrawlers would stoically trudge through every porn site on Earth, an infinite number of times.
Dyson might respond that programs integrate humans as a symbiotic part of their evolution, but in that case you could say the same thing about clothing, music, or furniture. In this light the IKEA franchise must be viewed as a great self-replicating organism, conscripting humans in the propagation of its global hegemony of coffee tables.
David Rubenstein · November 28, 2012
Despite the title, this book is not primarily about Alan Turing. It is really about the group of people at the Institute for Advanced Study in Princeton. Much of the book focuses on John von Neumann, who spearheaded the effort to build some of the earliest electronic computers. These first computers were very unreliable--incorrect results were as likely to be due to faulty vacuum tubes as to coding errors. In fact, circuits had to be designed to be robust to vacuum tubes that did not follow specs.

Quite a large chunk of the book--and the most fascinating--dealt with the types of mathematical and physical problems that the earliest computers could solve. In fact, that was the principal interest of von Neumann--learning what types of problems could be solved using computers. Here, Alan Turing and Kurt Gödel played a large role in defining what sorts of problems might be solvable.

Among the problems that the earliest computers attacked was weather forecasting. In the late 1940s, there was much controversy about whether numerical weather forecasting was feasible even in principle. Of course, weathermen wanted to continue to use their gut feelings to forecast the weather, while some scientists thought that, given sufficient spatial resolution, the weather could be forecast far in advance. It was not until later that people like Lorenz discovered that there are fundamental limits to how far in advance weather can be forecast.
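That limit is easy to see in a few lines of code. The sketch below is my own generic illustration of sensitive dependence on initial conditions, using the standard Lorenz-63 system rather than anything from the book: two runs that start a billionth apart soon stop agreeing at all.

```python
# Why forecasts have a horizon: sensitive dependence on initial conditions.
# A generic illustration using the classic Lorenz-63 system (not from the book).

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude forward-Euler step of the Lorenz-63 equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

truth = (1.0, 1.0, 1.0)              # the "real" atmosphere
forecast = (1.0 + 1e-9, 1.0, 1.0)    # same model, a billionth-part observation error
for step in range(1, 3001):
    truth = lorenz_step(*truth)
    forecast = lorenz_step(*forecast)
    if step % 1000 == 0:
        print(f"step {step}: error in x = {abs(truth[0] - forecast[0]):.2e}")
# The error grows exponentially until the two runs no longer resemble each other;
# finer resolution or arithmetic only buys a little more lead time.
```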

The earliest computers were also used for developing the atomic bomb. Many aspects of the physics were not solvable using direct means. Simulations using a brand new numerical method called "Monte Carlo" were extremely significant for solving them. For this method to work, random numbers are required for initiating independent simulated trajectories. But random numbers were not easy to come by on a deterministic machine, so algorithms for generating pseudorandom numbers had to be developed alongside the simulations.
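To give a flavour of the method, here is a minimal sketch of my own (a generic Monte Carlo estimate of pi, not the actual ENIAC or IAS weapons calculations): average many randomly sampled trials and the answer emerges from the statistics. Von Neumann's own "middle-square" generator tends to collapse into short cycles, which is exactly why pseudorandom numbers were themselves a research problem; a simple linear congruential generator stands in for it below.

```python
# Monte Carlo in miniature: estimate pi by averaging random trials. A generic
# sketch, not the actual ENIAC/IAS weapons codes.

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Yield an endless stream of pseudorandom values in [0, 1)."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m

def estimate_pi(samples=200_000, seed=1957):
    stream = lcg(seed)
    inside = 0
    for _ in range(samples):
        x, y = next(stream), next(stream)   # one random point in the unit square
        if x * x + y * y < 1.0:             # does it land inside the quarter circle?
            inside += 1
    return 4.0 * inside / samples           # the ratio of areas gives pi/4

print(estimate_pi())   # roughly 3.14; the error shrinks like 1/sqrt(samples)
```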

This book goes into considerable depth in describing the people who developed and used the first computers at the Institute. There are fascinating descriptions of the mathematical, physical, and biological puzzles that were attempted. I recommend the book highly for those interested in the history of numerical computation.

Warwick · March 22, 2017

A fascinating and illuminating book, but also a frustrating one because it should have been a lot better than it is.

The heart of the story is more or less on target – a collection of very interesting anecdotes and narratives about the personalities involved in building America's first computer, at Princeton's Institute for Advanced Study after the Second World War. Leading the team was the quite extraordinary figure of John von Neumann, about whom I knew rather little before reading this. He comes across as by far the most brilliant mind in these pages (not excluding the presence of one A. Einstein), with a near-eidetic memory, an ability to understand new concepts instantly and make staggering leaps of reasoning to advance them further. Not a very endearing character, though – a refugee from 1930s Europe, he pushed the nuclear programme hard and argued to the end of his life that the best way to create world peace was to launch a full ‘preventive’ hydrogen bomb strike against the USSR, mass civilian deaths and all.

The nuclear project was central to the invention of the computer. The earliest machines in this story (the ENIAC, built for ballistics calculations and soon pressed into bomb work, and the IAS machine, whose Los Alamos copy was nicknamed ‘MANIAC’) were put to work modelling nuclear reactions, which involve some rather tricky maths. But von Neumann and other thinkers realised early on that a machine capable of doing that would also be able to fulfil Alan Turing's description of a ‘universal computer’: if it could do the arithmetic, it turned out, it could do practically anything else too, provided there was a way of feeding it instructions.

‘It is an irony of fate,’ observes Françoise Ulam, ‘that much of the hi-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man's monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not.’


What is particularly fascinating is how these studies gradually led to a blurring of the distinction between life and technology. The development of computing coincided with the discovery of DNA, which showed that life is essentially built from a digital code (Nils Barricelli described strings of DNA as ‘molecule-shaped numbers’), and early programmers soon discovered that lines of code in replicating systems would display survival-of-the-fittest type phenomena. This is entering a new era with the advent of cloud-sourcing and other systems by which computing is, in effect, becoming analog and statistics-based again – search engines are a fair example.
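That survival-of-the-fittest observation is easy to reproduce. The sketch below is a generic hill-climbing toy of my own, in the spirit of Barricelli's experiments rather than his actual numerical organisms: strings that copy themselves with occasional mutations, under a selection rule, drift toward whatever the rule rewards.

```python
# Replication, mutation, and selection in a few lines: a generic toy in the
# spirit of Barricelli's experiments, not his actual 1953 numerical organisms.
import random

TARGET = "MANIAC"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(s):
    """Count positions matching the target, i.e. what selection rewards."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    """Copy a string with occasional copying errors."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c for c in s)

random.seed(1953)
population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(200)]
for generation in range(1000):
    population.sort(key=fitness, reverse=True)
    best = population[0]
    if fitness(best) == len(TARGET):
        print(f"generation {generation}: {best}")
        break
    survivors = population[:100]                              # the fittest half replicate...
    population = survivors + [mutate(s) for s in survivors]   # ...the rest die out
```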

How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem, and seeing what sticks, without any underlying understanding? There's no model. And how does a brain do it? With a model? These are not models of intelligent processes. They are intelligent processes.


All of this is very bracing and very interesting. Unfortunately the book is let down by a couple of major problems. The first is a compulsion to give what the author presumably thinks is ‘historical colour’ every two minutes – so the IAS is introduced by way of an entire chapter tracing the history of local Algonquin tribes, Dutch traders, William Penn and the Quakers, the War of Independence – all of which is at best totally irrelevant and at worst a fatal distraction.

The second, even more serious failing is that the technology involved remains extremely opaque. One of the most crucial parts of this story should be following the developments of cathode-ray storage, early transistors, the first moves from machine language to codes and programs. But the explanations in here are poor or non-existent. Terms like ‘shift register’, ‘stored-program computer’, ‘pulse-frequency-coded’, are thrown around as though we should all be familiar with them.

My favourite story to do with the invention of the digital world involves Claude Shannon and his remarkable realisation – one of the most civilisation-altering ideas of our species – that electrical switching circuits (relays in his day, transistors later) could work as logic gates. It's not even mentioned in this book. And so the crucial early building blocks of what a computer actually is – how, on a fundamental level, it really does what it does – that is all missing. And it's a pretty serious omission for someone who finds it necessary to go back to the Civil War every couple of chapters.
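Shannon's idea is simple enough to demonstrate in a dozen lines. This is my own illustrative sketch, not anything from the book: given one switching primitive behaving as NAND, the rest of Boolean logic, and then binary arithmetic, can be composed from it.

```python
# Shannon's point in miniature: from one switching primitive (NAND), build the
# rest of Boolean logic and then binary arithmetic. Illustrative sketch only.

def NAND(a, b): return 1 - (a & b)

def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out). A CPU's adder is this, repeated."""
    partial = XOR(a, b)
    return XOR(partial, carry_in), OR(AND(a, b), AND(partial, carry_in))

def add(x, y, width=8):
    """Ripple-carry addition of two small integers, one bit at a time."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add(25, 17))   # 42, computed entirely from NAND gates
```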

A lot of reviews here, especially from more technical experts, really hate this book, but on balance, I'd still recommend it. It's a very important story about a very important period, and the later chapters especially have a lot of great material. But reading around the text will probably be necessary, and this book should have offered a complete package.
Eric_W · July 28, 2015
If you are looking for information about Alan Turing, look elsewhere. The title is a metaphor.

The Nazis did the U.S. a huge favor with their boorish and stupid racial policies. Many of Europe's most brilliant mathematicians and physicists were Jewish, and when the Nazi “cleansing” of the universities began, people like von Neumann, Einstein, and many others fled to the United States, where they were of immense assistance in the development of the atomic bomb.

This book is about the origins and development of the digital age, and Dyson spends considerable space on the people and institutions key to that development. The Institute for Advanced Study in Princeton, for example, under Abraham Flexner and Oswald Veblen, recruited many of these refugees, who helped build the Institute into one of the premier research institutions. I suppose it all has special interest for me as my life span parallels the development of the computer. I was born in 1947. In the 7th grade I became fascinated by ham radio and electronics and studied the intricate workings of the vacuum tube, a device for which I still have some reverence. I’m still dismantling and messing with the insides of computers.

Ironically, given the book’s title, John von Neumann takes center stage with Turing playing only a peripheral role. Von Neumann’s interest in digital computation was apparently sparked by reading Turing’s seminal article “On Computable Numbers,” which led him to the realization of the importance of stored-program processing.

What Turing did that was so crucial was to build on Gödel’s proof of the incompleteness theorem, which showed that numbers could carry two meanings. Turing took that and thought up the paper-tape machine in which data and code take the same form. That realization alone was fundamental in providing the basic building block for the computer.
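That dual meaning can be shown in a few lines. The sketch below is a deliberately crude illustration of my own, not Turing's tape machine or the IAS order code: the "program" is just numbers sitting in the same memory as the numbers it operates on, and it can even rewrite one of its own instructions.

```python
# Numbers that mean things and numbers that do things: one memory holds both the
# data and the encoded instructions, so a program can even rewrite itself.
# A deliberately crude sketch, not Turing's tape machine or the IAS order code.

LOAD, ADD, STORE, HALT = 1, 2, 3, 4        # instruction = opcode * 100 + address

mem = [
    1 * 100 + 9,    # 0: LOAD  mem[9]    acc = x
    2 * 100 + 10,   # 1: ADD   mem[10]   acc = x + y
    3 * 100 + 11,   # 2: STORE mem[11]   result = acc
    1 * 100 + 2,    # 3: LOAD  mem[2]    read an *instruction* as a number...
    2 * 100 + 12,   # 4: ADD   mem[12]   ...add 1 to its address field...
    3 * 100 + 2,    # 5: STORE mem[2]    ...and write it back: self-modifying code
    4 * 100,        # 6: HALT
    0, 0,
    20, 22,         # 9, 10: the data x and y
    0,              # 11: the result
    1,              # 12: the constant 1
]

acc, pc = 0, 0
while True:
    op, addr = divmod(mem[pc], 100)        # the same kind of number, read as an instruction
    pc += 1
    if op == LOAD:
        acc = mem[addr]
    elif op == ADD:
        acc += mem[addr]
    elif op == STORE:
        mem[addr] = acc
    elif op == HALT:
        break

print(mem[11])   # 42: the sum of the two data words
print(mem[2])    # 312: the STORE instruction, itself altered by the running program
```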

The builders had conflicting views of the incredible computational power they had unleashed, which was to be used for both ill and good. Von Neumann recognized this: “A tidal wave of computational power was about to break and inundate everything in science and much elsewhere, and things would never be the same.”

It would have been impossible to develop the atomic bomb without the computational abilities of the new “computers.” So naturally, the Manhattan Project is covered along with the influence of the evil Dr. Teller (I must remember to get his biography), often cited as an inspiration for the character Dr. Strangelove, brilliantly played by Peter Sellers. After the war, Teller pushed very hard for the development of the “super-bomb” even though he knew, or must have known, that his initial calculations were flawed because he didn’t have the computational power to do them completely. One number that I questioned was Dyson’s report that when the Russians exploded a three-stage hydrogen bomb in 1961, the power released was equivalent to 1% of the sun’s. That sounds wildly improbable. Anyone able to confirm or contradict that number?
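A rough back-of-the-envelope check, using standard published figures rather than anything from the book: the claim is plausible if it refers to power rather than energy, because the bomb's energy came out over a few tens of nanoseconds (that release time is my assumption here).

```python
# Back-of-the-envelope check of the "1% of the sun's power" claim, using
# standard published figures, not numbers taken from the book. The release
# time of a few tens of nanoseconds is an assumption.

yield_megatons = 50                          # Tsar Bomba, 1961 (commonly cited figure)
joules_per_megaton = 4.184e15
bomb_energy = yield_megatons * joules_per_megaton     # about 2.1e17 J

release_time = 4e-8                          # assumed: roughly 40 nanoseconds
bomb_power = bomb_energy / release_time               # about 5e24 W

solar_luminosity = 3.8e26                    # total power output of the sun, in watts

print(f"bomb power ~ {bomb_power:.1e} W")
print(f"ratio to the sun ~ {bomb_power / solar_luminosity:.1%}")
# Roughly 1%: plausible as an instantaneous *power* comparison, even though the
# total *energy* involved is utterly negligible next to the sun's output.
```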

Some interesting little tidbits. One computational scientist refused to use the new VDTs, preferring to stick with punched cards (he obviously never dropped a box of them) which seemed far more tangible to him than dots on a screen. I guess fear of new technology is not reserved for non-scientists.

One of the major and very interesting questions addressed by Turing and reported on in the book is what we now call artificial intelligence. When we use a search engine, are we learning from the search engine, or is the search engine learning from us? It would appear currently the latter may be true. Clearly, the search engines have been designed to store information and use that information to learn things about us both as a group and individually. I suspect that programs now make decisions based on that accumulation of knowledge. Is that not one definition of intelligence? (I will again highly recommend a book written and read quite a while ago that foresaw many of these issues: The Adolescence of P-1 by Thomas Ryan (1977).** Note that Turing talked about the adolescence of computers and likened them to children.)

Some reviewers have taken Dyson to task for emphasizing the abstract reasoning that went into the development of the computer while downplaying the role of the electrical engineers (Eckert and Mauchly) in actually building the things. I’ll leave that argument to others, not caring a whit for who should get the credit and being in awe of both parties. On the other hand, the book does dwell more on the personalities than the intricacies of computing. There are some fascinating digressions, however, such as the examination of digital vs analog and how the future of computing might have been altered had von Neumann not tragically died so young, as he had a great interest in biological computing and the relationship of the brain to the computer.

**For a plot summary of The Adolescence of P-1 see https://en.wikipedia.org/wiki/The_Ado...
George Kaslov · July 27, 2023
Going by this book's title you would think it is about Alan Turing. Well, no.

This book is almost a biography of John von Neumann, almost a history of the MANIAC computer, almost the story of the beginning of the Computer Science field. All of these topics are connected but this book is edited in such a way that it seems as if 3 different books collided and were glued together with a lot of unnecessary detours and tangents. And that is a real shame, for there is a lot of good info in here.

I can still recommend this book to people interested in these topics, but I will have to warn them of these frustrations.
Ushan · March 9, 2014
This book covers essentially the same material as William Aspray's 1990 John von Neumann and the Origins of Modern Computing: the life and times of John von Neumann and the IAS computer. Aspray's book is much more to the point, though, while Dyson's takes large detours into the history of the atomic and hydrogen bombs, World War II cryptography and the like - all these topics have better books dedicated to them. George Dyson has a personal connection to the Institute for Advanced Study because his father, the famous physicist and mathematician Freeman Dyson, worked there in the 1950s, when the IAS computer was being constructed and operated, and George was a child. Dyson Jr. had the run of the Institute's archives, but he did not fish any radically new discoveries out of them.

As in Dyson's books about Project Orion and the evolution of artificial intelligence, I wish that someone better educated had written on this topic. For example, Dyson mentions the von Neumann architecture without clearly defining what it is. The ENIAC and the Colossus were early non-von Neumann-architecture computers whose programs were input using a plugboard and switches; a modern equivalent is the FPGA. A better book would make this distinction more explicit. Dyson also talks about digital versus analog computation. A digital computation represents a mathematical variable as separate discrete digits, each carried by its own physical variable, while an analog one represents the whole quantity as a single continuous physical variable; an abacus is digital and a slide rule is analog. It is not true that search engines and social networks are enormous analog computers! They are not fault tolerant: a single bug in the implementation of the core algorithms can wreck them both!
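The abacus/slide-rule distinction can be made concrete with a toy experiment of my own (not from either book): store the same number once as discrete digits and once as a single continuous value, copy both a thousand times with a little noise, and only the digital copy survives intact, because sub-threshold noise is thrown away at every regeneration.

```python
# Digital versus analog in one toy experiment: the same number stored as
# discrete digits (re-rounded after every noisy copy) and as one continuous
# value. A generic illustration, not taken from Dyson's or Aspray's book.
import random

random.seed(0)
value = 3141592
digits = [int(d) for d in str(value)]     # "digital": one bounded variable per digit
analog = float(value)                     # "analog": a single continuous variable

for _ in range(1000):                     # copy the signal 1000 times, with noise
    digits = [min(9, max(0, round(d + random.gauss(0, 0.05)))) for d in digits]
    analog = analog * (1 + random.gauss(0, 0.001))

print(int("".join(map(str, digits))))     # still 3141592: sub-threshold noise is
                                          # discarded at every regeneration step
print(round(analog))                      # no longer 3141592: the errors accumulate
```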
Peter Tillman · August 24, 2020
An interesting and discursive early history of the electronic digital computer, around and after WW2, with a history of the Princeton NJ area back to colonial days, and the history of the founding of the Institute for Advanced Study there. The local history stuff you can safely skim or skip; the IAS stuff is also peripheral (but interesting). But when Dyson gets to John von Neumann's biography (chapter 4), the pace picks up, and picks up even more when he gets to the computer history. Von Neumann was a genius, and things move quickly in wartime. I love reading about bright engineers at work. Many of the parts for the first IAS computer were war-surplus and/or picked for easy troubleshooting. The academic mathematicians hated the engineers, and succeeded in getting the IAS program shut down in the 1950s. Too bad.

There are more detours to come, but Dyson is a fine historian and did his homework. At his best, this is 5-star writing -- from hydrogen bombs to stellar evolution! -- but the odd structure and detours were distracting. 3.5 stars, rounded down for that.

For real reviews, I recommend Alan Scott's https://www.goodreads.com/review/show... and John Gribbin's https://www.goodreads.com/review/show... I expected to like this one more than I did, for the reasons Alan (and others) spell out.
JodiP · September 5, 2012
This book started off rather confusingly--without a clear description of what it was to be about. It did not improve. For some reason, the author thought it very important to tell how Princeton was founded, and had a lengthy chapter on William Penn from the 1600s. I thought it might be the format--I was reading on Kindle, and entertained the idea of getting the hard copy to make it easier to flip past the irrelevant sections. I then checked reviews and decided to give it a pass altogether. Other folks have found it rambling as well. I didn't finish it.
Paul · August 25, 2017
Three quarters of a century ago a small number of men and women gathered in Princeton, New Jersey. Under the direction of John von Neumann they were to begin building one of the world’s first computers, driven by the vision that Alan Turing had of a Universal Machine. Using cutting-edge technology, vacuum tubes (valves) to store the data, the first computer was born. This unit took 19.5 kW to work and had a memory size of five, yes five, kilobytes. It caused a number of revolutions: it was this machine that laid the foundations for every single computing device that exists on the planet today, it changed the way that we think about numbers and what they could do for us, and the calculations that it ran gave us the hydrogen bomb…

I had picked this up mostly because of the title, Turing's Cathedral, thinking that it would be about that great man, the way that he thought and the legacy that he left us with regards to computing and cryptography. There was some of that on Turing and his collaboration with the American computer scientists and engineers through the war, but the main focus was on the development of the computer in America and the characters that were involved in the foundation of today’s technological society. Some parts were fascinating, but it could be quite tedious at times. There is a lot of detail in the book on the characters and the political games that they were playing and were subject to, and I'm not completely sure why we needed to go so far back in time on the origins of Princeton. Definitely one for the computer geek, not for the general reader.
November 3, 2012
The history of the universal electronic computer at the Institute for Advanced Study in Princeton, pioneered by the leading genius of his time, John von Neumann, and driven largely by the computational requirements of building a nuclear bomb, makes for a good book. George Dyson’s Turing’s Cathedral is not that book.

At his best, Dyson writes compelling, erudite, witty, and idiosyncratic prose with a gift for poetic analogies and elegant turns of phrase. The opening of chapter XVII, on the vast computational power we have today and in the future, is a good example:

Von Neumann made a deal with “the other party” in 1946. The scientists would get the computers, and the military would get their bombs. This seems to have turned out well enough so far, because, contrary to von Neumann’s expectations, it was the computers that exploded, not the bombs.


Alas, these morsels are thinly spread.

Worse, many of them are abject nonsense.

Dyson seems to have no internal error correction mechanisms that shield him from stretching his analogies far beyond what their flimsy fabric can endure. He combines this infatuation with his own literacy with digital, mathematical, and technological illiteracy: a cavalier attitude towards the details of the technology that he aims to describe and reason about.

Search engines and social networks are analog computers of unprecedented scale. Information is being encoded (and operated upon) as continuous (and noise-tolerant) variables such as frequencies (of connection or occurrence) and the topology of what connects where, with location being increasingly defined by a fault-tolerant template rather than by an unforgiving numerical address. Pulse-frequency coding for the Internet is one way to describe the working architecture of a search engine, and PageRank for neurons is one way to describe the working architecture of the brain. These computational structures use digital components, but the analog computing being performed by the system as a whole exceeds the complexity of the digital code on which it runs. [p. 280]

(The chapter ends with a bold one-liner, “Analog is back, and here to stay.”) It’s unfair to cherry-pick particularly egregious passages from a book about a complicated subject, but the book is full of stuff like that. With the first few instances I tried to adopt a charitable stance and wanted to understand what Dyson is trying to tell me, behind the noise of half-understood technical inaccuracies. But after a few chapters I just became annoyed.

Generously put, the technical passages of this book are inspiring, in the sense that I was inspired to actually find out how the ENIAC and other machines worked. Using other sources, such as Wikipedia, because Dyson’s book does very little to tell me. Dyson is clearly out of his depth here, and his confused and confusing descriptions read like the account of a blind man explaining the concept of colour. The result is dense, conceited, and just plain annoying. As Goodreads reviewer Jenny Brown puts it, “this book is fatally marred by Dyson’s failure to understand computer architecture.” Other reviewers of this book, both professional and amateur, seem to be appropriately humbled and impressed by the opaque technology, and chalk their confusion up to their own cognitive inadequacy. Here’s an example from judy’s review: “I stand in awe of the geniuses who envisioned and constructed the digital universe—largely because I haven't a prayer of understanding what they did. Although written in plain English, somehow my brain will simply not grasp the concepts.” Well, neither does Dyson’s.

Our message should be that computers are simple. Instead, we get yet another book that makes technology into magic, and its inventors into Byronic heroes.

Which leads us to the biographical sketches. The book gives us rich anecdotes about many of the leading figures associated with Princeton’s computer group in the 40s and 50s: Bigelow, Ulam, Gödel, all secondary to the book’s main character, the titanic John von Neumann. Many of these descriptions are entertaining and insightful; it’s clearly the best part of the book. Dyson tells much of the story in the voice of others, by quoting at length from interviews and biographies. This works well. However, even these sketches remain disjointed, erratic, meandering, and quirky. Dyson clearly has had access to unique sources at Princeton’s Institute for Advanced Study, which make for the most interesting parts of the book. Examples include endlessly entertaining log messages from the machine's operators. On the other hand, I really don’t need to know that Barricelli had his $1,800 stipend renewed in 1954. These random facts are many, obviously motivated by the author’s access to their sources, and never play a role in the greater narrative.

Even Alan Turing, after whom Dyson’s book is named, makes an appearance in chapter XIII. Otherwise the book has nothing to do with Turing, and little to do with universal computing, so the book’s title remains a mystery.

Von Neumann’s Cathedral would have been a fine title for this book, or better Von Neumann’s Bomb Bay. This is a book about von Neumann, not Turing, and his monomaniacal dream of building a computer. Von Neumann’s motivation was mainly militaristic: to leverage computational power for the simulation of complex phenomena, such as the physics of nuclear bombs. As such, the early history of computing co-evolves between the computer project at Princeton and the Manhattan Project at Los Alamos. This is a story worth telling, and from that perspective, Dyson’s book is a book worth reading. Just remember to read the technological passages like the techno-babble in a Star Trek episode: it’s window-dressing, not there in order to be understood.

The final prophetic chapters about the future of computation and mankind are worthless and incompetent.

In summary, a misleadingly-titled, meandering, technologically illiterate, annoying, beautifully written, confused, and sometimes entertaining book about an important topic.
Bryan Alexander · September 18, 2014
I'm fascinated by the history of computing. There are so many delights there, both geeky and otherwise: glimpses of our present, odd characters, brilliant technical solutions, politics. Turing's Cathedral is a delightful and useful contribution to this field.

George Dyson's book takes place during the 1940s and 1950s, focusing on the extraordinary collection of geniuses at Princeton's Institute for Advanced Study who created a huge amount of modern computing. A large part of Turing's Cathedral consists of biographical sketches of key players in early computing, including Stan Ulam, Nils Barricelli, Klári Dán von Neumann, Thorstein Veblen's nephew Oswald Veblen, Alan Turing, of course, and especially Johnny von Neumann. This is really von Neumann's book. He gets the lion's share of the text (and photos), becoming the Institute's prime driver and most productive thinker. Turing's Cathedral essentially ends with von Neumann's death.

Other chapters explore the hardware and theory of early computing, from cathode ray tubes and punch cards to stored-program architecture and the Monte Carlo method. Dyson also races off in related directions with something like glee, showing, for example, how the IAS was right on top of major Revolutionary War sites, and how one scientist was related to Olivia Newton-John.

Arching across all of these themes is the intertwined history of atomic weapons and computing. This is a controversial theme in cyber-history, and Dyson assembles a great deal of evidence to demonstrate their deep connections.

Turing's Cathedral is filled with energy and some fun writing. A few quotes demonstrate this:
What if the price of machines that think is people who don't? (314)

"Information was never 'volatile' in transit; it was as secure as an acrophobic inchworm on the crest of a sequoia." -Julian Bigelow (137)

Our ever-expanding digital universe is directly descended from the image tube that imploded in the back seat of [Vladimir] Zworykin's car. (81)

We owe the existence of high-speed digital computers to pilots who preferred to be shot down intentionally by their enemies rather than accidentally by their friends. (116)

Von Neumann made a deal with "the other party" in 1946. The scientists would get the computers, and the military would get the bombs. This seems to have turned out well enough so far, because, contrary to von Neumann's expectations, it was the computers that exploded, not the bombs. (303)

"It is an irony of fate that much of the high-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man's monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not." -Francoise Ulam (216)

"A clock keeps track of time. A modern general purpose computer keeps track of events." This distinction separates the digital universe from our universe, and is one of the few distinctions left. (300)

"We are Martians who have come to Earth to change everything - and we are afraid we will not be so well received. So we try to keep it a secret, try to appear as Americans... but that we could not do, because of our accent. So we settled in a country nobody ever has heard about and now we are claiming to be Hungarians." -Edward Teller (40)


The book has plenty of gems, like the Swedish scientist Hannes Alfvén and his utopian novel about an AI-ruled future, von Neumann's possibly accidental vision of what post-robotic economics might look like (289), and Nils Barricelli's early visions of artificial life. The sheer ambition, nigh unto mad science, of these early explorers, seeking even to control the weather, is infectious.

There are a few weaknesses, starting with Dyson's aphoristic tendency occasionally backfiring (I don't buy the return-of-analog idea). Some of the explanations aren't as lucid as they need to be - I'm still not sure what template-based addressing means. And the focus on Princeton and some historical figures doesn't recognize their extraordinary privilege. I was also surprised that Dyson didn't reference Howard Rheingold's groundbreaking and accessible Tools for Thought.

But that's nitpicking. Turing's Cathedral is a treat for geeks, history buffs, and anyone interested in how our digital era came to be.
Greg · March 29, 2013
The title is a little misleading. This book is mostly a biography of John von Neumann and concurrently, a story of the early decades of the Institute for Advanced Study in Princeton. The stories are well researched and rich in detail, but at times hard to follow. I think this comes from abrupt changes in the timeline within related chapters. What comes across clearly is the value of interdisciplinary collaboration among genius level scientists and engineers in the presence of new electronic tools. von Neumann is a perfectly chosen example in that regard. I'm also grateful to the author for including and highlighting valued wives, technologists, secretaries, etc. who were important either to the groundbreaking work itself or to the stability of the community, without which progress would undoubtedly have suffered. This is actually more than a three star effort, but not quite four stars. Thank you Mr. Dyson for this enlightening work. And thank you to Powell's (Portland, OR) for carrying it on your shelves.
Holly · August 13, 2016
Though I would never dare to participate in anything but a cursory, general conversation about Turing machines or Gödel's theorem or the Monte Carlo method, I just love histories of science and this book made me happy. I listened to the audio version, and often arcane concepts requiring visualization or anything involving equations would blow past me, but all the wonderful details and biographies and momentum more than made up for my muddled moments. I love that it started with a disorienting-orienting historical story about the ground on which the Princeton Institute for Advanced Study was built (with details about the Lenni Lenape Indians and then William Penn!?), then the Flexner family, and Thorstein Veblen and his brother .... who'd have thought? and weather predicting, and Los Alamos, and Einstein, and the roles of women - such as Klári von Neumann - whom I'd never read about, etc. It's all here. I also liked the book's spiral structure that gradually moved linearly forward but always, by necessity, returned to von Neumann, the true center of the book. (Like every other reader I feel obligated to mention that the eponymous Turing had very little screen-time.)
Alan · December 12, 2012
Turing's Cathedral is a long, enthusiastic and articulate ramble throughout the early history of computing, a solid work constructed over a great deal of time by a keen observer who has an insider's perspective on many of that history's most pivotal moments. George Dyson is the son of the famous physicist Freeman Dyson, and as a child he must have met many of the principals of this story while they were working at the Institute for Advanced Study (IAS) in Princeton, New Jersey (although at the time his historical interests were, I'm sure, not yet fully developed). Dyson's chapter of Acknowledgements tells the tale of the many surviving participants he interviewed and the many records the archivists of the IAS unearthed for him during the creation of this book; he notes that "Julian Bigelow and his colleagues designed and built the new computer in less time than it took me to write this book." (p.xvi)

But... Turing's Cathedral isn't going to be for everyone. It is quite a ramble, for one thing—there's definitely a forward, linear flow to Dyson's overall narrative, but it's full of eddies and backwaters, digressions and diversions. It's also a very thorough story about math and machines, and the military imperatives—like ballistics computations, code-breaking, and the simulation of blast effects for new kinds of explosives—that drove the transition of "computers" from being a job description for women who operated adding machines in rooms to describing self-contained electronic devices that filled rooms. You should already have at least some familiarity with computer architecture and design, in order to appreciate the raw roots of the technology that are exposed here—vacuum tubes and wires, to be sure, but more importantly their underlying logic: deep concepts, like memory addresses, Boolean arithmetic and self-modifying code, the very notion of "software" itself, that still make up the mostly-unseen bedrock of today's graphically-oriented operating systems.

And... I'll admit to feeling a bit misled, at least to start with—despite its name, Turing's Cathedral's early chapters seem not so much about Alan Turing himself as they are about John von Neumann and the IAS... in fact, George Dyson sends his gaze back all the way to William Penn's acquisition of the land that later became the Institute, before slowly moving forward into the 20th Century.

It is not until Chapter 13 that Dyson digs into what Alan Turing really meant to the history (and future) of computing. In this chapter, Dyson takes us from the initial, purely mathematical insights of Turing's Universal Machine, through Turing's more concrete notion that an artificial intelligence must be grown, not made (a sentiment with which I find myself wholly in agreement)... to Dyson's own final observation that our search engines are now searching us. This chapter is the core of the book, the part that most fully justifies its name.

After Dyson is done, for the most part, with history, he goes on to engage in some intriguing speculations, such as his musing (on pp.278-281) about the resurgence of analog, continuous modes of computing, and what that might mean for the segmented digital universe he spent so much time explaining in the earlier parts of the book. Along the way he also has a couple of interesting book recommendations; I'm going to have to look up Aldous Huxley's lesser-known post-apocalyptic work Ape and Essence sometime, for example. And... I'll admit I'd never seen this statistic put this way before:
Global production of optical fiber reached Mach 20 (15,000 miles per hour) in 2011, barely keeping up with the demand.
—p.300
Computers are ubiquitous now, and intimately interconnected... but there was a time not so long ago when they weren't. George Dyson's Turing's Cathedral is an instructive and often entertaining examination of the moment just before that amazing phase transition took place.
May 27, 2012
The IAS MANIAC project was indeed a truly revolutionary computing endeavor, and it deserves a well-written history. Unfortunately, you will not find it here. Dyson doesn't seem to understand most of the technical issues he tries to describe, and he often resorts to vague attempts at seemingly profound statements (see the end of almost every chapter for examples). Dyson is at his best when he describes the personalities of those who contributed to the project, but this doesn't really save the work as a whole.

The book is also suffused with Dyson's theory that the Internet is a living (and possibly sentient) creature. This kind of idiocy is annoying enough by itself, but it really has no place in a history of a computing project. The most concentrated dose of this is found in the latter half of the chapter on Nils Barricelli, so feel free to skip it -- Barricelli had some interesting ideas, but in the end I think he falls closer to the side of "crank" than "visionary".

Finally, the writing is often a bit tortured and hard to follow, and his biographies jump confusingly between various parts of the individual's life, seemingly whenever the author thought of something new to add. These sapped what was left of my will to finish the book, and had I not been on a transatlantic plane flight with no other source of entertainment I doubt I would have made it.

I hope someone comes along and does a better job of this, because the early history of computing is a story that deserves the same kind of treatment that the Manhattan Project got in Richard Rhodes's "The Making of the Atomic Bomb".
John Behle · March 4, 2013
In several reviews, this book has been called a nerd's labor of love. Okay, but it is also exceptionally well written. The sentences are crafted to keep pulling one in to the action. This is not a direct timeline book, though. Dyson introduces the players as they enter the drama of advancing computing.

It does not bog down in old techno-speak and specifications, though. Dyson sprinkles in the interesting facts just as needed. The massive 30-ton computers of the late 1940s did have over 17,000 vacuum tubes. Yes, there were programs with one million punch cards, with run times measured in weeks.

But the very smart people who wrote the programs and built those machines had fun, foibles and flaws. Dyson is quite the raconteur and weaves these personalities in to tell his tale.

He wraps up with how so many of the visions of the '40s and '50s have indeed become the way we share information. The ease of access and the freedom to contribute to a worldwide forum like goodreads is a stunning example of how far we have come.

Arthur Morey is a velvet hammer of a narrator. His distinct style and precise pronunciation added to the enjoyment of this book for me.
Tom Lee · December 28, 2012
I keep this photo over my desk at work. I think it looks a bit like a microscopic close-up of a drop of milk, or maybe a bacterial colony. In fact it's a shot of the Trinity Test, the planet's first atomic detonation. To me, this event and the context surrounding it are the most fascinating and amazing chapter in all of human engineering: in a panicked fight against evil, a collection of human intelligence was assembled that, through sheer intellectual might, wielded abstract mathematics and applied engineering to bend reality in an astounding new way. It's hard to imagine an engineering problem with such dizzying historical, moral and political consequences.

George Dyson has delivered a fascinating and flawed book that connects this project to my own profession: digital computing. He tells the story of Princeton's Institute for Advanced Study and its quest, led by the brilliant John Von Neumann, to build one of the world's first electronic computers, a project that was birthed by -- but, Dyson argues, destined to be even more transformative than -- the US military's atomic weapons program.

Dyson is the son of Freeman Dyson, and this is his greatest asset: he actually grew up among these brilliant minds. This puts him in an incredible position to describe what these men and women were like, and he does a very fine job. Von Neumann's own reticent, complicated brilliance leaves him something of an enigma, but Dyson conveys this well. And, although generally shorter, the portraits Dyson draws of Julian Bigelow, Alan Turing, Kurt Gödel, Stan Ulam and Klári von Neumann -- all of whom (astoundingly) were personally involved in this story in one way or another -- are fascinating, inspiring and heartbreaking.

But this book has problems.

I'll start with a quibble. The beginning is pretentiously overloaded: it's hard to imagine why a reader would want or need an explanation of William Penn's immigration and the events that led to George Washington once marching through what would become the IAS's backyard. But the real sins occur later in the book.

The book itself diagnoses Dyson's failings via a quote from Turing on page 263: "I agree with you about 'thinking in analogies,' but I do not think of the brain as 'searching for analogies' so much as having analogies forced upon it by its own limitations." Dyson doesn't take this limitation seriously. Having ably and charmingly described the creation of general-purpose digital computing, Dyson is incapable of critically evaluating the musings of its creators. Having built the a-bomb, these inventors -- understandably -- could be similarly oblivious as they applied computational metaphors to problems of biology and the mind.

These can be helpful conceptual frames, but Dyson is not equipped to see their limitations, or to acknowledge the modest returns they have yielded over a subsequent half-century of investigation. The stories he tells are instead about avenues of investigation cut short by untimely deaths, the military industrial complex or short-sighted academic administrators. He doesn't acknowledge that Barricelli's ideas about evolutionary algorithms, for instance, have continued to be studied, and have become a useful but far-from-universal (or life-creating!) technique.

By the end, Dyson has descended into mysticism. He misreads Turing's discussion of o-machines as a tragically unrealized proposal rather than a not-implementable thought experiment deployed for theoretical ends. He thinks search engines and "Web 2.0" are evidence of the kind of emergent properties associated with guesses about the spontaneous origins of consciousness. He points to the complexity of online social networks as exemplars of new forms of computation, subtly implying that this may have philosophical significance (without bothering to ask himself what, then, a market economy, postal system or beehive might be computing). He genuinely seems about half-convinced that extraterrestrials have transmitted themselves digitally into our computer networks, where they reside, hidden. On page 293 he visits with an elderly Edward Teller, to whom he explains this last theory. Teller, gently and sensibly, suggests that Dyson write a science fiction novel instead of his current project. It's good advice.

Certainly, others have made these mistakes before. One can hardly blame the creators of this incredibly powerful technology for optimistically imagining applications beyond its eventual reach. A great example comes in Chapter 9, which tells the story of the birth of computational meteorology. The meteorological status quo felt that their discipline was destined to remain an art; some upstarts felt that computational approaches were the path forward. The latter camp, with Von Neumann as their midwife, were thoroughly vindicated both in their own time and the decades since. Yet one ought to note Von Neumann's triumphalist predictions that, once weather systems could be predicted, manipulating them would be trivial. He predicted weather control and meteorological warfare! This has proven to have been a wild overestimation of that problem's tractability. Yet as the book progresses, Dyson takes the IAS staff's increasingly implausible speculations about consciousness and man's eventual subservience to machine and spins them out through his own, much-less-grounded imagination (Cory Doctorow thinks this is all great stuff, by the way).

The basic problem is that Dyson doesn't truly understand much of the technology he's writing about. His grasp of technical detail is often very good for a non-engineer. But he lacks the virtuosic comprehension that made the individuals at the heart of this story so remarkable, and which is a prerequisite for the kind of speculation he wishes to indulge in.

Let me be quick to add that I don't have that kind of virtuosity -- aside from a general lack of brilliance, the demotion of mathematics from computer science's essential foundation to its mere supplement (perhaps inevitable as CS's complexity has increased) arguably makes it much harder to achieve that kind of understanding these days -- but the big picture should be evident to anyone who's read even a little Hofstadter. The conceptual story here is about abstraction, the Unreasonable Effectiveness of Mathematics, and the doors opened by driving computation to a rate that can achieve results that are unattainable through more elegant and precise theoretical methods*. Instead Dyson often gets bogged down in meaningless errata about dimensionality, floating point arithmetic and whether data is represented spatially or temporally. The point of the story is that none of this matters! But Dyson doesn't grasp this. On pages 255 and 256, in particular, it becomes clear that the universality of the Turing Machine -- the whole point of the damn idea -- is lost on him.

This is a shame, and it makes much of the book's end a waste. Things really start to go off the rails in the chapter on Turing (the book is largely organized into chapters focusing on individuals, which proves to be a wise choice), though Engineer's Dreams, the chapter that follows it, is well worth reading for its portrait of Von Neumann's inability to confront death -- it's a highlight of the book -- even if it then descends into some of the book's most risible speculation. Klari Von Neumann's fate, explained at the book's very end, also packs emotional weight, though you'll have had to wade through a lot of nonsense to get there.

Still, this book was a pleasure, and I'm grateful to Dyson for his portrait of a remarkable time filled with remarkable people. Highly recommended as a history, but whatever you do don't take its analogizing and speculation seriously.

* to be fair, Dyson is actually very good on this last point
Profile Image for Ints.
780 reviews, 77 followers
January 9, 2016
I had had my eye on this book for a long time, ever since it appeared in 2012. The origins of computing machines have always interested me; I wanted to learn more about how modern computers arose from Turing's universal computer, what technologies were used in the early days, and, with luck, perhaps to understand how it all works at the hardware level. When I came across the book yet again in a Dubai bookshop, I gave in and bought it.

The book's blurb promises that the author gives an expansive and heartfelt account of the explosion of the digital universe in the aftermath of the Second World War, and that he illuminates the nature of electronic computers and the lives of the people who put their minds to making modern computers possible at all. In the 1940s and 1950s, John von Neumann gathered the brightest mathematicians and engineers at Princeton to realize Turing's idea of a Universal Machine. At first the digital universe was only five kilobytes in size (these days more resources go into displaying a single icon), but that was enough to simulate nuclear explosions and evolution, and to open the door to the digital revolution.

The lives of the people involved are covered in full; they are described down to the smallest detail. The author enthusiastically traces each new project member's origins back to their grandfather, notes whom they married, if anyone, and from time to time a wife gets lucky and her life story, too, is laid out over a couple of pages. For me this was the most boring, and the largest, part of the book. I have nothing against biographies, and perhaps it is my own fault for not reading the blurb carefully. These biographies killed off my enjoyment of reading; I am simply not interested in how von Neumann met his wife or how each of them fled Europe. In places the same facts have to be read a couple of times over. But if the lives of famous physicists and mathematicians are your weakness, you can indulge yourself here to your heart's content.

In parallel with all this runs a history of the land adjoining Princeton University. We learn in minute detail about the history of every shed, some of it dating back to the Revolution. If something new was built, the reader gets to find out who was against it and who was for it, how much it cost and by how much the original estimate differed from the final figure, who financed it, and which scientist had connections with the financier. Unless you read all of this wanting to see how the US defense establishment developed new weapons and was the main funder of computing technology, this section, too, is terribly boring.

Still, roughly a quarter of the book is thoroughly interesting, apart from the places where the author pushes claims plucked out of thin air or baselessly postulates that the internet is sentient. It genuinely tells the story of the early computing machines and the problems solved with them. In the days when everything ran on vacuum tubes, the main problem was how to build a reliable computing system out of many unreliable components. Memory posed an additional problem, and here the engineers used every trick they had, not even shying away from converting the digital signal into an acoustic one and then, using mercury, holding the result for a couple of milliseconds until the next cycle. The description of the basic principles does, however, have a choppy, disjointed quality. In places you even get the impression that the author himself does not quite grasp what he wanted to say.
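For readers curious how that trick worked, here is a minimal, purely illustrative Python sketch of a recirculating delay-line memory (my own toy model under simplifying assumptions, one bit per pulse period, not anything taken from the book): bits travel through a fixed acoustic delay and must be re-injected every cycle, or the data is lost.

```python
# Toy model of an acoustic delay-line memory (illustrative only).
# Bits are launched into the "mercury column" as pulses, emerge a fixed
# delay later, are re-amplified, and are fed back in; the data survives
# only as long as the recirculation loop keeps running.
from collections import deque

class DelayLineMemory:
    def __init__(self, n_bits: int):
        # The column holds the pulses currently "in flight".
        self.column = deque([0] * n_bits, maxlen=n_bits)

    def step(self, write_bit=None):
        """One pulse period: a bit emerges at the receiving end and is
        re-injected at the transmitting end (or overwritten when writing)."""
        out = self.column.popleft()
        self.column.append(out if write_bit is None else write_bit)
        return out

# Write an 8-bit word, then read it back after one full recirculation.
mem = DelayLineMemory(8)
word = [1, 0, 1, 1, 0, 0, 1, 0]
for b in word:
    mem.step(write_bit=b)
assert [mem.step() for _ in range(8)] == word
```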

On the other hand, everything is fine when it comes to what could be simulated with five kilobytes of memory. It is hard to believe, but the first meteorological models were built (which admittedly sometimes finished computing a forecast a couple of days after the date it was for), and the blast wave produced by a thermonuclear reaction, along with its effect on the bomb's structure and the world around it, was simulated well. A few enthusiasts tested the first digital creatures and were only a couple of steps away from creating a computer virus. Not to mention how the intermediate results of the simulations were read off a cathode-ray tube, punch cards, or magnetized wire.

Overall, 7 out of 10. I never quite figured out whether the author has any real idea how the first electronic computers worked, or whether he simply could not explain it in a way I could understand. Possibly I had worn myself out reading people's biographies and had no strength left for the facts about the machines, which as a rule were to be found at the ends of chapters. It is readable, and there are many interesting things in it, but it is all a bit diffuse and buried.
Profile Image for Will Ansbacher.
326 reviews, 93 followers
February 1, 2015
This is such a maddening book! Is it history? Is it biography? Is it science? Is it speculation? Well, that would be yes, yes, no and yes.

It’s not quite what I was expecting as it has little to say about Turing and his theories or the Colossus machine he is known for (although that’s my fault for not reading the blurbs). Rather it’s about the subsequent computer revolution that developed from it after WW2, and the ENIAC computer in particular.

But this book is not only about the mathematicians and computer engineers who built ENIAC; it's also about the environment in which it "grew up". Here's where it all gets maddeningly muddled. Turing's Cathedral mixes the historical and biographical rather haphazardly. Dyson brings in all the players (and I do mean ALL - like a Dostoyevsky novel, the book even begins with a 5-page list of "principal characters") – John von Neumann, his wife, Teller, Feynman and the rest, but it's not linear in time: John keeps popping up in earlier periods in later chapters; the real purpose of ENIAC, to perform calculations for the H-bomb project (which was of course secret at the time), is mentioned but isn't discussed until nearly midway through the book. It's preceded by a number of cover projects such as weather calculations and stellar astrophysics that really didn't need their own separate chapters, and especially not potted biographies of all the researchers involved, and their ancestors! Dyson even veers into speculation with a chapter on biology calculations and self-reproducing automata. That's quite apart from the chapters on the actual construction of ENIAC.

There are even longish chapters on the philanthropists who founded Princeton's Institute for Advanced Study, where ENIAC was built, and the claustrophobic and snobbish society that apparently enveloped Princeton. All moderately interesting but really quite irrelevant.

Topics are introduced out of sequence and abandoned abruptly, and the focus is often obscure – e.g., the chapter about the H-bomb is buried in one called Ulam's Demons as though the physicist Ulam was the principal player in the whole thing, and there is nothing in the chapter about the aftermath or morality of the bomb, the destruction of Eniwetok atoll and so on; it simply ends with an unrelated problem that Ulam posed.

The other maddening thing is that his explanations are such crap. Dyson quotes extensively from the people he interviewed but without interpreting any of it; in a way it’s much more like journalism than science writing. I don’t know if Dyson simply doesn’t understand much of what he was told or read about (the interviews and notes, at least, are extensive), or whether he thought his readers would not understand or find it too boring. But in this book about digital computing there isn’t even one numerical example. Instead there is endless, pointless speculation about whether computers really “think” or could reproduce.

There are only two places where Dyson seems to be in his element – one, in the extensive interviews he conducted over some ten years (he appears to be a very enthusiastic interviewer); and two, in a curious chapter where he waxes poetic about the Scandinavian physicist Alfven who wrote a forgotten speculative novel about computers taking over the world. The rest seems like Dyson bluffing his way through.

To me, this little excerpt says it all: in an interview with Teller, they inevitably drift from computers and the H-bomb to extraterrestrial life. The aging physicist asks Dyson what he thinks about it; afterwards, there is a long, critical pause while Teller contemplates Dyson’s answer, then he pronounces,
“Look ... instead of explaining this ... you should write a science fiction book about it”.

Profile Image for Raghu Chilukuri.
6 reviews, 4 followers
March 30, 2014
I have no idea why people claim this book is so bad. I agree the narration is non-linear, and possibly confusing, but it doesn't deserve all this flak.

I'm not sure if these ranting people understand the concept of non-linear story-telling. There are people who said "I'm not so technical, but..." and some are ready to burn the book for not explaining von Neumann architecture in detail. I remember the book mentioning the ability to store code and data in the same place (address space) -- I'm not sure how deep one should go to explain this to a spectrum of readers.
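To make that one sentence a little more concrete, here is a deliberately tiny stored-program machine in Python; the instruction encoding is my own invention for illustration, not the IAS order code or anything from the book. The point is simply that instructions and data are the same kind of thing, integers in one shared memory, so a program can read (and in principle rewrite) its own instructions.

```python
# A made-up stored-program toy machine: code and data share one memory array.
def run(memory):
    acc, pc = 0, 0
    while True:
        word = memory[pc]               # fetch: the same memory holds code and data
        op, addr = divmod(word, 100)    # decode: opcode in the high digits, address in the low
        pc += 1
        if op == 1:
            acc = memory[addr]          # LOAD
        elif op == 2:
            acc += memory[addr]         # ADD
        elif op == 3:
            memory[addr] = acc          # STORE (could just as well overwrite an instruction)
        elif op == 0:
            return acc                  # HALT

program = [
    110,                # 0: LOAD  memory[10]
    211,                # 1: ADD   memory[11]
    310,                # 2: STORE memory[10]
    0,                  # 3: HALT
    0, 0, 0, 0, 0, 0,   # 4-9: unused
    5,                  # 10: data
    37,                 # 11: data
]
print(run(program))     # prints 42
```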

Some people (including me) have reservations about the claims regarding cellular automata and the web being analogous to a cell, in the sense that it is an "analog" engine made of digital computers, just like DNA in a cell. But if readers are intelligent enough to dismiss the book as rubbish, I wonder why they cannot distinguish fact from speculation (speculation about possible applications of an idea, which is plentiful in pop-sci books).

People should remember that computers were mostly 'invented' to solve scientific computation problems like the hydrogen bomb, weather modelling, anti-aircraft radar, cipher breaking, etc., described in the book, and that this preceded the invention of transistors and the other knowledge that now seems 'obvious' or 'logical' to us. For that matter, the 'von Neumann architecture', which we now call the easy, logical, or perfect choice, wasn't entirely the obvious choice. In fact, several computer scientists were (are?) proposing alternative ideas. For example, see this paper by Turing Award winner John Backus, who worked on compiler theory: http://www.thocp.net/biographies/pape...

I found it good to study the lives of the scientists and engineers who were building the first-generation computers: their challenges, their internal rivalries, and their often conflicting aims. It is good to remember that these days most innovations are the result of group work, and gone are the days when one person locked himself in a lab and emerged days or years later with a theory or device. And I think this book made quite a good effort at presenting such details, even if it is not always a page-turner.

I liked this book pretty much overall, and would definitely recommend it to friends.
7 reviews
March 24, 2013
I enjoyed reading this, and learned several new things while doing so. The book is not at all about Alan Turing. If it is a biography of anybody, it is of John von Neumann; but really it is about the many people, centered around the IAS in Princeton, who played a role in early computer development. There is also a lot of discussion about the development of nuclear and thermonuclear weapons, as one of the first applications of electronic computing.

Two big downsides prevent me from rating this book higher:

1) Too much historical minutiae. This is my number one complaint about this genre, so maybe I'm more impatient than most. However, I was several times frustrated to find myself reading far too many pages about things like the pre-European history of the land on which the IAS would eventually be built, or the when, who, what, and how of the IAS dealing with housing shortages by relocating houses from elsewhere.

2) The author's apparent lack of understanding of the computer architectures he was describing. As a computer engineer, I was intrigued by many of the problems, ideas, and devices described. Unfortunately, for the ideas that were new to me I had to look elsewhere to learn anything meaningful, and the descriptions of concepts I was already familiar with I found to be somewhere between confusing and outright misleading. In light of this, I am highly suspicious of many of the author's claims in the latter sections of the book regarding, e.g., search engines as a return to analog computing, or cellphone "apps" as equivalent to Barricelli's "symbiogenesis" simulations. It is my impression that he has completely misunderstood and misrepresented these concepts.

Basically, don't read this for its technical content, and please don't take his interpretations too seriously. However, it is a thorough, well researched, and well written historical account of an exciting time of technology development.
Profile Image for Paul.
72 reviews, 6 followers
August 22, 2012
A disappointment, mainly due to a lack of coherent organization. Dyson assembles a great deal of information, anecdote, and explanation -- some of it fascinating and engaging, but not all of it lucid -- without providing the necessary connective tissue. The reader is left to do the author's work. This baggy enterprise made me go back to a book by Steve Heims called JOHN VON NEUMANN & NORBERT WIENER: From Mathematics to the Technologies of Life and Death, published some thirty years ago and now sadly out of print. It was a lot clearer.
Profile Image for Book Shark.
772 reviews, 147 followers
March 9, 2012
Turing's Cathedral: The Origins of the Digital Universe by George Dyson

"Turing's Cathedral" is the uninspiring and rather dry book about the origins of the digital universe. With a title like, "Turing's Cathedral" I was expecting a riveting account about the heroic acts of Alan Turing the father of modern computer science and whose work was instrumental in breaking the wartime Enigma codes. Instead, I get a solid albeit "research-feeling" book about John von Neumann's project to construct Turing's vision of a Universal Machine. The book covers the "explosion" of the digital universe and those applications that propelled them in the aftermath of World War II. Historian of technology, George Dyson does a commendable job of research and provide some interesting stories involving the birth and development of the digital age and the great minds behind it. This 432-page book is composed of the following eighteen chapters: 1.1953, 2. Olden Farm, 3. Veblen's Circle, 4. Neumann Janos, 5. MANIAC, 6. Fuld 219, 7. 6J6, 8. V-40, 9. Cyclogenesis, 10. Monte Carlo, 11. Ulam's Demons, 12. Barricelli's Universe, 13. Turing's Cathedral, 14. Engineer's Dreams, 15. Theory of Self-Reproducing Automota, 16. Mach 9, 17. The Tale of the Big Computer, and 18. The Thirty-ninth Step.

Positives:
1. A well researched book. The author faces a daunting task of research but pulls it together.
2. The fascinating topic of the birth of the digital universe.
3. A who's who of science and engineering icons of what would eventually become computer science. The list of principal characters was very welcome.
4. For computer lovers who want to learn the history of the pioneers behind digital computing, this book is for you.
5. Some facts will "blow" you away, "In March 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth".
6. Some goals are counterintuitive. "The new computer was assigned two problems: how to destroy life as we know it, and how to create life of unknown forms".
7. There are some interesting philosophical considerations.
8. As an engineer, I enjoy the engineering challenges involved with some of their projects.
9. Amazing how the Nazi threat gave America access to some of the greatest minds. The author does a good job of describing these stories.
10. The fascinating life of the main character of this book, John von Neumann.
11. So much history interspersed throughout this book.
12. The ENIAC, "a very personal computer". A large portion of this book is dedicated to the original computer concepts, challenges, parts, testing, etc.
13. The fundamental importance of Turing's paper of 1936. It's the inspiration behind the history of the digital universe.
14. Some amusing tidbits here and there, including Einstein's diet.
15. The influence of Godel. How he set the stage for the digital revolution.
16. Blown away by Leibniz. In 1679, yes, that is correct, 1679, he had already imagined a digital computer with binary numbers...
17. So many great stories of how these great minds attacked engineering challenges. Computer scientists will get plenty of chuckles with some of these stories involving the types of parts used in the genesis of computing. Vacuum tubes as an example.
18. There are many engineering principles devised early on that remain intact today. Many examples, Bigelow provides plenty of axioms.
19. I enjoyed the stories involving how computers improved the art of forecasting the weather.
20. "Filter out the noise". A recurring theme and engineering practice that makes its presence felt in this book.
21. Computers and nuclear weapons.
22. The Monte Carlo method, a new and key domain in mathematical physics, and its invaluable contribution to the digital age (see the short sketch after this list).
23. The fascinating story of the summer of 1943 at Los Alamos.
24. The Teller-Ulam invention.
25. How the digital universe and the hydrogen bomb were brought into existence simultaneously.
26. Barricelli and an interesting perspective on biological evolution.
27. The amazing life of Alan Mathison Turing and his heroic contributions.
28. A fascinating look at the philosophy of artificial intelligence and its future.
29. The collision between digital universe and two existing stores of information: genetic codes and information stored in brains.
30. The basis for the power of computers.
31. The five distinct sets of problems running on the MANIAC by mid-1953. All in JUST 5 kilobytes.
32. A look at global digital expansion and where we are today.
33. The unique perspective of Hannes Alfven. Cosmology.
34. The future of computer science.
35. Great quotes, "What if the price of machines that think is people who don't?"
36. The author does a great job of providing a "where are they now" narration of all the main characters of the book.
37. Links worked great.
38. Some great illustrations in the appendix of the book. It's always great to put a face on people involved in this story.
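Since the Monte Carlo method is mentioned in item 22 above, here is the classic textbook illustration of the idea, my own sketch rather than anything from the book: estimate pi by scattering random points over the unit square and counting how many land inside the quarter circle.

```python
# Monte Carlo estimate of pi (illustrative sketch).
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(1_000_000))   # typically prints a value close to 3.14
```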

Negatives:
1. It wasn't an enjoyable read. Plain and simple this book was tedious to read. The author lacked panache.
2. The title is misleading. The title is a metaphor inspired by Google's headquarters in California. The author, who was given a glimpse inside that organization, sensed Turing's vision of a gathering of all available answers and possible equations mapped out in this awe-inspiring facility. My disappointment is that this book, despite being inspired by Alan Turing's vision, in fact has only one chapter dedicated to him. The main driver behind this book was really John von Neumann.
3. A timeline chart would have added value. With so many stories going back and forth it would help the reader ground their focus within the context of the time that it occurred.
4. Some of the stories really took the scenic route to get to the point.
5. The photos should have been included within the context of the book instead of a separate section of its own.
6. The book was probably a hundred pages too long.

In summary, I didn't enjoy reading this book. The topic was of interest to me, but between the misleading title and the very dry prose, the book became tedious and not intellectually satisfying. The book felt more like a research paper than a book intended for a general audience. For the record, I am an engineer, and a lot of the topics covered in this book are near and dear to my heart, but the author was never able to connect with me. This book is well researched and includes some fascinating stories about some of the icons of science and the engineering involved in the origins of the digital age, but I felt like I was reading code instead of a story. This book will have a limited audience; if you are an engineer, scientist, or in the computer field this book may be of interest, but be forewarned, it is a monotonous and uninspiring read.

Recommendations: "Steve Jobs" by Walter Isaacson, "The Quantum Universe" by Brian Cox, "The Physics of the Future" by Michio Kaku, "Warnings: The True Story of How Science Tamed the Weather" by Mike Smith, "Spycraft" by Robert Wallace and H. Keith Melton.
Profile Image for John Gribbin.
157 reviews, 104 followers
August 23, 2013

The title of George Dyson’s latest book about the scientists who worked at the Institute for Advanced Study in Princeton during its glory days is a little misleading; the story is not so much about Alan Turing, the man who came up with the idea of the modern computer, but John von Neumann, who did more than anyone else to make it a practical reality. No matter; like Dyson’s previous books, thus is a glorious insight into how science -- in this case, computer science -- was done at Princeton in the middle decades of the twentieth century. It is as much a story of people and personalities as a story of the science which they were involved in, and you do not need any knowledge of computers or mathematics to enjoy the ride.

This time, Dyson sets the historical context of the origin of the Institute itself, before the story proper takes off with the dramatic early life of Hungarian-born von Neumann, leading up to his arrival in Princeton in 1931, one of the first major scientists to see the way the political wind was blowing in Europe and get as far away from the Nazi threat as possible. In the process, Hungarian Janos became “Johnny” to his friends and colleagues; he became a US citizen in 1937.

By then, Turing had published his paradigm-shifting paper “On Computable Numbers”, which spelled out the basis of the modern computer, in terms of what we now call hardware and software. He even established the possibility of a “universal computer”, now called a Turing Machine, which could simulate the activity of any specialist computer, using different sets of software. This is exactly what my iPhone does. It can be a phone, a TV, play chess, solve certain kinds of mathematical problems, and do many other things. It can even do things its designers never thought of, as when an outside programmer devises a new app. Most people in the developed world now own a Turing Machine, less than eighty years after the publication of “On Computable Numbers”.
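That universality is easy to demonstrate today. The sketch below is a minimal Turing-machine simulator of my own (a toy, not Turing's notation): the simulator is the fixed "hardware", the transition table you hand it is the "software", and swapping tables turns the same machine into a different specialist computer.

```python
# A minimal Turing-machine simulator (illustrative sketch).
def run_tm(table, tape, state="start", blank="_", max_steps=10_000):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = table[(state, symbol)]   # look up the "program"
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# One piece of "software": a machine that flips every bit on the tape.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(invert, "10110"))   # prints 01001_
```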

Turing visited Princeton in the 1930s, and even completed a PhD there, although by then he hardly needed one. His own future in computing famously involved working at Bletchley Park in World War Two as a key (perhaps the key) member of the team that cracked the German Enigma code. This led on to post-war work on computers, crippled by a lack of funding, and his untimely death.

In the US, the impetus for developing the electronic computer, using Turing’s ideas which von Neumann picked up and ran off with, came from the effort to develop the hydrogen bomb. As a bonus to the main thread of his history of computing, Dyson provides one of the clearest succinct accounts of the machinations involved in that project that I have ever seen. The outcome was that for that project, in that country, funding was not a problem, and Britain was soon left far behind in the art of manufacturing fast, number-crunching machines. There is an element here of other familiar tales. Turing’s approach was more elegant, and would probably have led to more versatile machines more quickly, given proper support; the Princeton (especially, the von Neumann) approach was brute force, succeeding by building bigger and faster without necessarily being better.

The cathedral of Dyson’s title comes in to the book late on, and in the context of a statement made by Turing in 1950, concerning the idea of machines that think:
In attempting to construct such machines we should not be usurping His power of creating souls, any more than we are in the procreation of children: rather we are, in either case, instruments of His will providing mansions for the souls that He creates . . .

Slightly curious terminology, coming from an atheist, but highlighting Turing’s belief that computers would one day become self-aware and able to think in the same way that we do -- or better. As Turing’s wartime assistant Jack Good once put it, “the ultraintelligent machine . . . is a machine that believes people cannot think.”

One of the standard tests which may one day convince people that a machine can think was devised by Turing and is called the Turing Test. In this, a human investigator communicates with a machine and another person by some impersonal means (perhaps, these days, by email) and has to decide which is which by posing a series of questions. No computer has yet passed the test, but the heirs of Turing and von Neumann believe it is only a matter of time.

How much time? Dyson points out that the first transistor radio, sold in 1954, contained just four transistors. Today, after allowing for inflation, for the same cost as that radio you can buy a computer with the equivalent of a billion transistors (which, among other things, will simulate a radio). And now, people are developing computers based on quantum principles, as far in advance of “classical” computers as the classical computer is in advance of the abacus. As the man said, you ain’t seen nothin’ yet. If you want to be mentally prepared for the next revolution in computing, Dyson’s book is a must read. But it is also a must read if you just want a ripping yarn about the way real scientists (at least, some real scientists) work and think.

This first appeared in the Literary Review
Profile Image for May Ling.
1,074 reviews, 286 followers
May 24, 2021
Summary: Great book on the genesis of the computer with a cast of characters that any math or physics nerd will find to be all-star. Even Einstein shows up.

I really enjoyed hearing the actual story of how the computer came to be, before and after the world wars that drove it in a particular direction and toward a particular set of use cases. You also get the early days of IBM and RCA.

A few other reflections after letting it digest. The book is incredibly well researched, and Dyson really seems uniquely able to tell it. I'm giving it 5 stars because I think it deserves a higher rating; it's unique in its genre, i.e., the history of technology. I think more techies should read up on the beginnings of what happened and why. Currently, it's not considered quite relevant to the field, which I believe is a mistake.

p. 5: "Being digital should be more interesting than being electronic," Turing pointed out.

p. 7: Jack Rosenberg mentions: "All I was told was that what Metropolis came out for was to calculate the feasibility of a fusion bomb ... that's all I knew. And then I felt dirty. And Einstein said, 'That's exactly what I thought they were going to use it for.' He was way ahead."
p. 45: Theory of Games and Economic Behavior ... von Neumann and Morgenstern detailed how a reliable economy can be constructed out of unreliable parts, placing the foundations of economics, evolution, and intelligence on common mathematical ground. "Unifications of fields which were formerly divided and far apart ..."

p. 96 – A little before here, they talk about Godel trying to get out of Austria, then Germany, and the issues with the passport office. Even though the tone is missing, when you add up how much money and effort it took to get that guy here, you get a sense of how important he was. A very interesting, detailed story of what happened.

p. 253 ... Just before this they are talking about the idea of an infinite-possibility machine being created out of the finite. Then they give an example of what must have looked like infinity at that point in time. It's a very interesting concept to explore further, and another spin on how to think about Turing's dream for the computer.

p. 314 “The Great Disaster was caused not by the Big Computer but by human beings unable to resist subverting this power to its own ends.” The context of this is a popular sci fi story about how the computer then reboots using backup data that is on like Mars. Yup. Before there was the Matrix, there was the Matrix.

