Code: The Hidden Language of Computer Hardware and Software

What do flashlights, the British invasion, black cats, and seesaws have to do with computers? In CODE, they show us the ingenious ways we manipulate language and invent new means of communicating with each other. And through CODE, we see how this ingenuity and our very human compulsion to communicate have driven the technological innovations of the past two centuries.

Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines.

It’s a cleverly illustrated and eminently comprehensible story—and along the way, you’ll discover you’ve gained a real context for understanding today’s world of PCs, digital media, and the Internet. No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.

400 pages, Paperback

First published September 29, 1999

About the author

Charles Petzold

134 books, 183 followers
Charles Petzold has been writing about programming for Windows-based operating systems for 24 years. A Microsoft MVP for Client Application Development and a Windows Pioneer Award winner, Petzold is author of the classic Programming Windows, currently in its sixth edition and one of the best-known programming books of all time; the widely acclaimed Code: The Hidden Language of Computer Hardware and Software; and more than a dozen other books.

Ratings & Reviews

Community Reviews

5 stars: 5,246 (55%)
4 stars: 3,023 (31%)
3 stars: 986 (10%)
2 stars: 194 (2%)
1 star: 48 (<1%)
Displaying 1 - 30 of 782 reviews
Craig
8 reviews, 26 followers
August 14, 2011
I'll be honest. I only read this book because it was cited as a must-read by Joel Spolsky in a Stack Exchange answer about how to go about learning programming (and finding out whether you want to, or should, be a programmer).

I was a little hesitant due to the year of release; at some eleven years old, that's a lot of time in the tech world. Ultimately, though, that doesn't matter. I defy any developer/programmer/system builder to read this book and not blitz through it, lapping it up. Yes, if you've done some schooling in computing or computer science you may already be familiar with much of the content, but you'll surely find things you've either not thought about in much depth before or that weren't explained in quite the elegant way that Petzold manages. For me, whether it was due to age, experience, or maturity gained through both, I found it filled gaps in my memory and indeed gaps in my student course material.

Petzold opens up the world of computing through a concise, linear storytelling format. Starting from a basis in Morse code and Braille, he moves through the telegraph system, barcodes, Boolean logic, circuits with memory, von Neumann machines, peripherals, I/O devices and GUI interfaces, until we just about catch up to the modern era with talk of HTTP and the World Wide Web, having pretty much built the systems under discussion (or simplified versions of them) in the incremental circuit and system diagrams along the way.

Admittedly there are some rather 'of their time' phrases and facts that raise a smile (low resolutions, high costs for 'small' HD storage sizes, consumers using cassette tapes), but this is all still valid information when taken in the context of the time of writing.

If you are a developer or programmer you're not going to go into work having had an epiphany about how to do things better, but you may have a new-found respect for what you're doing and the many, many ingenious shoulders you are standing upon.
114 reviews
March 15, 2013
My opinion on this book is really divided: on the one hand I enjoyed some chapters, on the other hand I could hardly restrain myself from flipping past others. Basically, this book designs and builds a basic computer, introducing in each chapter a concept or a technology used inside computers. It was written between 1987 and 1999, so one shouldn't expect any description of the newest technologies.

It starts really slowly with the first chapters, but then things get more and more complicated. One of the things that bothers me about this book is the difference in complexity between chapters. Some chapters can be easily understood by a junior-school or high-school student, while some of the later chapters bring back bad memories of electronic circuits from my engineering-school years. For example, a whole chapter is dedicated to explaining how to communicate with your neighbour using a flashlight, another chapter tackles the same issue with light bulbs and electrical wires, whereas all the gates, or all the flip-flops, are dealt with in a single chapter. I admit I have never been either fond of or good at electrokinetics, and I confess I didn't try to understand how all the electronic circuits of these later chapters work. I guess these chapters mostly interest hard-core computer enthusiasts, but don't they already know this stuff?

Besides, a few chapters are a little boring: a whole chapter describing every opcode of the Intel 8080, come on! Does the decimal system really deserve a whole chapter? In my opinion, decimal and alternative number systems should have been presented in a single chapter instead of two.
Moreover, the huge difference in complexity leads to some contradictions. The binary number system is so well described that a high-school student can easily understand it, and binary addition and subtraction are covered in great detail, but multiplication is done with a simple, inefficient loop! In my opinion, it would have been worthwhile to present at least a more efficient version based on the binary representation of the multiplicand, as well as to introduce exponentiation by squaring (a.k.a. square-and-multiply or binary exponentiation).
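
To make that point concrete, here is a quick sketch (mine, not the book's) of shift-and-add multiplication and of exponentiation by squaring; both walk the bits of one operand instead of looping once per unit of its value:

    def multiply(a, b):
        """Shift-and-add: add a shifted copy of a for each set bit of b."""
        result = 0
        while b:
            if b & 1:
                result += a
            a <<= 1            # the next bit of b is worth twice as much
            b >>= 1
        return result

    def power(x, n):
        """Exponentiation by squaring (square-and-multiply)."""
        result = 1
        while n:
            if n & 1:
                result *= x
            x *= x
            n >>= 1
        return result

    print(multiply(13, 11), power(3, 10))   # 143 59049
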
Additionally, I think that Charles Petzold tries to explain in too much detail how each part works, so that readers with less technical knowledge can understand, but in the end I guess these readers get lost or confused by so many details anyway, while a few technical references are missing. For instance, both the Von Neumann and Harvard architectures are effectively described, but I don't recall the terms themselves ever being mentioned.

Nevertheless, I really liked it when the author gives historical anecdotes or references. The chapters I enjoyed the most are the ones where Charles Petzold gives readers some background history to introduce a concept or technology (for instance, Morse code and Braille, Bell's telephone, the invention of telegraph relays, or the evolution of transistors, chips and programming languages).

In the end, I find it a bit ironic that, for this book, most of the interesting chapters are actually the less technical ones. Moreover, given the big difference in the knowledge required to understand each chapter, I don't think anyone is likely to understand, or find interesting, every chapter.
Always Pouting
576 reviews, 884 followers
March 24, 2022
I got this book because a friend was doing a book club, but then I forgot about the book club. I decided to read it now so I could at least pretend I was being productive while reading during work. I actually learnt a lot and now have context for a lot of things that I didn't understand before. I feel like knowing the history/development of any subject just helps make it more interesting and explains why things might be a certain way. I don't know that I'd call this good casual reading, though. I did fall asleep quite a few times, but that might be because I like reading in bed under the covers. I kind of want to read something similar but for the period since this book was published, if anyone has any recommendations.
4 reviews, 25 followers
October 13, 2012
Raise your hand if you think metaphors and analogies should be used sparingly. I'll raise my hand with you. This book is for us.

After reading this book, I can see behind the pixels on my computer screen. I know what I'm really looking at. So many layers of abstraction are removed by learning about how logic gates can be arranged as processors and RAM, how code is simply a representation of those microscopic switches being flipped, and how pixels are simply a graphical interpretation of the state of particular switches. Moreover, I also have a little bit of an understanding of the historical evolutions these inventions and conventions went through: not just how computers work, but why they work that way and how they came to be.

The book was tougher to grasp than I thought it would be (I do not have an extensive background in electronics or programming). Although it started off easily, it became progressively more complicated except for the last chapter or two. Of course, this was to be expected, as the book began with the basic building blocks of a computer, and built progressively more complicated systems from those initial components. However, the problem wasn't really a result of the subject matter, but of the writing style, which seemed to grow more terse in later chapters. I was left with the impression that the author felt he was running out of space, which I'm sure he was; it must be difficult to keep a book with such a vast scope to a manageable size and prevent it from turning into a reference manual. I would characterize this book as grueling, but that might be because I was obstinate in making sure I fully understood every detail of every page. There were a few pages that I had to pore over repeatedly until I reached a eureka moment. A few more explanatory sentences here and there would have alleviated this, but ultimately, drawing my own conclusions was very rewarding. The book seemed to recover from its gradually adopted terseness with an appreciated but sudden reference to the first chapter in the very last sentence. Someone less focused and more inclined to skim might find this book to be a bit lighter reading, but it still only took me a few days to read the whole thing.

I was surprised to see that the book did not really cover how transistors work at the electron level, which leaves what I consider to be a major gap in any understanding of how modern computers based on integrated circuits work. The text says that transistors are functionally equivalent to electromechanical relays or vacuum tubes and work similarly, but hardly any more than that. This missing knowledge is something that would have been appreciated and wouldn't have taken up much space. It seems like an especially glaring omission when juxtaposed with the inclusion of a few pages on EBCDIC, an obsolete alternative to ASCII text codes descended from paper punch cards.

Despite these minor gripes, this is a really great book, and I highly recommend it to anyone who has the interest and persistence to get through it. It teaches and ties together many mathematical and electrical concepts, and the payoff for the reader is a new perspective on computing. Despite being first published in 1999, it hardly seems dated at all, probably because it's really a history book and most of the computing history it covers happened in the 1980s and earlier. All computing history after that is basically just increasingly complex variations on those simpler foundations. A sequel would be welcome.

P.S. I think I've discovered a typo in the assembly code program on page 322. It seems to me that there should be an additional "AND A,0Fh" after the four lines of "RRC" and before the first "CALL NibbleToAscii" line. If I'm wrong, would anyone mind explaining why? And if I'm correct, would anyone mind giving me peace of mind by confirming this? Thanks! :)
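
For anyone trying to follow the reasoning in that P.S., here is a small Python sketch of the nibble-to-ASCII conversion in question. It is an illustration of the reviewer's point, not the book's actual 8080 listing, and the rrc and nibble_to_ascii helpers are my assumptions about what that code does:

    # Illustrative sketch, NOT the book's listing. On the 8080, RRC rotates
    # the accumulator right circularly, so after four RRCs the original high
    # nibble sits in the low four bits, but the original low nibble now
    # occupies the high four bits -- hence the suggested AND with 0Fh.
    def rrc(a):
        """Rotate an 8-bit value right by one bit (8080-style RRC)."""
        return ((a >> 1) | ((a & 1) << 7)) & 0xFF

    def nibble_to_ascii(n):
        """Convert a value 0-15 to its ASCII hex digit (assumes n < 16)."""
        return ord('0') + n if n < 10 else ord('A') + n - 10

    a = 0xB7                      # byte to display as "B7"
    high = a
    for _ in range(4):            # the four RRC instructions
        high = rrc(high)
    print(hex(high))                            # 0x7b: top bits are polluted
    print(chr(nibble_to_ascii(high & 0x0F)))    # masking with 0Fh gives 'B'
    print(chr(nibble_to_ascii(a & 0x0F)))       # low nibble gives '7'
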
Yevgeniy Brikman
Author of 4 books, 655 followers
January 20, 2018
Every single person in tech should read this book. Or if you're just interested in tech. Or if you just want a basic appreciation of one of the most important technologies in human history—the computer.

This book contains the best, most accessible explanation I've seen of how computers work, from hardware to software. The author manages to cover a huge range of topics—electricity, circuits, relays, binary, logic, gates, microprocessors, code, and much more—while doing a remarkable job of gradually building up your mental model using lots of analogies, diagrams, and examples, so just about everyone should be able to understand the majority of the book, and gain a deep appreciation of what's really happening every time you use your laptop or smartphone or read this review online.

I wish I had this book back in high school and college. I've been coding for 20 years and I still found a vast array of insights in the book. Some of the topics I knew already, and this book helped me appreciate them more; others, I knew poorly, and now understand with better clarity; still others were totally new. A small sampling of the insights:

* Current is the number of electrons flowing past a point per second. Voltage is a measure of potential energy. The resistance is how much the substance through which electricity is flowing resists the passage of those electrons. The water/pipes analogy is great: current is similar to the amount of water flowing through a pipe; voltage is similar to the water pressure; resistance is similar to the width of the pipe. I took an E&M physics course in college and while I learned all the current/voltage/etc equations, I never got this simple, intuitive understanding of what it actually means!

* We use base 10 because we have 10 fingers; a "digit," after all, is just a finger (so obvious when you actually take a second to think about it!). Had we been born with 8 fingers, like most cartoon characters, we'd probably use base 8 math. Computers use base 2 because building circuitry based on two states—the presence or absence of voltage (on and off, 1 or 0)—is much easier than circuitry based on ten states.

* The notation we use in math is essential. It's not about looking pretty or not, but actually making the math easier or harder. For example, addition and subtraction are easy in Roman numerals but multiplication and division are much harder. Arabic numerals make multiplication and division much easier, especially as they introduce a 0. Sometimes in math, you switch to different coordinate systems or different geometries to make solving a problem easier. So it's no surprise that different programming languages would have the same properties: while any language can, in theory, solve the same problems as any other, in practice, some languages make certain problems much easier than others.

* This book does a superb job of showing how logic gates (AND, OR, etc) can be built from simple physical circuits—e.g., from relays, which are much easier to imagine and think about than, for example, transistors—and how easy it is to do math with simple logic gates. I remember learning this back in college, but it still amazes me every time I see it, and with the crystal-clear examples in this book, I found myself smiling when I could picture a simple physical circuit of relays that could do arithmetic just by entering numbers with switches and passing some electricity through the system (e.g., to add, you have a sum and a carry, where the sum is an XOR and the carry is an AND; see the half-adder sketch just after this list).

* The explanation of circuits that can "remember" (e.g., the memory in your computer) was superb and something I don't remember learning at all in college (how ironic). I love the idea that circuits with memory (e.g., latches) work based on a feedback mechanism: the output of the circuit is fed back into the same circuit, so if it gets into one state (e.g., on, because electricity is flowing through it), that feedback mechanism keeps it in that state (e.g., by continuing the flow of electricity through it), effectively "remembering" the value. And all of this is possible because it takes a finite amount of time for electricity to travel through a circuit and for that circuit to switch state. (There's a small latch simulation just after this list.)

* The opcodes in a CPU consist of an operation to perform (e.g., load) and an address. You can write assembly code to express the opcodes, but each assembly instruction is just a human-friendly way to represent an exactly equivalent binary string (e.g., 32 or 64 binary digits in modern CPUs). You can enter these opcodes manually (e.g., via switches on a board that control "on" and "off") and each instruction becomes a high or low voltage. These high and low voltages pass through the physical circuitry of the CPU, which consists of logic gates. Based purely on the layout of these logic gates, voltage comes out the "other end," triggering new actions: e.g., they may result in low and high voltages in a memory chip that then "remembers" the information (store) or returns information that was previously "remembered" (load); they may result in low and high voltages being passed to a video adapter that, based on the layout of its own logic gates, results in an image being drawn on a screen; or they may result in low and high voltages being fed back into the CPU itself, resulting in it reading another opcode (e.g., perhaps from ROM or a hard drive, rather than physical switches), and repeating the whole process again. This is my lame attempt at describing, end-to-end, how software affects hardware and results in something happening in the real world, solely based on the "physical layout" of a bunch of circuits with electricity passing through them. (There's a toy fetch-decode-execute loop just after this list that tries to capture the same end-to-end flow in a few lines of code.) I think there is something magical about the fact that the "shape" of an object is what makes it possible to send emails, watch movies, listen to music, and browse the Internet. But then again, the "shape" of DNA molecules, plus the laws of physics, is what makes all of life possible too! And, of course, you can't help but wonder what sort of "opcodes" and "logic gates" are used in your brain, as your very consciousness consists entirely of electricity passing through the physical "shape" of your neurons and the connections between them.
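
A minimal sketch of the half adder from the XOR/AND bullet above, in Python rather than relays (my illustration, not the book's circuits):

    def half_adder(a, b):
        """Add two bits: the sum is XOR, the carry is AND."""
        return a ^ b, a & b              # (sum_bit, carry_bit)

    def full_adder(a, b, carry_in):
        """Chain two half adders (plus an OR) to handle a carry in."""
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2

    print(half_adder(1, 1))              # (0, 1): 1 + 1 = binary 10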
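
And a toy model of the feedback idea behind latches, again my own sketch: two cross-coupled NOR gates hold their last state, which is all "remembering" means here. Real hardware settles continuously; the small loop below just iterates until the feedback is stable.

    def nor(x, y):
        return 1 - (x | y)                   # NOR of two bits

    q, q_bar = 0, 1                          # assume a known starting state

    def latch(set_bit, reset_bit):
        """One step of an S-R latch made of two cross-coupled NOR gates."""
        global q, q_bar
        for _ in range(2):                   # let the feedback loop settle
            q = nor(reset_bit, q_bar)
            q_bar = nor(set_bit, q)
        return q

    print(latch(1, 0))   # set   -> 1
    print(latch(0, 0))   # hold  -> still 1 (the circuit "remembers")
    print(latch(0, 1))   # reset -> 0
    print(latch(0, 0))   # hold  -> still 0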
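
Finally, the toy fetch-decode-execute loop promised in the last bullet. The two-byte instruction format and the opcode numbers here are invented for illustration; they are not the Intel 8080 encoding the book actually uses, but the shape of the loop is the same: fetch an instruction, let its bits decide what happens, repeat.

    # Hypothetical encoding, not the book's: 1=LOAD, 2=ADD, 3=STORE, 0=HALT.
    memory = {0x10: 7, 0x11: 35}          # data: two numbers to add
    program = [1, 0x10,                   # LOAD  A, [0x10]
               2, 0x11,                   # ADD   A, [0x11]
               3, 0x12,                   # STORE [0x12], A
               0, 0x00]                   # HALT

    pc, acc = 0, 0                        # program counter and accumulator
    while True:
        opcode, addr = program[pc], program[pc + 1]   # fetch
        pc += 2
        if opcode == 1:                   # decode and execute
            acc = memory[addr]
        elif opcode == 2:
            acc = (acc + memory[addr]) & 0xFF
        elif opcode == 3:
            memory[addr] = acc
        else:                             # HALT (or anything unrecognized)
            break

    print(memory[0x12])                   # 42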

There are a few places the book seems to go into a little too much detail—e.g., going over all the opcodes of a specific Intel CPU—and a few places where it seems to skip over all the important details—e.g., the final chapter on modern software and the web—but overall, I have not found another book anywhere that provides as complete of a picture of how a computer works. Given the ubiquity of computers today, I'd recommend this book to just about everyone. It'll make you appreciate just how simple computers really are—and how that simplicity can be used to create something truly magical.

As always, I've saved a few of my favorite quotes from the book:

A computer processor does moronically simple things—it moves a byte from memory to register, adds a byte to another byte, moves the result back to memory. The only reason anything substantial gets completed is that these operations occur very quickly. To quote Robert Noyce, “After you become reconciled to the nanosecond, computer operations are conceptually fairly simple.”

The first person to write the first assembler had to hand-assemble the program, of course. A person who writes a new (perhaps improved) assembler for the same computer can write it in assembly language and then use the first assembler to assemble it. Once the new assembler is assembled, it can assemble itself.
Igor Ljubuncic
Author of 17 books, 250 followers
June 28, 2018
This is a great book. Surprisingly interesting.

While the subject matter is not new to me - far from it - the way the author goes about telling the story of how modern computers came to life is exciting, engaging and fun. He starts with Morse code and Braille, talks about the principles of mathematics and information, explains the critical concept of switches, and finally moves into the world of circuit boards and binary data, culminating in an ALU. After that, he discusses the analytical and computational engines and machines developed through the late 19th and early 20th centuries, before we finally start seeing the modern computer around the 1940s, with Turing and von Neumann laying down the foundations of what we know and use today.

The book is really cool because it's also a nostalgic trip down memory lane. Charles mentions the famous Bell Labs, the legendary Shannon, Ritchie, Noyce, Moore, UNIX, the C language, and other people and concepts without which we would not be sitting here, writing reviews on Goodreads. Or we might, but the fundamentals of the computing devices would be completely different.

Computers sound like magic, but the thing is, they are a culmination of 150 years of electric progress, 200 years of data/information progress, and about 350 years of math progress. The first boards, the first programs, the first assembler and the first compiler, they were all written by hand. Control signals are still essentially the same, and if you look at a typical x86 Intel processor, the legacy support for machine instructions goes back to the first microprocessor. The problem is, when you condense the centuries of hard work into a cool, whirring appliance, it does feel like magic.

The author wrote the book in the late 80s and then revised it in the late 90s, so some of the stuff may look quaint to us, like the mention of floppy disks, VGA displays and such. But then he also shows uncanny foresight around overall information exchange, because the information principles are universal, and he correctly predicted that Moore's Law would taper out around 2015.

He also cheated a little.

He described the flip-flop as a kind of perpetuum mobile, which can be sort of excused, and he also skimmed over oscillators and transistors (and did not mention capacitors), but then those are fairly complex topics, and I guess it's not really possible to cover them without going deep into physics and electrical engineering. Excusable, because the book is compelling and delightful.

Even if you have a PhD in Physics from a top university or have done computer science all your life, can rap in ASM and name all LoTR characters by heart, this is still a good read. Do not feel like you'd be highschooling yourself with silly analogies. Far from it. This is a splendid combo of history, technology, mathematics, information, and nostalgia.

Highly recommended,
x49 x67 x6F x72
Mark Seemann
Author of 4 books, 451 followers
September 3, 2022
I'll start with my review of the second edition, and keep my original review of the first edition below.

My editor at Pearson contacted me in late 2021 asking me if I'd like to do a technical review of Code, 2nd edition. I loved the first edition (see original review, below), so I didn't hesitate much, even though doing a technical review of a manuscript is much more work than 'just' reading a book.

While 'rereading' the manuscript, I was constantly struck by how well organised it is. The examples are clear, and concepts are organised so that each new level of abstraction follows naturally from what you've just read. You might say that all technical books should be organised like that, but they aren't. As an author myself I can say that doing that is extraordinarily difficult. That Charles Petzold pulls it off with such seeming ease, on such a complex topic, impresses me to no end.

The second edition is reorganised and expanded compared to the first edition, and while the first edition was great, the second edition is a significant improvement. It really is a tour de force, showcasing both Petzold's technical chops and his immense skill as an educator.

The book still explains, from first principles, how a computer works, up to and including how machine code works. If you're the least bit interested in understanding the computer, the foundation of our modern information society, you should read this book.

My original review of the first edition, written June 28 2018:

Since I loved Charles Petzold's The Annotated Turing: A Guided Tour Through Alan Turing's Historic Paper on Computability and the Turing Machine, I wondered if he'd written other books about the foundations of computer science. Code seemed like an obvious candidate.

This book explains, in as much detail as you could possibly hope for, and then some, how a computer works.

Since I've been a professional software developer for about two decades, the title of the book, Code, gave me an impression that it'd mostly be about the logic of software - something that I already know intimately. The first chapters seemed to meet my expectations with their introductions to binary and hexadecimal numbers, Boolean logic, and the like.

Soon, though, I was pleasantly surprised that the book was teaching me something I didn't know: how a computer actually works. It starts by explaining how one can construct logic gates from relays, and then builds on those physical creations to explain how logic gates can be used to add numbers, how RAM works, and so on!

Like The Annotated Turing, this part of the book met me at exactly the level I needed. It was so technical, with no shortcuts and nothing swept under the rug, that I felt I deeply understood how things work, yet it was still thrilling and engaging. Whenever I found myself with a question like ...but how about..? or ...but what if..?, I soon learned to read on with confidence, because I consistently found answers to my questions two or three paragraphs further on.

The final part of the book, again, moves into territory that should be familiar to any programmer, such as ASCII, high-level programming languages, and graphical user interfaces, and that, unfortunately, bored me somewhat. Thus, the overall reading experience was uneven, which is why I only give it four stars.

Would someone who's not a professional programmer rate it higher? I don't know. I could imagine that for some, the explanation of logic gates, adders, latches, etc. made from relays would be too technical.
Alex Telfar
106 reviews, 90 followers
March 24, 2018
Very close to my ideal book. It starts from understandable foundations and builds from there. Charles doesn't try to explain through high-level metaphors (which do a poor job of capturing the truth -- I am frustrated after picking up another apparently interesting physics book only to find it contains no math); rather, he slowly builds on simple examples. And while it does get pretty complex, Charles doesn't avoid it!

For a while I have been frustrated about my understanding of computers. I understood how bits can encode information, what the von Neumann architecture was and some of its flaws, how programming languages are compiled to assembly/machine code, what transistors are and how to make logic circuits. But I could never really link them together. I am still a little hazy, and I think I will have to go over a couple of chapters from no. 17 onward (automation, buses, OS) just to cement and clarify things, but understanding feels close.

More thoughts to come on my blog. Just drafting atm.
Bozhidar
25 reviews, 79 followers
August 6, 2022
I first came across this book around 2005, while studying C# at university. Back then I was reading a recommended book on C# programming by Charles Petzold, which I liked, so I looked into other books written by him. "Code" immediately caught my attention, but for various reasons I didn't buy the book for 10 more years, and I didn't read it until this month.

Well, the long wait was certainly worth it as this was one of the best introductions to computing I've ever read! The author guides us through the process of building a simple computer from scratch in an extremely detailed, yet entertaining way, and we learn a lot about a lot of topics as we go along. That's probably one of the best texts on the history of computing as well, and modern electronics/communication tech in general.

So, why 4 stars instead of 5 then? Well, the book's quality varies from chapter to chapter, and I definitely didn't enjoy the final chapters as much as the earlier ones. Perhaps because those were the most outdated (e.g. the one on peripherals), or simply because that's the part of computing I didn't really need a refresher on (operating systems, programming languages, etc.). Still, I have to admit the book has aged very well (it was written way back in 1999!) and most of its content is pretty timeless.

Funny enough, Charles Petzold has recently announced a second edition of "Code" that's due any day now - mid-August 2022. Let's see if I'll manage to read it in less than 17 years this time around!
Jan Martinek
64 reviews, 32 followers
August 15, 2016

What a ride! A book about computers “without pictures of trains carrying a cargo of zeroes and ones” — the absolute no-nonsense book on the internals of the computer. From circuits with a battery, switch and bulb to logic gates to a thorough description of the Intel 8080. Great way to fill blanks in my computer knowledge.

The book takes the approach of constructing the computer “on paper and in our minds” — that's great when you're at least a little familiar with the topic, maybe not so much when trying to discover completely unknown territory (but the author goes to great lengths to walk through everything step by step — e.g. the various gates, binary subtraction, memory handling, etc.).

In a way, this is a perfect book on the topic. If you know a better one, I want to read it.

10 reviews
February 29, 2012
I have been an IT professional for 20 years, but I never knew what the switches on the front panel of the Altair computer were for. I do now.

In fact, because of this book, I know many things about how a computer really works that I never did before. I think this book is great for anyone, except electrical engineers, who would be bored. Having some background in computers probably makes this book easier to get through, but Petzold assumes nothing and starts from scratch. He does a good job of making potentially dry subjects fairly interesting.

I think an update to this book would be great, because the discussion of 1999 capacity and pricing makes the book feel dated. Also, the last chapter seemed rushed and not as well focused as the rest of the book.

So, if you want to know how any computer really works, read this book.
Laura Marelic
29 reviews, 35 followers
May 19, 2020
This book is the perfect depth for novices but also people who are “in tech” and don’t really understand how it all works (like me). I can now look around at all the electronics in my house and feel like I know what’s fundamentally going on. Knowledge is empowering! The last chapter of the book felt a bit rushed and ended abruptly, but maybe that’s just my wanting the book to go on longer/end at present day. Overall, I loved it and will surely be recommending it to anyone who asks how computers work. 👩🏻‍💻🤖👾

Oh, also I am simultaneously reading The Innovators (Isaacson) on audio and the two books pair very nicely. It was great to read about the tech in Code and then the story of who’s behind it in The Innovators. I recommend this pairing!
Carlos Martinez
364 reviews, 304 followers
March 6, 2019
Such a fun and interesting book. Petzold goes back to the very basics to explain how to build a computer (of sorts) from the ground up. First he explains binary (via Morse code and Braille), then he introduces relays and switches, then gates and Boolean logic, and before you know it you're building an electronic counting machine. He continues with a potted history of transistors, microchips, RAM, ROM, character encoding and all sorts of other fun stuff.

I skipped over some pages, because I don't actually need to know the full set of opcodes for a 1970s CPU, no matter how significant they are to computing history.

The only obvious 'flaw' is that the book has aged a bit. Written in 2000, it just about manages a mention of the internet, HTTP, TCP/IP and modems, but not Wi-Fi, cloud computing, touchscreen devices, or the brave new world of machine learning. Personally I don't think that detracts from the book at all - the really interesting stuff runs from around 1870 to 1970.

Definitely recommended for those that didn't study (or don't remember much) computer science.
Jule
86 reviews, 8 followers
August 17, 2009
I LOVE this book. I regard myself as an innocent computer illiterate, and Petzold helps me walk inside an electrical circuit, a telephone, a telegraph, an adding machine, a computer, and to understand the basics behind their design, of what is going on inside. I start getting the math, the logic behind all this technology that has become pretty much the center of my life today. And I should understand the logic behind the center of my life, right?

What is so good about this book: it is written in simple language anyone can understand. It uses examples that are entertaining and amusing, like explaining an electrical circuit with AND, OR, NOR and NAND gates to pick your favourite kitty from a bunch of neutered, unneutered, black, white, brown, tan, male and female cats in their various combinations. Also, he interlinks the historical evolution with the logic and development of the technology we use today, so you get a pretty rounded picture of the whole thing. Love it!
Baq
70 reviews
April 8, 2017
Wow. I wish I had had this book back when I was taking my first Computer Architecture course in college! It carries you along from the very fundamentals of both codes (like Braille) and electric circuits in the telegraph days all the way to the web, in a way that even a layperson could understand, with plenty of verbal and diagrammatic explanation. It does at points get pretty deep into the weeds, but I really appreciated the author's efforts to provide such an exhaustive dive into how computers work (and I regained much of my awe at these machines we take so for granted nowadays).

The final chapter was a rushed overview of the web and felt almost like an afterthought after the thoroughness of the rest of the book, but I didn't ding the author on it--there's plenty of great writing about how the web works that you can read elsewhere. Read this book to gain a deeper understanding and appreciation for the birth of the modern digital age. Thank you Charles Petzold!
Martin Lumiste
38 reviews, 11 followers
February 24, 2021
What an absolute gem I found in some Stack Overflow comments! Deriving modern computers from first principles such as Morse code and electromagnets, it's a rare book that can connect with you regardless of prior technical expertise.

Mostly recommended for software professionals looking to ground themselves in how it all started. This doesn't mean it's a history book: the insight into hardware and software components is as relevant today as it was in 2000.

Truly enjoyable read throughout.
9 reviews, 1 follower
June 15, 2008
In brief: be prepared to skim through at least 25% of this book! If I had this book in a seminar freshman year, I might have completed the Computer Science program. In a very fun manner, this book presents 3 years of introductory CS curricula: discrete structures, algorithms, logic gates, ... After reading this during two cross-country flights, I better understand (and remember) classes I took 10 years ago. Almost makes me want to try again (*almost*).
232 reviews, 2 followers
September 27, 2021
Starts out really well, then ends up a little in the weeds describing how to make a computer.
Jop Wolffs
40 reviews, 1 follower
November 1, 2022
(2022 version)

I wanted to love this book, and to some extent I did, but I also felt some frustrations that might be inherent to its goal.

This is an *extremely* accessible book on the basics of how computers work. From flashlight signals to Morse code, logic gates, relays, all the way up to the actual 'computing' circuits and RAM memory, it paints a complete picture requiring no prior knowledge whatsoever. The drawback is that many people are familiar with at least some of these concepts, and might as well skip the related chapters. The other problem is that, however you explain them, computer circuits are complicated. As the circuits get progressively closer to a (basic) computer, the torrent of circuit diagrams and how they work gets tedious and requires studious examination or a very good memory to fully understand. The author does an admirable job leading the reader, but at the end of the day this is the subject matter of textbooks, and probably will never fit neatly into informal reading. This is probably why, after describing a basic processor, the author starts glossing over details to introduce long-term storage, peripherals and the internet.

That said, those who don't care about or notice my issues, and those who read on regardless, will learn so much that they're practically capable of building a computer from scratch (albeit a computer from the 1980s). The writing is also light and clear throughout. In the end, I can only conclude that it is very good at what it set out to do.
Imi
378 reviews, 139 followers
September 14, 2019
This book has really taught me a lot, despite the fact that many of the later chapters lost me somewhat; it felt like it became much more complicated and hard to follow after the earlier chapters, which were great, slowly paced and well explained. While Petzold does assume the reader is starting from scratch, I think it would be easier to follow later on if you had some background in computers/technology. As it was, I had to bombard my dad (an electronic engineer) with questions to even make it to the end of some chapters, but then I haven't attended regular maths/science classes since about age 14, so maybe it's not surprising that I'm missing some of the needed background information.

It is outdated, having been written in 1999, but I guess the history, which Petzold follows nearly chronologically, hasn't changed, and the early history is necessary to understand what has come since this book was written. Having said that, the last chapter (on the 'graphical revolution') was strangely rushed and an updated edition would do it some good, I think.

Even if I couldn't grasp all of the technical detail, the majority of this book was extremely eye-opening and I have definitely come away from it with new found respect for these devices that we now use day-to-day. Even while using this laptop to complete a supposedly "simple" task such as writing this review, I am fascinated by how much work has gone on behind the scenes to allow me to do this. It's fairly awe-inspiring, the more you think about it.
174 reviews
August 1, 2010
This book basically tries to take you from the very basics of how to encode information, such as how binary is used to represent complex information, to understanding how a computer uses information like this to perform intricate operations. The route between those two points is the interesting part, and there were some parts that I found really illuminating and important. For example, I didn't understand hexadecimal numbers (or indeed what base 4, base 8, etc. numbers meant) before I read this book. Similarly, I knew a fair amount about how various logic gates work, but not how, by pairing multiple gates together, you eventually get to RAM, a CPU, etc.
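
A tiny illustration of that point about number bases (my example, not the book's): the same quantity written in bases 2, 8, 10 and 16 is still the same quantity; only the notation changes.

    n = 202                                   # one number, four notations
    print(bin(n), oct(n), n, hex(n))          # 0b11001010 0o312 202 0xca
    print(int("11001010", 2), int("ca", 16))  # both print 202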

It did lose me at times, however, and I zoned out a bit when Petzold was talking about the way in which math calculations are carried out using gates and binary information. I probably should have paid more attention, because this is fundamental to understanding how higher level systems work.

I really enjoyed the explanation of how certain chips were important, especially the 8080 and the 6800, and then the creation of assembly language and compilers. Most striking to me was the realisation that modern computing is essentially a brute-force operation. We are using the same switches that were invented 150 years ago or so, but now they are gigantically faster, smaller and used on an exponentially more massive scale.

Rick Sam
406 reviews, 124 followers
August 25, 2021
1. Why read this Work?

Let's see: you want to know how computers work. We are on our phones and laptops, yet we do not know how they work.


2. What is the key idea in this Work?

Abstraction, starting from Zero and One -- can you believe it?


3. What are my thoughts on this?

An excellent book to learn about bottom-up details of Computers.

It starts with two friends trying to communicate with each other, then moves through Morse code, Braille, flashlights... gates... assembly language, operating systems and finally the graphical revolution.

Now, in the 2020s, SaaS and web products are being built on top of all this -- incredible.

Overall, this book will help you understand computers from first principles.

What an impressive accomplishment and progress!

I would recommend this to anyone interested in Software, Hardware, Computer Science.


Deus Vult,
Gottfriend
Alb85
295 reviews, 9 followers
December 6, 2021
An excellent book with the ambitious goal of explaining how a computer is made, from A to Z. In the first half, specific concepts are explained in a detailed and clear way. In the second half of the book the complexity of the concepts increases, and it obviously becomes difficult to describe them completely in a few pages. The author does everything he can to ease the reader's understanding, for example by telling anecdotes about how the computer evolved, including plenty of images and tables, and enriching it all with good humor. A book very dense with concepts, and very, very well written.

INTERESTING POINTS:

On electricity:
- Lightning is a lot of electrons moving very quickly from one spot to another.
- In all batteries, chemical reactions take place, which means that some molecules break down into other molecules, or molecules combine to form new molecules. The chemicals in batteries are chosen so that the reactions between them generate spare electrons on the side of the battery marked with a minus sign (called the negative terminal, or anode) and demand extra electrons on the other side of the battery (the positive terminal, or cathode). In this way, chemical energy is converted to electrical energy.
The reactions take place only if an electrical circuit is present to take electrons away from the negative side and supply electrons to the positive side. The electrons travel around this circuit in a counterclockwise direction
- But why do we need the wires? Can't the electricity just flow through the air? Well, yes and no. Yes, electricity can flow through air (particularly wet air), or else we wouldn't see lightning. But electricity doesn't flow through air very readily.
- An atom that has just one electron in its outer shell can readily give up that electron, which is what's necessary to carry electricity. These substances are conducive to carrying electricity and thus are said to be conductors. The best conductors are copper, silver, and gold. It's no coincidence that these three elements are found in the same column of the periodic table. Copper is the most common substance for making wires. The opposite of conductance is resistance. Some substances are more resistant to the passage of electricity than others, and these are known as resistors. If a substance has a very high resistance—meaning that it doesn't conduct electricity much at all—it's known as an insulator. Rubber and plastic are good insulators, which is why these substances are often used to coat wires. Cloth and wood are also good insulators as is dry air. Just about anything will conduct electricity, however, if the voltage is high enough.
Copper has a very low resistance, but it still has some resistance. The longer a wire, the higher the resistance it has. If you tried wiring a flashlight with wires that were miles long, the resistance in the wires would be so high that the flashlight wouldn't work.
The thicker a wire, the lower the resistance it has. This may be somewhat counterintuitive. You might imagine that a thick wire requires much more electricity to "fill it up." But actually the thickness of the wire makes available many more electrons to move through the wire.
- Voltage refers to a potential for doing work. Voltage exists whether or not something is hooked up to a battery. A much easier concept in electricity is the notion of current. Current is related to the number of electrons actually zipping around the circuit.
- The water-and-pipes analogy helps out here: Current is similar to the amount of water flowing through a pipe. Voltage is similar to the water pressure. Resistance is similar to the width of a pipe—the smaller the pipe, the larger the resistance. So the more water pressure you have, the more water that flows through the pipe. The smaller the pipe, the less water that flows through it. The amount of water flowing through a pipe (the current) is directly proportional to the water pressure (the voltage) and inversely proportional to the skinniness of the pipe (the resistance).
- In electricity, you can calculate how much current is flowing through a circuit if you know the voltage and the resistance. Resistance—the tendency of a substance to impede the flow of electrons—is measured in ohms, named after Georg Simon Ohm, who also proposed the famous Ohm's Law. The law states I = E / R, where I is traditionally used to represent current in amperes, E is used to represent voltage, and R is resistance.
- let's look at a battery that's just sitting around not connected to anything. The voltage E is 1.5. But because the positive and negative terminals are connected solely by air, the current is just about zero. Now let's connect the positive and negative terminals with a short piece of copper wire. Lots and lots of electrons will be flowing through the wire. In reality, the actual current will be limited by the physical size of the battery. The battery will probably not be able to deliver such a high current, and the voltage will drop below 1.5 volts. If the battery is big enough, the wire will get hot because the electrical energy is being converted to heat. If the wire gets very hot, it will actually glow and might even melt. If a wire has a low resistance, it can get hot and start to glow. This is how an incandescent lightbulb works. Inside a lightbulb is a thin wire called a filament, which is commonly made of tungsten. One end of the filament is connected to the tip at the bottom of the base; the other end of the filament is connected to the side of the metal base, separated from the tip by an insulator. The resistance of the wire causes it to heat up. In open air, the tungsten would get hot enough to burn, but in the vacuum of the lightbulb, the tungsten glows and gives off light. Most common flashlights have two batteries connected in series. The total voltage is 3.0 volts. A lightbulb of the type commonly used in a flashlight has a resistance of about 4 ohms. Thus, the current is 3 volts divided by 4 ohms, or 0.75 ampere, which can also be expressed as 750 milliamperes. This means that 4,680,000,000,000,000,000 electrons are flowing through the lightbulb every second.
- The watt is a measurement of power (P) and can be calculated as P = E x I
- One thing we learned about conductors is this: The larger the better. A very thick wire conducts much better than a very thin wire. That's where the earth excels. It's really, really, really big.
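
The flashlight arithmetic in the excerpts above is easy to check in a few lines of Python; a minimal sketch (mine, not the book's) using Ohm's Law, I = E / R, and the power relation P = E x I, with an approximate electrons-per-coulomb constant:

    E = 3.0                     # two 1.5-volt batteries in series (volts)
    R = 4.0                     # flashlight bulb resistance (ohms)
    I = E / R                   # Ohm's Law: current in amperes
    P = E * I                   # power dissipated in the bulb, in watts
    ELECTRONS_PER_COULOMB = 6.24e18          # approximate value
    print(I, "A =", I * 1000, "mA")          # 0.75 A = 750 mA
    print(P, "W")                            # 2.25 W
    print(I * ELECTRONS_PER_COULOMB)         # ~4.68e18 electrons per second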

On binary numbers:
- The binary number system bridges the gap between arithmetic and electricity. In previous chapters, we've been looking at switches and wires and lightbulbs and relays, and any of these objects can represent the binary digits 0 and 1
- The sum of two binary numbers is given by the output of an XOR gate, and the carry bit is given by the output of an AND gate.
- When used in computers, transistors basically function the same way relays do, but (as we'll see) they're much faster and much smaller and much quieter and use much less power and are much cheaper.

On oscillators:
- The frequency of the oscillator is 1 divided by the period. In this example, if the period of the oscillator is 0.05 second, the frequency of the oscillator is 1 ÷ 0.05, or 20 cycles per second. Twenty times per second, the output of the oscillator changes and changes back.
Cycles per second is a fairly self-explanatory term, much like miles per hour or pounds per square inch or calories per serving. But cycles per second isn't used much any more. In commemoration of Heinrich Rudolph Hertz (1857–1894), who was the first person to transmit and receive radio waves, the word hertz is now used instead.
- A flip-flop circuit retains information. It "remembers." In particular, the flip-flop shown previously remembers which switch was most recently closed. If you happen to come upon such a flip-flop in your travels and you see that the light is on, you can surmise that it was the upper switch that was most recently closed; if the light is off, the lower switch was most recently closed. Although it might not be apparent yet, flip-flops are essential tools. They add memory to a circuit to give it a history of what's gone on before. Imagine trying to count if you couldn't remember anything.

On bytes:
- The word byte originated at IBM, probably around 1956. The word had its origins in the word bite but was spelled with a y so that nobody would mistake the word for bit. For a while, a byte meant simply the number of bits in a particular data path. But by the mid-1960s, in connection with the development of IBM's System/360 (their large complex of business computers), the word came to mean a group of 8 bits.
- It turns out that 8 is, indeed, a nice bite size of bits. The byte is right, in more ways than one. One reason that IBM gravitated toward 8-bit bytes was the ease in storing numbers in a format known as BCD (which I'll describe in Chapter 23). But as we'll see in the chapters ahead, quite by coincidence a byte is ideal for storing text because most written languages around the world (with the exception of the ideographs used in Chinese, Japanese, and Korean) can be represented with fewer than 256 characters. A byte is also ideal for representing gray shades in black-and-white photographs because the human eye can differentiate approximately 256 shades of gray. And where 1 byte is inadequate (for representing, for example, the aforementioned ideographs of Chinese, Japanese, and Korean), 2 bytes—which allow the representation of 2^16, or 65,536, things—usually works just fine.

On mathematics:
The Scottish mathematician John Napier (1550–1617) ... invented logarithms for the specific purpose of simplifying these operations. The product of two numbers is simply the sum of their logarithms. So if you need to multiply two numbers, you look them up in a table of logarithms, add the numbers from the table, and then use the table in reverse to find the actual product.
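
Napier's trick in the quote above is easy to demonstrate (a quick sketch of the idea, not the book's code): because the logarithm of a product is the sum of the logarithms, a multiplication becomes a lookup, an addition and a reverse lookup.

    import math

    a, b = 37.0, 52.0
    log_sum = math.log10(a) + math.log10(b)   # "add the numbers from the table"
    product = 10 ** log_sum                   # "use the table in reverse"
    print(product, a * b)                     # ~1924.0 and 1924.0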

On computers:
- But beginning in the early 1940s, vacuum tubes began supplanting relays in new computers. By 1945, the transition was complete. While relay machines were known as electromechanical computers, vacuum tubes were the basis of the first electronic computers.
- It wasn't until the mid-1950s that magnetic core memory was developed. Such memory consisted of large arrays of little magnetized metal rings strung with wires. Each little ring could store a bit of information.
- The transistor inaugurated solid-state electronics, which means that transistors don't require vacuums and are built from solids, specifically semiconductors and most commonly (these days) silicon. Besides being much smaller than vacuum tubes, transistors require much less power, generate much less heat, and last longer.
- Vacuum tubes were originally developed for amplification, but they could also be used for switches in logic gates. The same goes for the transistor.
- Transistors certainly make computers more reliable, smaller, and less power hungry. But do transistors make computers any simpler to construct?
Not really. The transistor lets you fit more logic gates in a smaller space, of course, but you still have to worry about all the interconnections of these components. It's just as difficult wiring transistors to make logic gates as it is wiring relays and vacuum tubes.

On monitors:
- Around 1889, when Edison and his engineer William Kennedy Laurie Dickson were working on the Kinetograph motion picture camera and the Kinetoscope projector, they decided to make the motion picture image one-third wider than it was high. The ratio of the width of the image to its height is called the aspect ratio. The ratio that Edison and Dickson established is commonly expressed as 1.33 to 1, or 1.33:1, or, to avoid fractions, 4:3.

On memory:
- The most obvious difference between memory and storage is that memory is volatile; it loses its contents when the power is shut off. Storage is non-volatile; data stays on the floppy disk or hard disk until it's deliberately erased or written over. Yet there's another significant difference that you can appreciate only by understanding what a microprocessor does. When the microprocessor outputs an address signal, it's always addressing memory, not storage.

On numbers:
- Beyond whole numbers, mathematicians also define rational numbers as those numbers that can be represented as a ratio of two whole numbers. This ratio is also referred to as a fraction.
- Irrational numbers are monsters such as the square root of 2. This number can't be expressed as the ratio of two integers
- If a number is not a solution of any algebraic equation with whole number coefficients, it's called a transcendental. (All transcendental numbers are irrational, but not all irrational numbers are transcendental.)
- This type of storage and notation is also called fixed-point format because the decimal point is always fixed at a particular number of places—in our example, at two decimal places. Notice that there's nothing actually stored along with the number that indicates the position of the decimal point. Programs that work with numbers in fixed-point format must know where the decimal point is.
- Scientific notation is particularly useful for representing very large and very small numbers because it incorporates a power of ten that allows us to avoid writing out long strings of zeros.
- In computers, the alternative to fixed-point notation is called floating-point notation, and the floating-point format is ideal for storing small and large numbers because it's based on scientific notation.
- In decimal scientific notation, the normalized significand should be greater than or equal to 1 but less than 10. Similarly, the normalized significand of numbers in binary scientific notation is always greater than or equal to 1 but less than binary 10, which is 2 in decimal.
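
The normalized-significand idea in the last two excerpts can be poked at directly in Python (my example, not the book's): float.hex shows a number in binary scientific notation, with a significand at or above 1 but below 2 and a power-of-two exponent.

    x = 6.5                      # 1.625 x 2^2 in binary scientific notation
    print(x.hex())               # 0x1.a000000000000p+2  (0x1.a is 1.625)
    significand = 0x1A / 16      # 1.625: >= 1 and < 2, as the excerpt says
    print(significand * 2 ** 2)  # 6.5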

On programs:
- it's not enough to simply define a high-level language (which involves developing a syntax to express all the things you want to do with the language); you must also write a compiler, which is the program that converts the statements of your high-level language to machine code. Like an assembler, a compiler must read through a source-code file character by character and break it down into short words and symbols and numbers. A compiler, however, is much more complex than an assembler. An assembler is simplified somewhat because of the one-to-one correspondence between assembly-language statements and machine code. A compiler usually must translate a single statement of a high-level language into many machine-code instructions. Compilers aren't easy to write.
- Many subsequent implementations of BASIC have been in the form of interpreters rather than compilers. As I explained earlier, a compiler reads a source-code file and creates an executable file. An interpreter, however, reads source code and executes it directly as it's reading it without creating an executable file. Interpreters are easier to write than compilers, but the execution time of the interpreted program tends to be slower than that of a compiled program

On the internet:
- The telephone system is built to transmit sound, not bits, over wires. Sending bits over telephone wires requires that the bits be converted to sound and then back again. A continuous sound wave of a single frequency and a single amplitude (called a carrier) doesn't convey any substantial information at all. But change something about that sound wave—in other words, modulate that sound wave between two different states—and you can represent 0s and 1s. The conversion between bits and sound occurs in a device called the modem (which stands for modulator/demodulator). The modem is a form of serial interface because the individual bits in a byte are sent one after another rather than all at once.
- While much of this book has focused on using electricity to send signals and information through a wire, a more efficient medium is light transmitted through optical fiber—thin tubes made of glass or polymer that guide the light around corners. Light passing through such optical fibers can achieve data transmission rates in the gigahertz region—some billion of bits per second.
Kenny Kidd
168 reviews, 3 followers
January 29, 2023
Some of THE most fun I’ve had learning about something in a very, very long time; there was a period of time where I was super curious about, simply, HOW computers work (like how does pressing a bunch of keys pull up an entire movie on a small screen that fits in your lap with nothing more than, like, electric signals? Like how is that not magic?), and this clears up the fundamental processes of computing PERFECTLY.

More impressively, it's 380 pages long and not a single paragraph feels superfluous or unnecessary (it turns out computers are immensely complicated and have a very rich history 🤷‍♂️ Who'd a thunk it)! And in the process of explicating the historical development/operations of a computer up until about 1999, this book goes through:

• A brief history of codes/language/efficient communication, teaching you Morse and Braille if you're interested.
• The invention of electric circuits and telegraphs (which it turns out is essentially what a computer IS).
• Binary and hexadecimal code.
• The logic gates that you can construct from electric circuits, how they relate to Boolean algebra, and how practically these can be used for operations in binary code.
• How a computer is, at its core, a calculator with the ability to perform conditional jump operations.
• How to construct RAM, how technology developed from electric circuits to silicon chips to rapidly speed up computing, and how television technology was implemented to make computers way easier to access.
• What an OS is, how computers' microprocessors vary, how binary code can produce text and images on screen (and how much data these images, especially in color, take to make), the purposes of different computer languages and their functions.
• And this is just a rough recap of some of the more memorable stuff from looking through the table of contents again 🤷‍♂️

By the end of this book, it felt like I had taken a two-semester course on computer hardware and software, but taught by someone with a true love for the magic and ingenuity of computing :) A thrilling read, even as the chapters by the end get incredibly, rivetingly complex (the content builds on itself to the point where it feels like a miracle that the last few chapters are even comprehensible, due to how tech-heavy and intricately layered the topics/concepts have gotten by then).

SO fun, loved it, computers are awesome 😌
Bay Gross
94 reviews, 13 followers
May 9, 2022
4 stars.

A great read if society collapses and you need to accelerate the rebuild of modern computing from scratch! Sort of like A Connecticut Yankee in King Arthur's Court, but for IBM.

Petzold goes all the way back to Morse code and telegraph relays in order to methodically build up a step-by-step mental model of modern-day computing.

For techies: this is a fun and fast read that will “fill in a lot of gaps” and give you a clean narrative from atoms to the world of bits. It goes a few abstraction layers deeper than the typical university computer science curriculum, which tends to pick up with von Neumann architectures but glosses over the century of transistors and relays that got us there. (That's how I first found it!)

For non-techies: this is a fun but challenging read that will make you more conversant in the foundations of computing and how the 20th century came to be. It has a lot more rigor than your typical pop-sci book, but it's rewarding to get through.


Narrative arc:
He walks us through the parallel advancements in hardware:
- gates
- relays (esp: telegraph relays)
- vacuum tubes
- transistors (the most important invention of the 20th century)
- microprocessors

and software:
- Morse code
- binary
- Boolean logic
- Turing machines
- von Neumann architectures
- operating systems
- higher-level languages

Along the way you conceptually learn how to build a von Neumann machine from scratch, including deep technical detail on how to build binary adders and memory registers, etc.
Tomas
48 reviews, 4 followers
January 9, 2022
Basically a better version of my computer architecture class from undergrad… extremely thorough, easy to understand, and a great resource for everyone interested, imo. It's never exactly a breeze to read through a technical book, but this was still pretty good. Learned halfway through that it came out in 2000-ish, so it was also a cool artifact to see what the author mentioned about modern computing and predicted for the future. Additionally learned that the author wrote the OG Programming Windows book and is a total legend. Very cool.
Polosanya
3 reviews
January 12, 2021
Very accessible and interesting for those who haven't watched Fixiki but want to know how a computer works and what programming is 👍
Grace Tueller
18 reviews, 5 followers
March 21, 2022
Took me literal months and there were entire chapters that were nearly incomprehensible to me...but I finished!!!