The Chip: How Two Americans Invented the Microchip and Launched a Revolution

Barely fifty years ago a computer was a gargantuan, vastly expensive thing that only a handful of scientists had ever seen. The world’s brightest engineers were stymied in their quest to make these machines small and affordable until the solution finally came from two ingenious young Americans. Jack Kilby and Robert Noyce hit upon the stunning discovery that would make possible the silicon microchip, a work that would ultimately earn Kilby the Nobel Prize for physics in 2000. In this completely revised and updated edition of The Chip, T.R. Reid tells the gripping adventure story of their invention and of its growth into a global information industry. This is the story of how the digital age began.

336 pages, Paperback

First published January 1, 1984


About the author

T.R. Reid

12 books · 90 followers
T.R. Reid is a reporter, documentary film correspondent and author. He is also a frequent guest on NPR's Morning Edition. Through his reporting for The Washington Post, his syndicated weekly column, and his light-hearted commentary from around the world for National Public Radio, he has become one of America’s best-known foreign correspondents.

Reid, a Classics major at Princeton University, served as a naval officer, taught, and held various positions before working for The Washington Post. At the Post he covered Congress and four presidential election campaigns, and was chief of the Post's London and Tokyo bureaus. He has also taught at Princeton University and the University of Michigan. His experiences in Japan led him to write Confucius Lives Next Door: What Living in the East Teaches Us About Living in the West, which argued that the Confucian values of family devotion, education, and long-term relationships that still permeate East Asian societies contributed to their social stability.

He is now the Post's Rocky Mountain Bureau Chief. A 2007 Kaiser Family Foundation media fellow in health, he is a member of the board of the Colorado Coalition for the Homeless and the University of Colorado Medical School.

Ratings & Reviews


Community Reviews

5 stars: 375 (46%)
4 stars: 294 (36%)
3 stars: 112 (13%)
2 stars: 19 (2%)
1 star: 3 (<1%)
Displaying 1 - 30 of 87 reviews
Eric_W
1,932 reviews · 388 followers
April 8, 2010
Technophobes might as well move on to the next review. I loved this book. It explained in clear, precise language how innumerable barriers were overcome by innovative and insightfully brilliant individuals to create a device that revolutionized our lives. I've always been fascinated by electronics, built my own radios and earned an amateur radio license in 7th grade, just because the subject and theory of how electrons move around to perform useful functions is intriguing. Reid has captured much of that fascination and translated it into a great story.

Before integrated circuits could be produced, the transistor had to be invented. Before that time, switching mechanisms required a vacuum tube to control, amplify, and switch the flow of electrons through a circuit. It was the discovery that some semiconductor materials could be doped to have an excess of positive or negative charges that provided the breakthrough. A strip of germanium could be doped at each end with differing charges, leaving a junction in the middle. The junction worked like a turnstile that could control the flow of current when connected to a battery. Variations in current across these junctions, connected in the transistor formation, could rectify (allow current to flow in only one direction) and amplify. That's all that's needed to make a radio (I'm oversimplifying, obviously) and hundreds of other devices. Transistors required vastly less current than vacuum tubes, were almost infinitely stable, were cheap, and gave off little heat.
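
That turnstile behaviour is easy to sketch in code. A minimal illustration (assumed values, nothing from the book): an idealized P-N junction modeled as a diode that passes current in only one direction, so an alternating input comes out half-wave rectified.

import math

def ideal_diode(current: float) -> float:
    """Pass forward (positive) current, block reverse (negative) current."""
    return current if current > 0 else 0.0

# One cycle of an alternating input current, sampled at 12 points.
samples = [math.sin(2 * math.pi * k / 12) for k in range(12)]
rectified = [ideal_diode(i) for i in samples]

for raw, out in zip(samples, rectified):
    print(f"in: {raw:+.2f}  out: {out:+.2f}")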

But transistors required thousands of wire connections in order to make a useful circuit, and as demands for more complex circuitry arose, the wiring became infinitely complex. This interconnection problem became a huge barrier that could have prevented effective use of the transistor's advantages.

"You read everything. . . You accumulate all this trivia, and you hope that someday maybe a millionth of it will be useful," remembers Jack Kilby, one of the inventors of the integrated circuit. He also insists that he is not a scientist but an engineer. "A scientist is motivated by knowledge; he basically wants to explain something. An engineer's drive is to solve problems, to make something work. . . . Reid has elegantly interwoven the biographies of Jack Kilby and Robert Noyce. One of the delights of the book was learning how the two inventors thought, how they proceeded, and why they went in the directions they did.

Robert Noyce, later a cofounder of Intel, had developed a process to make transistors in arrays on a silicon wafer. They cut apart the transistors and then hired "thousands of women with tweezers to pick them up and try to wire them together. It just seemed so stupid." He, too, recognized the tyranny of interconnection numbers. What both men came up with was the "Monolithic Idea": the notion that an entire circuit could be designed and produced on a single silicon chip.

Obviously, there is little suspense in the story, but Reid captures and holds our attention. Both men accomplished the same feat at about the same time, approaching it from different directions. Kilby showed how the transistors could be placed on a single wafer and Noyce showed how the chips and circuits could be manufactured. Every transistor radio used the patent Kilby was awarded for his work. In so doing, he turned the future that Orwell had predicted in 1984 on its head. Instead of a monolithic centralization of power in the hands of a few computer elite who controlled all the computing power, "the mass distribution of microelectronics had spawned a massive decentralization of computing power. In the real 1984, millions of ordinary people could match the governmental or corporate computer bit for bit. In the real 1984, the stereotypical computer user had become a Little Brother seated at the keyboard to write his seventh-grade science report."

The social impact was enormous. Slide rules that had been ubiquitous were completely eliminated in just a few years by the handheld calculator, which has become so cheap it is often given away in promotions. The Japanese gained virtual control over the memory chip industry because of the way they handled their work force. Americans had a monopoly until the 1973 recession. American companies typically lay off workers to save money during downturns. The Japanese try to keep their work force employed. This meant that when the demand for chips exploded, Americans did not have the capacity to produce enough to meet the demand. The Japanese, having trained workers available, met that demand and were able to produce at such a volume as to keep the price low enough to inhibit any competition. That and their emphasis on high quality gained them 42% of the world market by 1980. The "Anderson Bombshell" report of 1980 (Anderson was a manager at Hewlett-Packard), which showed that Japanese chips were far more reliable than those made in the United States, helped seal their market share.

It took Kilby's Nobel Prize for the two inventors to be widely recognized in the United States (Japan, a nation that honors its engineers, had awarded Noyce and Kilby numerous accolades over the years). The final irony remains that "our media-soaked society, with its insatiable appetite for important, or at least interesting, personalities, has somehow managed to overlook a pair of genuine national heroes - two Americans who had a good idea that has improved the daily lot of the world."
Thomas Dietert
27 reviews · 8 followers
April 12, 2020
From start to finish, "The Chip" was a markedly insightful, thorough, broad-sweeping, and satisfying account of the inception of the microprocessor revolution. The author -- T.R. Reid, a journalist and technical writer -- provides an intriguing account of the seminal steps taken by the technology industry of the mid-20th century that brought humanity from relying on vacuum tubes as the core component of electronics (radios, computers, etc.) to semiconductor "chips" composed of billions of transistors that govern our daily habits and way of life far more than we might realize. The book's primary focus is on the solid-state physics and engineering advancements that led to the solution of the Tyranny of Numbers problem faced by engineers trying to build more complex circuits out of the revolutionary new electronic component, the transistor. Two engineers, Jack Kilby and Robert Noyce, astoundingly conceived of a solution to the problem within six months of one another, approaching it from two different perspectives and almost simultaneously settling on "The Monolithic Idea".

All throughout, Reid does not hesitate to provide the reader with sufficient descriptions of the underlying technologies at each stage of humanity's technological progression towards microchips as they exist today. Perhaps not for the technology averse reader, this book is a wonderful chronicle of potentially the most important series of scientific and engineering advancements of the 20th century-- Advancements that led to the ubiquity of such devices as the smart phone, the personal computer, and all of micro-electronics that now permeate the fabric of society in the 21st century.

It all started with the vacuum tube, and De Forest's insight that the "Edison Effect" (i.e., thermionic emission, the transmission of electric current through a vacuum from filament to conductor) could be leveraged so that an alternating current through a filament could induce a direct current in a nearby metal plate. With the insertion of a positively or negatively charged "grid" (or "screen") between the filament and the metal plate, De Forest showed that small variances in the charge applied to the grid could produce large variances in the current induced in the metal plate. Thus the first vacuum tube amplifier was born, laying the foundations for the radio sets and televisions that proliferated through American living rooms during the early and mid 20th century. Perhaps an equally important discovery was the fact that the current induced in the metal plate could be "switched" on or off thousands of times per second; this property of vacuum tubes provided the foundation for building the world's first computers (such as the ENIAC). Albeit state of the art at the time, the rather crude technology had many drawbacks: the large, power-hungry vacuum tubes had filaments that frequently burned out and generated a large amount of residual heat, making maintenance of contraptions based on the technology quite unwieldy.

Luckily for the world, the transistor was invented in 1947 at Bell Labs by William Shockley and his colleagues John Bardeen and Walter Brattain, using semiconductor material (germanium at first, with silicon to follow) to provide an electronic switching and amplifying device that improved upon the vacuum tube in virtually every way: it generated orders of magnitude less heat, provided the same current amplification, could switch on and off much more quickly (billions of times per second!), and was the size of a pencil eraser (orders of magnitude smaller than existing vacuum tubes!). Subsequently, construction of far more complex circuits was enabled, and within a few years of the invention of the transistor, the industry had all but abandoned the vacuum tubes that had so recently been the foundation of electronics. It seemed like the industry had found its way out of its technological rut, and for the next five years or so it made quick strides in the miniaturization of most existing electronic circuit components. However, as much as the transistor served as the critical component of these smaller, more energy-efficient electronic circuits, there was another problem electronics engineers would come to face just a few years after the semiconductor transistor became ubiquitous. Pleasingly, the author goes into sufficient detail regarding the solid-state physics that begot the silicon transistor; I greatly appreciated the explanations of the utility of semiconductors in conjunction with the boron/arsenic doping process that allows for such current switching and electronic signal amplification.

As engineers attempted to build the complex circuits they could conceive, they quickly ran into the seemingly insurmountable problem of manually wiring together the electronic components (transistors, resistors, capacitors, etc.) of their increasingly complex circuit designs. Although the transistor theoretically permitted more complex and reliable circuit construction, the way in which the components had to be tediously wired together (by hand!) brought the industry advancements that fell out of the invention of the transistor to a standstill. In the late 1950s, a solution to the Tyranny of Numbers was the prime directive of many great minds (and electronics companies), and whoever found such a solution would reap the benefits of a several-year head start on the production of circuits with the degree of complexity the industry desired. As it turned out, two engineers, Jack Kilby at Texas Instruments (TI) and Robert Noyce of Fairchild Semiconductor, almost simultaneously happened upon an ingenious solution (in fact, the same solution) to the numerical tyranny that plagued the industry: The Monolithic Idea.

In 1958, both Jack Kilby and Robert Noyce conceived of a solution to the Tyranny of Numbers: if transistors could be made of silicon, then why couldn't other circuit components? Kilby and Noyce both had the idea that resistors, capacitors, and other critical electronic circuit components could be fabricated on silicon wafers too. Without going into the solid-state physics that made such an idea possible, the two engineers arrived at the solution from different perspectives. Kilby concocted his solution in an ad-hoc, piecemeal fashion, imprecisely (but functionally!) building a complete circuit into a single slab of semiconductor with "flying wires" coming out at all angles. Conversely, Noyce approached it from the perspective of manufacturing tons of transistors on a single silicon wafer (the silicon transistor "fab" process): if a thin layer of silicon dioxide was laid across the top of the N/P-doped silicon "three-layer cake," one could insert wires through the top layer to connect transistors composed of the underlying layers to other transistors in the same wafer; furthermore, instead of using external wires as connections, one could print thin metal connections between the components atop the oxide layer as part of the manufacturing process! Though both engineers had the insight to construct the other crucial circuit components out of semiconductor material, Kilby beat Noyce to the punch by about six months; however, Noyce conceived of the variant of the idea that integrated seamlessly into the existing silicon wafer and transistor manufacturing processes. Thus, the following set of chapters, which cover the resulting legal battle over patent awards between TI and Fairchild, speedy advancements in the production of micro-electronics, the influence such a discovery had on the notion of personal computing, and intriguing stories such as Kilby's invention of the pocket calculator at TI (in the mid-1960s), are all deeply intriguing and captivating.

In the latter part of the book, the author takes the liberty to diverge from the purely technological account of history and begins to frame the microprocessor revolution within the context of broader societal issues. For instance, one of the core reasons why micro-electronics development and production took off as quickly as it did was the US government's interest in putting a man on the moon; as a result, the US government poured hundreds of millions of dollars into the technology companies that were leading the charge on silicon chip fabrication processes, which can thus be considered one of the core reasons such rapid engineering progress was made. Arguably, without the USA's interest in shrinking computers such that one could be launched into orbit, with the computational resources (increasing with the shrinking size of the transistor) to guide such a mission, the pervasiveness of micro-electronics could be decades behind where it is today. Reid also goes on a brief tangent into the international competition of the electronics markets, focusing mostly on the effective production and market tactics of Japanese electronics manufacturers and the engineering difficulties inherent in the production of micro-electronics. Though this topic was not the reason I picked up the book, these chapters covering Japan's repeated successful entries into the international micro-electronics markets (starting with radio and television) and seemingly inevitable dominance captured my attention as well as the rest of the book; I found that Reid's coupling of the aforementioned advancements in microchip engineering and manufacturing with characterizations of the international political and economic dynamics of the time allowed me to conceive of such a technological and cultural revolution more clearly.

In my opinion, "The Chip" is a remarkable account of a series of seminal inventions and advancements in electronics, the star of which is the silicon microchip. With digressions into other foundational concepts that paved the way for the micro-electronics revolution -- such as Shannon's insight into using Boolean logic to construct circuits that execute computations -- Reid presents an eloquently candid and entertaining account of the most relevant scientific and engineering discoveries with sufficient technical depth, keeping the reader stimulated and engaged. Complete with comprehensive explanations of the underlying physics, engineering, computational, and manufacturing concepts and processes that are necessary to viscerally grok the magnitude of the creativity of the ground-breaking discoveries that led to micro-electronics' penetration into modern society, Reid characterizes the very life-blood of our contemporary society: there is virtually no aspect of modern society that the microprocessor revolution hasn't significantly altered and/or become the foundation of since its inception. If there's a book out there that presents the incredible story of the microprocessor revolution of the mid 20th century in such breadth and depth without sacrificing readability, I'd be astounded.
Michael Burnam-Fink
1,546 reviews · 249 followers
January 9, 2022
The Chip is a humanistic look at one of the key inventions of the 20th century, the microchip which undergirds every digital change to our life. Thanks to chips, "just put a computer in it" has been a solution to almost every engineering problem, and the cause of a similar number of engineering problems.

In the 1950s, the electronics industry was carrying a blade with no handle. The silicon transistor had opened up vast possibilities by replacing large, power-hungry, and unreliable vacuum tubes. But the new solid state circuits were still built the same way, by wiring together discrete components like resistors, capacitors, and transistors, and the labor cost of hand wiring all these components was stalling future growth. Worse, as the complexity of circuits increased, their reliability went way down, a fatal flaw for aerospace and military applications.
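
A rough way to see why reliability collapsed, as a back-of-the-envelope sketch with an assumed per-joint success rate (no figure from the book): if every hand-soldered joint must work for the circuit to work, the odds multiply and fall off fast.

def circuit_reliability(p_per_joint: float, n_joints: int) -> float:
    """Probability the whole circuit works when every joint must work independently."""
    return p_per_joint ** n_joints

for n in (100, 10_000, 100_000):
    works = circuit_reliability(0.9999, n)
    print(f"{n:>7} joints at 99.99% each -> {works:.4%} chance the circuit works")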

Kilby at Texas Instruments and Noyce at Fairchild Semiconductor hit on the key idea at roughly the same time. If you could lay down resistors, capacitors, and wires inside silicon, you could make a circuit as a monolithic unit. Kilby was first by several months, but Noyce figured out how to get the leads between the chip and the world laid down, which is a very important step. Doing everything in silicon is counter-intuitive; by raw materials it's comparable to building a boxcar out of solid gold, but the advantage of not having to wire together components is incredible. Cue the digital revolution that we know, though from the perspective of decades on, the revolution was slower than we remember. The first few years of production went entirely to the military. The consumer product which blew the world open was the pocket calculator, which came out in 1971, some 13 years after the invention of the chip.

Reid follows the rise of Japanese firms in high tech, as well as the divergent careers of Noyce and Kilby. Noyce went on to become the patriarch of Silicon Valley and a billionaire investor. Kilby kept inventing, though never with the same success. He was finally awarded the Nobel Prize in 2000, but neither of the two are household names despite their impact as inventors.

Reid also makes some odd choices in the technical explanations. There's a lot on Boolean algebra and binary logic, which is key to how chips work, and precisely nothing on photolithography, which is key to how they're made. This is an older book, which is beneficial because there's nothing like interviews with your subjects to get the right feeling across, and Noyce and Kilby are no longer available for interviews.
Christopher Litsinger
747 reviews · 8 followers
March 7, 2013
Can you name the inventors of the microprocessor? I couldn't, in spite of the fact that I have a career that wouldn't even exist without the invention. So because of that, I'm glad I read this book, which focuses on the inventors (Jack Kilby and Robert Noyce fwiw).
However, the book is frustrating in a lot of ways. It is neither a biography of the two inventors nor a technical text, but it sort of attempts to be both. There's a chapter explaining how microprocessors work at a fairly technical level - a chapter that is probably tedious for anyone with a basic understanding of this (it was for me) and completely useless for someone who isn't grounded in the concepts. If you really want that, check out Code: The Hidden Language of Computer Hardware and Software. There's a chapter that talks about how Japanese manufacturing was able to supplant US manufacturing in other areas.
As you might sense, this is not a terribly focused book.
Here's a passage I did enjoy quite a bit:
In a sense, this distinction between basic and directed research encompasses the difference between science and engineering. Scientists, on the whole, are driven by the thirst for knowledge; their motivation, as the Nobel laureate Richard Feynman put it, is “the joy of finding things out.” Engineers, in contrast, are solution-driven. Their joy is making things work.
191 reviews · 7 followers
September 21, 2023
** A history of the invention of the integrated circuit by Jack Kilby and Robert Noyce, and the resulting rise of the US Semiconductor industry. **

Flashback to 2001, Austin: Shortly after Jack Kilby won the Nobel Prize in Physics, I saw that Kilby was giving a speech in town. As I was working at the Motorola Semiconductor division at the time, I was particularly excited, though none of my co-workers knew or cared who Kilby was. I arrived at the auditorium a bit early, and went and took a seat behind a couple of old ladies.

A moment later an amazingly tall, old, bald man walked along the row in front of me and started chatting with the ladies, whom he obviously knew. Not hard to recognize that the man was Jack Kilby himself, and here he was 4 feet away from me! I couldn't believe it. I also had no idea he was so tall -- since he looked in photos like my high school math teacher, I had conflated Kilby with my wiry little old teacher. Yet here he was, and he looked every bit of 7 feet tall. When I looked it up later, Kilby was described as 6-6. I still say 7-0 if he was an inch. But maybe that's just because I knew he was a giant among men.

That was my brush with greatness, because Jack Kilby invented the integrated circuit (microchip); he invented the industry that employed my coworkers who didn't know or care who he was. It's uncommon for someone to win a Nobel Prize in Physics for INVENTING some tangible object -- but Kilby did. This book recounts that story of invention. There is a lot of news today about the chip industry, the chip shortage, the Chips Act, how the US and other countries have enacted laws and multibillion-dollar incentive programs to encourage the buildout of more chip manufacturing plants so as to not be so dependent on Taiwanese production. All of it, all of it, starts with Jack Kilby inventing the chip.

Or I should say, co-inventing it, for that too is part of the tale.

The book opens by painting the picture of a growing problem back in the 1950s -- the "tyranny of numbers". Those numbers referred to electronic components needed for computing. There was a growing need for ever more powerful computers. More computing power required more components. The vacuum tubes driving World War II computers were being replaced with the transistor. But even that was still a discrete piece you could hold in your hand, and needed to be wired up to other components to do anything useful. The invention of that transistor at AT&T in the late 40s did spark a revolution in dreams. A computer designer of the 1950s could draft circuit designs involving thousands of components for really space age devices. Problem is, such devices were impractical to make when thousands of components had to be hand-wired together in a cabinet. It was like Babbage's computer all over again. (Sneak peek -- the most modern microprocessors contain 100 billion components).

Enter Kilby, 1958. As a new hire at Texas Instruments in Dallas, Kilby struck on the idea of the "Monolithic integrated circuit" -- or IC. The chip. He built a prototype chip containing transistors, resistors, capacitors and other components in a circuit on a single piece of germanium. He was the first, in mid-1958, to solve the "integration" part of the puzzle -- lots of components could be built in place near each other in a proper design on a single piece of material. (I've seen the very chip on exhibit at the Bob Bullock Texas State History Museum in Austin.)

But there was a problem with connecting those on-chip components together to make a functioning circuit -- this was the "interconnection" problem. Kilby's chip was jury-rigged, with the components connected carefully by wire, by hand. This was not something yet that could be mass-produced. Working on this slowed TI's progress as they got their patent process started.

While the outside world knew nothing of what was afoot at TI, Robert Noyce of Fairchild Semiconductor in Silicon Valley struck on the monolithic idea also, and designed in his lab notebook his own IC. This occurred 6 months after Kilby made his prototype, but was an independent invention. And with feedback from coworkers, Noyce attacked the interconnect problem and so also made notes about how conducting metal could be deposited on the chip in the same way components were. This was the key to mass-production.

Kilby was first with the integration solution; Noyce came later with that same idea but also solved the interconnection problem.

As the IC industry took off in the 1960s following these innovations -- Fairchild with the first commercial chip, then TI -- one might expect Noyce and Kilby to have entered a bitter lifelong rivalry for credit and glory, as happened with Newton and Leibniz and a thousand other cases of disputed priority. There WAS a corporate patent dispute between Noyce's Fairchild Semiconductor and Texas Instruments. No one disputed that Kilby was first chronologically; but Fairchild argued that the interconnection solution was so important as to be an essential part of the invention, and it came from Noyce.

This dragged on for years, because at first it seemed the winner stood to control a booming new IC industry. But while the lawyers and courts dragged on, the real world moved on more cooperatively. With regard to the industry, Fairchild, TI and other leading electronics firms agreed to cross-licensing deals regardless of the final outcome of the patent dispute -- this was too important to leave to the lawyers and courts. As for the individuals, Kilby and Noyce upset all expectations by being thoroughly decent guys, quickly accepting their status as co-inventors of the chip, and being gracious and generous in giving credit to each other to the end of their days. The technology world followed their lead and so they have universally been regarded as co-inventors of the chip.

The IC-invention arc covers the first half of the book. The remainder covers subsequent events and technical primers.

Noyce left Fairchild in 1968 to become cofounder and first CEO of Intel -- maybe you've heard of it. The departure of Noyce and his California team from Fairchild is why you've heard of Intel but never heard of Fairchild. Intel created the market for memory chips, invented the microprocessor, and became the dominant microprocessor supplier to the personal computer market. Meanwhile, the first consumer mass-market for ICs arrived with the handheld calculator -- invented at Texas Instruments by a certain six-and-a-half foot tall engineer who wasn't done inventing, along with some colleagues. THIS Kilby patent was not contested.

Kilby & Noyce shared various awards until Noyce died in 1990 -- which is the only reason he did not share the Nobel with Kilby in 2000, as it's not awarded posthumously.

The original edition I read in the 90s had been written in 1985, when both Kilby and Noyce were still alive. This was an updated version from 2001 after Kilby had won the Nobel. The 2001 edition offers a key update. The 1985 book chronicled the rise of competition from Japan -- thanks in large part to their famous post-war embrace of the teachings of American W. Edwards Deming promoting manufacturing quality (a whole 'nother entire issue I was obsessed with in the 1990s). The 1985 edition left off with things looking dire for the US industry. The updated edition however tells of the US resurgence to dominance going into the 2000s. This resurgence was thanks in large part to the final phase of Noyce's career, as he became the CEO of Sematech, a consortium of US chipmakers dedicated to advancing the US industry.

And to bring this review full circle, Sematech was based in Austin -- where I myself went to work for a chipmaker a couple years after first reading this book, where Noyce died one Sunday morning of a heart attack in 1990, and where I saw Kilby speak.
Edvinas Litvinas
10 reviews · 8 followers
July 18, 2022
This is the book I was looking for. The semiconductor industry was a black box for me. Now at least I get what is going on. I would have preferred technicalities instead of human drama in some parts of the book though.
Harry Harman
726 reviews · 14 followers
December 12, 2021
a careful, deliberate way of thinking.

exuded the easy self-assurance of a jet pilot, Noyce had an unbounded curiosity that led him, at one time or another, to take up hobbies ranging from madrigal singing to flying seaplanes. His doctorate was in physics, and his technical specialty was photolithography, an exotic process for printing circuit boards that required state-of-the-art knowledge of photography, chemistry, and circuit design. Like Jack Kilby, Noyce preferred to direct his powerful intelligence at specific problems that needed solving

Unlike the quiet, introverted Kilby, who does his best work alone, thinking carefully through a problem, Noyce was an outgoing, loquacious, impulsive inventor who needed somebody to listen to his ideas and point out the ones that couldn’t possibly work. That winter, Noyce’s main sounding board was his friend Gordon Moore, a thoughtful, cautious physical chemist who was another cofounder of Fairchild Semiconductor.

tubes kept burning out in the middle of its computations.

The warmth and the soft glow of the tubes also attracted moths, which would fly through ENIAC’s innards and cause short circuits. Ever since, the process of fixing computer problems has been known as debugging.

The transistor, in contrast, was a breakthrough that ordinary people could use. The transistorized portable radio, introduced just in time for Christmas 1954, almost instantly became the most popular new product in retail history. ($49.95)

There are certain standard components—nouns, verbs, adjectives in a sentence; resistors, capacitors, diodes, and transistors in a circuit—each with its own function. A resistor is a nozzle that restricts the flow of electricity, giving the circuit designer precise control of the current flow at any point. The volume control on a TV set is really a resistance control. Adjusting the volume adjusts a resistor; the nozzle tightens, restricting the flow of current to the speaker and thus reducing the sound level. A capacitor is a sponge that absorbs electrical energy and releases it, gradually or all at once, as needed. A capacitor inside a camera soaks up power from a small battery and then dumps it out in a sudden burst forceful enough to fire the flashbulb. If you have to wait until the indicator light on your camera blinks to tell you that the flash is ready to use, you’re really waiting for the capacitor inside to soak up enough energy to make the thing flash. A diode is a dam that blocks current under some conditions and opens it to let electricity flow when the conditions change. An electric eye is a beam of light focused on a diode. A burglar who steps through the beam blocks the light to the diode, opening the dam to let current flow through to a noisy alarm. A transistor is a faucet. It can turn current flow on and off—and thus send digital signals pouring through the circuitry of a computer—or turn up the flow to amplify the sound coming from a radio.
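
The wait for the camera's ready light in that passage is just the capacitor's charging curve. A minimal sketch with made-up component values (none of these numbers come from the book): the voltage follows V(t) = V_supply * (1 - exp(-t / (R * C))).

import math

V_SUPPLY = 300.0        # volts from the flash's step-up circuit (assumed)
R = 10_000.0            # charging resistance in ohms (assumed)
C = 120e-6              # capacitance in farads (assumed)
READY_FRACTION = 0.95   # "ready light" threshold (assumed)

def capacitor_voltage(t: float) -> float:
    """Capacitor voltage t seconds after charging starts."""
    return V_SUPPLY * (1.0 - math.exp(-t / (R * C)))

t = 0.0
while capacitor_voltage(t) < READY_FRACTION * V_SUPPLY:
    t += 0.01
print(f"Flash ready after roughly {t:.2f} s at {capacitor_voltage(t):.0f} V")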

‘the tyranny of numbers.’

On the assembly lines, the women who soldered circuits together—it was almost entirely women’s work, because male hands were considered too big, too clumsy, and too expensive for such intricate and time-consuming tasks—now had to pick up miniature components and minute lengths of wire with tweezers and join them under a magnifying glass with a soldering tool the size of a toothpick.

To enhance reliability, the designers tried redundancy, like a car built with two front axles just in case one should snap in half on the road.

A kid playing Super Zaxxon in the arcade needs to destroy an enemy base; to do it, he pushes the “Fire” button. The machine has to work through dozens of separate yes-or-no steps just to figure out that the button was pushed. At a billion times per second—completing one step of the problem every nanosecond—they become the foundation of a revolution that has swept the world.

The wires in an electric circuit tend to slow things down. The transistors in a computer switch on and off in response to electronic signals. A pulse of electricity moving through a wire reaches the transistor, and the transistor switches on; another pulse comes along, and the transistor switches off. No matter how quickly the transistor itself can switch, it cannot do so until the pulse arrives telling it what to do.

To increase computing speed, it was necessary to reduce the distance the messenger pulses had to travel— that is, to make the circuits smaller. But smaller circuits meant decreased capacity. The result was a paradox.
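
Rough numbers make the paradox concrete. A small sketch with an assumed propagation speed (roughly two-thirds the speed of light in a wire; not a figure from the book): at nanosecond switching times, even a few centimetres of wiring cost a meaningful fraction of a step.

SIGNAL_SPEED = 2.0e8  # metres per second, assumed propagation speed in a wire

for length_cm in (30.0, 3.0, 0.3, 0.003):
    delay_ns = (length_cm / 100.0) / SIGNAL_SPEED * 1e9
    print(f"{length_cm:>7} cm of wiring -> {delay_ns:.4f} ns of travel time")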

Some of the most crucial inventions and discoveries of the modern world have come about through basic research—that is, work that was not directed toward any particular use. Albert Einstein’s picture of the universe, Alexander Fleming’s discovery of penicillin, Niels Bohr’s blueprint of the atomic nucleus, the Watson-Crick “double helix” model of DNA—all these have had enormous practical implications, but they all came out of basic research. There are just as many basic tools of modern life—the electric light, the telephone, vitamin pills, the Internet—that resulted from a clearly focused effort to solve a particular problem. In a sense, this distinction between basic and directed research encompasses the difference between science and engineering. Scientists, on the whole, are driven by the thirst for knowledge; their motivation, as the Nobel laureate Richard Feynman put it, is “the joy of finding things out.” Engineers, in contrast, are solution driven. Their joy is making things work.

“Integrated circuits are the crude oil of the eighties.”

Among the latter is a humorous, or perhaps quasi-humorous, principle sometimes referred to as “the law of the most famous.” Briefly put, this natural law holds that whenever a group of investigators makes an important discovery, the most famous member of the group will get all the credit.

The work at Menlo Park led, fourteen years later, to the experiment known as “the zero hour of modern physics”—the discovery of the electron— and from there, along a more or less straight line, to wireless telegraphy, radio, television, and the first generation of digital computers.

“Well, I’m not a scientist,” the Wizard of Menlo Park said. “I measure everything I do by the size of the silver dollar. If it don’t come up to that standard then I know it’s no good.”

to build a better life for his fellow man—and get rich in the process. It was an archetypal American picture. Edison set out at the age of twelve to make his fortune. He sold snacks on the Detroit–Port Huron train. He started a newspaper called Paul Pry. By his thirty-fifth birthday, Edison was a millionaire, a leader of industry, and probably the best-known man on earth. When he announced early in 1878 that he might try to perfect an electric light, illuminating gas stocks plummeted on Wall Street.

Struggling to find an efficient filament for his incandescent light, Edison decided to try everything on earth until something worked. He made a filament from dried grass, but that went haywire. He tried aluminum, platinum, tungsten, tree bark, cat’s gut, horse’s hoof, man’s beard, and some 6,000 vegetable growths before finding a solution in a carbonized length of cotton thread.

The mystery of electricity had prompted a number of contradictory hypotheses. Early researchers had postulated that electricity was a fluid (which is why we still talk today of “current” and “flow”).

most fertile era in physics since Isaac Newton’s day

Already scientists had measured the mass of the smallest object in the universe—the hydrogen atom, weighing about .0000000000000000000000017 gram

He was British to the core. In his memoirs he notes with great pride that twenty-seven of his students (including his son) were elected to the Royal Academy; as an aside, he mentions that seven of them (including his son) also picked up Nobel Prizes. J.J.’s own Nobel Prize, in 1906, seems to have satisfied him less than the knighthood he received two years later. When he died, at eighty-four, in 1940, he was buried in Westminster Abbey near the grave of Isaac Newton.

Hard-working, highly disciplined, extremely demanding of himself and those around him, Fleming was determined that everything about his lectures should be perfect—he rehearsed with a stopwatch so that every word and gesture would come at the right second

By marking where the returning beam came from, and measuring how long its round trip had taken, the British defenders could tell their fighters where to intercept the enemy.

scientific work, one experimental and one theoretical

The P-N junction works like the turnstile you pass through when you enter the subway or a stadium: you can go through easily in one direction but not the other
LSEsBooks
43 reviews · 1 follower
September 11, 2020
An outstanding book. I want to keep exploring microelectronics. A super interesting field.
127 reviews · 2 followers
October 10, 2022
Loved it -- the author makes difficult electrical engineering principles easy to understand.
Simon Eskildsen
215 reviews · 1,081 followers
December 2, 2019
Okay, Shockley & Co. gave us the magnificent transistor in 1947, but how did we get from there to the general-purpose, spreadsheet-wrangling CPUs we have today? You can think of a transistor as a hose with an electric clamp. The clamp lets water flow through it only when electricity goes to the clamp. If you stop sending electricity to the clamp, the flow of water stops.

With the electric-clamp, we can now build circuits. Imagine a hose with two clamps next to each other, A and B:

----A----B----

If we send current through A, but not B, then put water through the hose, we won't get any water out:

====A====B----

But if we send current through A _and_ B, we'll get water out the other side:

====A====B====

That's called an "AND" gate: a signal only passes through if both the 'clamps' (transistors) have power. We can construct other simple gates like OR gates and NAND gates this way. One of the modern heroes of computing from Bell Labs, Claude Shannon, wrote a thesis showing how we can build general-purpose machines by turning any problem into a combination of Boolean functions. That means we can solve any problem by putting enough transistors together, because we can form those gates with transistors. It means you can multiply numbers, add them together, and so on and so forth, with transistors.
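
The same idea is easy to sketch in code. A minimal illustration with hypothetical helper names (not anything from the book or the review): model a transistor as a switch that conducts only while its control input is powered, build gates from it, and compose the gates into a half adder that adds two one-bit numbers.

def transistor(control: bool, signal: bool) -> bool:
    """A switch: conducts the incoming signal only while its control input is powered."""
    return signal and control

def AND(a: bool, b: bool) -> bool:
    # Two 'clamps' in series on one hose: the signal must get past both.
    return transistor(b, transistor(a, True))

def OR(a: bool, b: bool) -> bool:
    # Two hoses in parallel: either powered clamp lets the signal through.
    return transistor(a, True) or transistor(b, True)

def NAND(a: bool, b: bool) -> bool:
    return not AND(a, b)

def XOR(a: bool, b: bool) -> bool:
    return AND(OR(a, b), NAND(a, b))

def half_adder(a: bool, b: bool):
    """Add two one-bit numbers: arithmetic reduced to gates, gates reduced to switches."""
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> carry {int(c)}, sum {int(s)}")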

This worked well to create computers for a while, but as transistors shrank and we continued to want them to get faster, a bottleneck crept in: wiring all those transistors together without error became near-impossible past a certain number of transistors. That threshold was reached in the 1960s. The problem was called "The Tyranny of Numbers." The circuit engineers couldn't fathom how we'd put more transistors into our machines without error.

However, around this time, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) both independently came up with the idea of the integrated circuit: instead of wiring individual transistors together, fabricate them together on a single piece of semiconductor and lay the connections down on its oxide-coated surface. This meant no more tiny wiring. It meant much faster transistors, because they could get even smaller. The Tyranny of Numbers was a solved problem. Of course, since two people came up with it around the same time, legal hell ensued. That part is less interesting.

Robert Noyce, Gordon Moore, Andy Grove, and others left Fairchild Semiconductor, where they had worked on integrated circuits, and co-founded Intel. There, they made another magnificent leap into the world of microprocessors. Before this time, each customer wanted their own custom-designed chip. But now chips started becoming available that were general-purpose, like the Intel 8080.

And there you have it, stage for the Computer Age set!

Lovely book, especially on the heels of the one on Bell Labs.
Marcos
22 reviews · 1 follower
February 10, 2012
The Chip recounts a fascinating story of two relatively unknown men who changed the course of modern civilization... really. Although working for different companies, many miles apart, they simultaneously came up with the monolithic idea, a basic blueprint for the modern microchip. This concept overcame the last remaining limit on the advancement of processing power, known as the tyranny of numbers.

The book also shows the importance of government support for new industries as the only way to overcome the chicken-and-egg problem. Libertarians, who still don't seem to understand that even necessary investments may not be made by companies if they don't see a quick return, should tell me how much longer it would have taken the microchip to become as common as apple pie, or whether it would have survived at all.
33 reviews · 20 followers
November 6, 2014
Awesome: a must read for anyone in the computer industry. Before software could eat the world, we needed the chip, and this book tells the story of the people who invented it.
Dave Cremins
53 reviews
October 14, 2020
The definitive account of the micro-processor and how it has shaped almost all aspects of life. I'm sad that I have finished this book. What a read!!!
December 31, 2023
Hello there nerds!
Okok, yeah, this book might also be useful for non-nerd people who would like to know more about computers and their story. And it's a pretty damn good book for readers like that!

A pretty cool and eye-opening book for somebody who was not yet born at the time of the transistor effect's discovery and is just curious about how computers came to life.

On the one hand, this book gives a nice introduction to (or reminder of) physics and electronics, and explains how integrated circuits were invented and why there was a need for them, but the author nevertheless focuses mostly on the people who made the inventions and their everyday work.
That Shockley guy from the diode, Fairchild, Texas Instruments, Bob Noyce and Jack Kilby, Moore's law -- if those terms only faintly ring a bell for you, then after reading, the people involved in the chip revolution will feel a lot closer to you. What I additionally liked was getting properly introduced to the timeline of inventions, how people cooperated in new companies, how they started new companies, and what a patent war can look like.

I wonder what the situation was in other countries -- this book mentions only the USA and, briefly, Japan (referring also to the quality of production). I would really like to read a similar story, but one which also connects to the space missions.

Because Apollo 11 (allegedly xD) happened in 1969, while the first Bell Labs MOS transistors appeared in 1967 and TI's handheld calculator in 1972. So much happened, one thing after another, and more was still happening (radio and TV).

And now it's everyday life, which is hard to be imagined without the technology.
Spoiler alert:
When the book ends, the author gives examples of asking regular people from Texas and Dallas, home of the guys who solved the tyranny of numbers problem -- Jack Kilby and Bob Noyce, who invented the integrated circuits that are in the products those people use all the time. It seems like people didn't know. Also, the engineers were not that pushy about becoming more famous. And here lies the question: should engineers be more eager to become popular, to increase the drive for the technology further? Is it even important, or is just being a happy consumer all there is to it?
17 reviews
July 10, 2023
If you are a normal person, this is a very boring book. I, however, loved this book. If you are interested in semiconductors and its history, this is a great book. If you are wanting to gift someone this book, think “are they really interested in semiconductors and the history of semiconductors,” and if the answer is anything short of “YES!!!” then don’t buy this book for them.

Once I started reading, I struggled to put The Chip down (it also helped that I had jury duty one day and this was the only thing I could read). Reid does a great job of intermixing the historical narrative alongside describing each of the personalities that contributed to the blossoming of the semiconductor industry. Unlike other books about history, The Chip feels more humanistic. The innovators are described in both positive and negative attributes with a brush of Reid’s humor that kept me engaged. (Although the humor at times is clearly male humor, I still loved the jokes.)

Aside from semiconductors, The Chip taught me many life lessons and the complex nature of innovation. If you want to learn something and be entertained at the same time (and you like semiconductors and history), then this is the book for you.
Anusha Datar
228 reviews · 4 followers
July 8, 2023
This is a comprehensive summary of the history of the American semiconductor industry. Reid carefully explores the personal histories of key players, the evolution of technological progress, and the impacts that those players and their inventions have at large. I was impressed by Reid's ability to clearly explain complex electrical engineering and computer architecture concepts, and I appreciated that he covered personal histories and technological histories with a similar amount of rigor. I've read a few books about the history of semiconductor development in the US and often have found them to be lopsided in favor of focusing on drama between the individuals involved or just a play-by-play of technological advances, but I thought Reid hit a great middle ground.

This book definitely focuses heavily on Noyce and Kilby (as the title suggests), and the author speaks from some strong assumptions (unilaterally pretty pro-government and pro-technology) without defending them. While it did not really take away from my reading of this book, I do wish there had been a little more nuance. Either way, I enjoyed reading this and would recommend this pretty highly as a primer or history.
Aaron Sabin
41 reviews · 1 follower
August 3, 2023
This is one of my favorite books I've ever read - it inspired me greatly and was a pleasure to read. Reid deftly and compellingly lays out the history of scientific and technological developments that led to the invention of the microchip ("the monolithic idea"), from the discovery of electrons to the invention of the transistor. The story is richly told with details on every contributing step.

The invention formula outlined in this book is the following: there is no specific personality type that breeds invention. Rather, the most important qualities are technological optimism (the belief that any technological problem can be solved given the proper effort and approach) and a willingness to look past obvious potential solutions and embrace the "nonobvious." We also learn that the original inventors of a technology may have as little clue as the rest of us about its potential to come down the cost curve. Noyce and Kilby never could have predicted that through relentless improvement, the microchip would become so densely crowded with transistors, so cheap, and so reliable. Even Gordon Moore, who coined Moore's Law, made his original comment off the cuff and facetiously.
95 reviews
February 17, 2022
Very good book about the history of the microprocessor. The author does a great job of explaining vacuum tubes, resistors, capacitors, logic gates, etc. and tracing their importance through to our everyday life. For example, vacuum tubes made TVs and large cabinet radios possible for Americans to own and have right in our living rooms. Then transistors made pocket radios possible. These were life-changing discoveries, and eventually they led to Jack Kilby and Robert Noyce inventing the monolithic integrated circuit: basically, you could take a number of individual electronic components and put them together on one teeny, tiny integrated circuit, or chip.

Thank goodness for smart, dedicated geniuses like Kilby and Noyce. They changed the course of history, and allowed me to have an amazing career in networking and software engineering.

This book should be required reading for aspiring students in high school or college.
William Yip
324 reviews · 3 followers
June 30, 2023
The title was misleading; less than half of the book covered the inventors as the rest was devoted to the discoveries and inventions before the microchip, how computers operate, and competition from Japan. The author also repeated himself frequently. That said, the book gave a good history of the microchip and its effect on the world as well as details of the two inventors. It is amazing how the chip enables all of modern life. The author was correct that cutting-edge technology becomes outdated and obsolete as more technology is invented. The author despaired that Kilby and Noyce never became household names but, fortunately, engineers now are gaining the limelight with the advent of Elon Musk, Sam Altman, and other brilliant people.
212 reviews · 3 followers
February 5, 2020
Readable, incredibly informative for the non-engineer

Reid's book is clear and understandable, but delves into electronics in general, and the integrated circuit in particular. He begins with Jack Kilby and Bob Noyce and the monolithic idea, but then the story goes back in time, covering all that led to the microchip. I particularly liked that he focused on who invented rather than what was invented.

Reid treated his subject much like Kilby and Noyce treated theirs: he took the problem and found a solution. Kilby and Noyce needed a solution to the tyranny of numbers. Reid needed to get behind the machine to the human creativity and innovation. Both solutions are elegant.
Lee
59 reviews
April 14, 2021
first half contains satisfying history and explanation of the transistor and microchip, with some focus on the personalities and the problem-solving mindset of engineers that distinguishes them from scientists. as the book zooms out in the second half to later developments of the microchip industry it is less interesting, more skim-worthy. it is curious that jack kilby is so well-known in japan, and it does seem like a credit to them, as the celebrity of gates/bezos/musk etc is surely more to do with the dazzle of their wealth than anything of substance. the book was originally published in 1984 but only occasionally shows its age: eg, tvs no longer have cathode ray tubes.
January 29, 2020
The Third Revolution

Important technical events, ably presented, that created much of what we call modern: calculators, cellphones, computers, and even the innards of auto engines. If you know generally about such technology, you'd really enjoy this book. If you don't already know a little about such relationships, you're probably too dumb to understand it and should find some simpler book with crayons.
Straker
324 reviews · 5 followers
February 2, 2020
Call it 3.5 stars. The first two-thirds of the book, which covers the progression from vacuum tubes to transistors to microchips, is quite interesting. The last third however drifts into subjects like the inner workings of a pocket calculator, the invention of television, and the rise of Japan in the 1970s and 1980s, all of which are only vaguely related to the main topic and which feel very much like padding. There's a better book waiting to be written about this subject.
204 reviews · 1 follower
February 9, 2020
If this book’s title sparks any interest for you, it’ll likely exceed your expectations; it did mine. It covers essential technological evolutionary steps that are foundational to the chip’s creation, the creation of the chip and all that has followed, and especially the personalities and their trials, challenges, and successes. A thorough explanation of the essence of how computers operate is included. The book does a fine job of honoring Jack Kilby and Robert Noyce, the chip's inventors.
Rick Norman
5 reviews
November 8, 2020
I loved this book! Fascinating story about the dawn of computing in Texas and a sleepy little suburb of San Francisco, and all the amazing stories and characters that eventually made Silicon Valley. T.R. Reid tells an epic story, from vacuum tubes and the tyranny of numbers to silicon wafers and the space program, in an easy, laid-back style and describes it in layman's terms so it never becomes too technical or tedious.
267 reviews
March 30, 2021
The first 40 pages are a turn-off; there are too many facts and too much science, but the next 160 pages are a gem. The history of the chip and how it started was something I was not aware of. Sitting at the beach, reading this book was an absolute joy, learning about the microchip and how it came to revolutionize the world. I wish the book had an update; so much has happened since the 1980s, and it feels a bit outdated in a world that moves so fast.
34 reviews
November 4, 2021
Not enough history and too many fumbled attempts at explaining the functioning and technicalities of computers. For a good layman's treatment of this, Charles Petzold's 'Code' is the gold standard.

Only read if you are specifically interested in what Noyce and Kilby were like, and are also curious about how the nascent chip industry functioned. If the book focused more on these two aspects, it would've been a 4 or a 5.
Dennis McClure
Author · 4 books · 18 followers
July 24, 2018
First of all, the subjects of this book--Kilby and Noyce--stood the world on its head. And very few people have ever heard of them. Reid did his best to change that. He couldn't, but the failure is ours, not his.

Second, Reid is a marvelous writer with a striking ability to render technical subjects understandable.

It's an old book. But you need to read it.
Terri Chapin
67 reviews · 1 follower
January 26, 2020
This book fascinated me! After my more than 20 years of working in the semiconductor industry, T.R. Reid explained in a satisfying way just how these components I've been making actually work! And who knew that I could understand Boolean logic! A little dated at this point, but the historical facets are nonetheless intact. I'd love to read an updated version!
