Feynman Lectures On Computation

When, in 1984–86, Richard P. Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science such as reversible logic gates and quantum computers.

320 pages, Paperback

First published January 1, 1996

About the author

Richard P. Feynman

269 books · 5,962 followers
Richard Phillips Feynman was an American physicist known for the path integral formulation of quantum mechanics, the theory of quantum electrodynamics and the physics of the superfluidity of supercooled liquid helium, as well as work in particle physics (he proposed the parton model). For his contributions to the development of quantum electrodynamics, Feynman was a joint recipient of the Nobel Prize in Physics in 1965, together with Julian Schwinger and Sin-Itiro Tomonaga. Feynman developed a widely used pictorial representation scheme for the mathematical expressions governing the behavior of subatomic particles, which later became known as Feynman diagrams. During his lifetime and after his death, Feynman became one of the most publicly known scientists in the world.

He assisted in the development of the atomic bomb and was a member of the panel that investigated the Space Shuttle Challenger disaster. In addition to his work in theoretical physics, Feynman has been credited with pioneering the field of quantum computing, and introducing the concept of nanotechnology (creation of devices at the molecular scale). He held the Richard Chace Tolman professorship in theoretical physics at Caltech.

(Source: Wikipedia)

Ratings & Reviews

Community Reviews

5 stars: 206 (45%)
4 stars: 169 (36%)
3 stars: 73 (15%)
2 stars: 5 (1%)
1 star: 4 (<1%)
Profile Image for DJ.
317 reviews · 246 followers
October 1, 2010
There's a reason Richard Feynman is the most famous physics lecturer of all time. No, it's not because he held his office hours in a strip club (though he did) or that he helped develop the atomic bomb (though he did) or that he openly abused drugs, attended nudist gatherings, and played the bongos (though he did). Surely these have contributed to his legend but, most importantly, RPF was a master of the analogy.

Warning: Impending Tangent on Science Education and Modeling
Science education lends itself very well to the analogy. Consider the following. It seems most intuitive that a student wanting to learn a topic should want the most straightforward and realistic explanation possible. The student might say, "Tell me exactly how it really works." Unfortunately for the student, that's not always the best approach. First of all, science does not tell us exactly how things work. Science gives us models that act similarly enough to what we're interested in that the models make useful and accurate predictions. And where do we get the inspiration for these models? From our everyday experience and intuition! Fortunately for us, the universe seems to have a beautiful mathematical structure to it, and many different systems in nature seem to follow roughly the same models. What does this mean for the student? It means that often the best explanation of a physical phenomenon will, instead of focusing solely on that phenomenon, touch on related yet more familiar phenomena that follow similar dynamics. In other words, the student should say, "Tell me how something similar but more familiar to me works, then connect them." The ability to appropriately cite these related phenomena is the mark of a truly great teacher and was a staple of RPF's lecture style.
Tangent Over

Undergraduate "computer science" education in the US has unfortunately come to mean "database manager training." The Feynman Lectures on Computation are the perfect flotation device for any disheartened, theory-loving, future mathematician or computer scientist drowning in the overwhelming sea of code that is the path to a B.S. in computer science.

Feynman begins with the question "Exactly what does a computer do?" and offers a wonderful analogy of simpleton file clerks shuttling papers back and forth. From there, he takes the reader on a tour through basic gates and operations, reversible computing, the theory of computation, "Mr. Turing's machines", computability and the halting problem, coding and information theory, thermodynamics, exotic forms of computation, the physics of transistors and other components, and the physical limits of computation. Though many books play it safe and treat only the most established theories and ideas, Feynman isn't afraid to pose current (circa 1983) research questions and his work-in-progress solutions. Feynman's primary interests are in exploring just how far we can push computers given the laws of physics: how fast can they go, what can or cannot be computed, and how much energy must we use?

Despite the quality of the lectures, this book's finest feature is the exercises. Feynman frequently preaches the "pleasure of discovery" and embeds his lectures with creative, fun, and instructive exercises. In fact, the most memorable lesson I drew from this book was that an hour of thinking and playing with an idea is often worth more than 24 of reading about it. To those who don't see the point of solving problems that were solved decades or centuries before by others, Feynman offers the following wonderful characterization of science (paraphrased):

The life of a young scientist is spent rederiving old results, gradually rediscovering more and more recent ideas, until one day, he discovers something that no one else has ever discovered before. The key point is that without all that practice on "old" problems, it's insanely difficult to develop the skill and confidence to work on "new" ones.

The major weakness of this book is that parts of it are quite dated. Feynman gave his lectures in the early 80s and even by the time this book was published in 1996, much of the hardware physics was already archaic. However, the parts of the book on theory (the bulk) are still quite relevant. Even the dated bits are quite useful to simply get the flavor of how the laws of physics can be exploited to do useful computations. Most importantly, however, dated or not, this book is just plain fun to read.
Profile Image for Robert.
823 reviews · 44 followers
April 11, 2017
There is much that is meritorious here: Feynman's distinctive voice comes through clearly. One gets an insight into both his teaching philosophy and his working methods. The book heavily reflects what Feynman thought was important, interesting, and essential to know about the field, and makes accessible some really unusual topics as well as some familiar ones (if one has ever done an entry-level course on the subject). There is a 10-page memoir of Feynman by the book's editor at the end, which contains some delightful anecdotes that are not recorded elsewhere in the Feynman canon.

Feynman's working method, which he encourages others to adopt, was to work out as much as he could on his own first and look up what others had done afterwards. He would find that usually he had come up with no original results, but quite often he would have reached the same conclusions by an alternative route. Occasionally he proved something that was not known before. This technique is fabulous if one has both a wide knowledge (in memory) of physics and maths and a great facility with both, too. For lesser mortals it's completely useless.

The book oscillated from fascinating (reversible computing, quantum computers) to excruciatingly dull (logic circuit design, chip fabrication, semi-conductor device theory) depending on my personal level of interest. Even Feynman can't make engineering interesting to me! But that's not his fault; if you're into these topics it'll be great. If you're not, it's for Feynman completists only.
December 28, 2023
"Feynman Lectures on Computation" is an enjoyable read, but Feynman's conversational style may not suit everyone.

The majority of the content is both engaging and insightful. However, Chapter 6 on quantum computers left me slightly underwhelmed. Perhaps discussing this topic in the mid-1980s, when the lectures were given, felt premature. Despite these reservations, the book overall offers a solid foundation in core computer science concepts.

Profile Image for Will.
Author 8 books · 33 followers
December 19, 2007
Somewhat of a mixed bag. The first half is very interesting, then kind of loses steam towards the end. It seems like some course lecture notes were somewhat quickly tossed together to make a book; this could have benefited from a more in-depth going over by Feynman to smooth out some rough edges. Overall worth reading for a unique physicist's view of computability, but don't expect it to be up to Feynman's usual standard of quality.
366 reviews · 29 followers
July 30, 2023
I found this book an odd mix of material I already knew, interesting material that was new to me, and material too advanced for me to follow. Overall I found it worthwhile because of the middle category. I particularly enjoyed the brief discussion of how logic gates might be implemented, the section on reversible computing, the description of how to implement a semiconductor out of doped silicon, and the discussion of how to fit all the wires into a circuit. I'm not sure who, if anyone, to recommend this book to. I'm tempted to suggest both a physics and computer science background to be able to follow everything, but that's just the group least likely to learn anything. I guess I'll settle for recommending this to clones of myself.
Profile Image for Roberto Rigolin F Lopes.
363 reviews · 104 followers
September 9, 2017
Wild academic interests together with an empowering teaching methodology. Emphasis on empowering! Feynman is heavily armed with wit and tease, pushing you to enjoy yourself rediscovering things using your own means. This is the only way you can appreciate other people's ideas, really understand things, and have fun learning. At the very beginning, he tells an amusing story about a strange sequence of numbers that he once “discovered”, only to learn afterwards that a clever guy called Bernoulli had discovered it somewhere in the 17th century! But he kept going, rediscovering things from the 18th century and so on, until someday he figured out something no one had ever thought about before. Then you can start calling yourself a computer scientist, he boldly concluded.
Profile Image for Nicholas Teague.
69 reviews · 15 followers
November 3, 2014
Brilliant overview of key issues for computation. Obviously some of it is dated - this book is based on lectures given in '84-86 - but much of the material appears to me (speaking well outside of my domain of expertise) to still be relevant to modern models of computation. Just because pointing out trivial mathematical errors makes me feel like I have something worthwhile to add: the publisher should note that Fig. 6.3 has a bit misstated as 1 instead of 0 :)
16 reviews
November 8, 2013
As a computer scientist I am continually amazed at the deep understanding Feynman had of computers at every level, even (especially!) the subatomic.
63 reviews · 3 followers
December 1, 2022
Feynman picks some of the most influential works that fall under the umbrella of the limits of what we can and cannot do with computers, like the Turing machine or Shannon's theorems, and studies them in his characteristic attack-from-many-points-of-view style. Incidentally, what we can and cannot do with computers also aligns, in a more fundamental way, with what we can and cannot know; perhaps that's what got him interested in the field. How he learns by trying things out from small hints before looking up others' work is fascinating. I found the seemingly bittersweet relationship between the main editor and Feynman amusing :P

Some things I wanted to remember:

The analogy of a file clerk being able to perform simple operations fast but otherwise dumb is a potent thought experiment to understand how a computer works inside. That's the pivot. One can go up, down, or sideways from there.
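
As a toy illustration of that analogy (my own sketch, not something from the book), here is a tiny fetch-decode-execute loop in Python: the "clerk" reads one instruction card at a time, performs one trivial operation on numbered storage slots, and moves on. The instruction names are invented for this example.

```python
# A hypothetical "file clerk" machine: a program is a list of instruction cards,
# memory is a set of numbered slots, and the clerk does one dumb step at a time.
def run(program, memory):
    pc = 0                                # which card the clerk is reading
    while pc < len(program):
        op, *args = program[pc]
        if op == "ADD":                   # add slot b into slot a
            a, b = args
            memory[a] += memory[b]
        elif op == "JUMP_IF_ZERO":        # jump to another card if a slot is zero
            slot, target = args
            if memory[slot] == 0:
                pc = target
                continue
        elif op == "JUMP":                # unconditional jump
            (pc,) = args
            continue
        elif op == "HALT":
            break
        pc += 1
    return memory

# Multiply 6 * 7 by repeated addition: slot 0 accumulates, slot 1 counts down.
memory = {0: 0, 1: 6, 2: 7, 3: -1}
program = [
    ("JUMP_IF_ZERO", 1, 4),   # counter exhausted? go to HALT
    ("ADD", 0, 2),            # accumulator += 7
    ("ADD", 1, 3),            # counter += -1
    ("JUMP", 0),              # back to the test
    ("HALT",),
]
print(run(program, memory)[0])   # 42
```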

The set of logic gates consisting of AND and NOT is complete; so is the singleton set containing only the NAND gate.
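
A quick way to see that completeness claim is to build the other gates out of NAND alone; the little Python check below is my own sketch, not anything taken from the book.

```python
# Build NOT, AND, and OR from NAND alone, then verify all truth tables.
def nand(a, b):
    return 1 - (a & b)

def not_(a):          # NOT from a single NAND
    return nand(a, a)

def and_(a, b):       # AND = NOT(NAND)
    return not_(nand(a, b))

def or_(a, b):        # OR via De Morgan: a OR b = NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == 1 - a
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```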

Like NAND, the CCN gate (controlled-controlled-NOT, also known as the Toffoli gate) is complete, but unlike NAND it is also reversible.
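
A minimal sketch of that point (my own code, not the book's notation): CCN flips its target bit only when both control bits are 1; it is its own inverse, and with the target preset to 1 its output is the NAND of the controls, which is why it is universal.

```python
# CCN / Toffoli gate: (a, b, c) -> (a, b, c XOR (a AND b)).
def ccn(a, b, c):
    return a, b, c ^ (a & b)

# Reversible: applying the gate twice restores every input triple.
triples = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
for t in triples:
    assert ccn(*ccn(*t)) == t

# Universal: with the target bit preset to 1, the target output is NAND(a, b).
for a in (0, 1):
    for b in (0, 1):
        assert ccn(a, b, 1)[2] == 1 - (a & b)
```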

Cannot build a 1-bit memory with XOR due to oscillations from feedback; the flip-flop comes to the rescue. Delayed feedback may cause a flip-flop to work incorrectly; the master-slave flip-flop comes to the rescue. A shift register is a multi-bit memory store.
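
Here is a rough gate-level sketch (my own simplified settling model, not anything from the lectures) of a level-triggered D latch built from NANDs, plus a master-slave flip-flop made of two such latches so the output only changes when the clock falls.

```python
def nand(a, b):
    return 1 - (a & b)

class DLatch:
    """Level-triggered D latch: transparent while enable is 1, holds while 0."""
    def __init__(self):
        self.q, self.qn = 0, 1
    def step(self, d, enable):
        s = nand(d, enable)                 # active-low set
        r = nand(nand(d, d), enable)        # active-low reset (uses NOT d)
        for _ in range(4):                  # let the cross-coupled NANDs settle
            self.q = nand(s, self.qn)
            self.qn = nand(r, self.q)
        return self.q

class MasterSlaveFF:
    """Master follows d while the clock is high; slave copies it when the clock is low."""
    def __init__(self):
        self.master, self.slave = DLatch(), DLatch()
    def step(self, d, clock):
        m = self.master.step(d, clock)
        return self.slave.step(m, 1 - clock)

ff = MasterSlaveFF()
ff.step(1, 1)          # clock high: master captures 1, output still holds
print(ff.step(1, 0))   # clock low: slave copies the master -> prints 1
```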

Consistency of parentheses cannot be verified using an FSM. A Turing machine that can move in only one direction, either always left or always right, is essentially an FSM.
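
A small illustration of why a finite-state machine is not enough (my own sketch): checking parentheses needs a counter, or a stack, that can grow without bound, and a machine with k fixed states cannot tell nesting depth k apart from depth k+1.

```python
# Balanced-parenthesis check with an unbounded counter (exactly what an FSM lacks).
def balanced(s):
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1            # nesting depth is not bounded in advance
        elif ch == ")":
            depth -= 1
            if depth < 0:         # a ')' with no matching '('
                return False
    return depth == 0

print(balanced("(()(()))"))   # True
print(balanced("(()"))        # False
```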

A universal Turing machine imitates other Turing machines; this is useful for arguing that uncomputable functions exist.
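
The standard diagonal argument behind that claim can be sketched in a few lines; everything here is hypothetical by construction, since the point is that a perfect halts(program, input) cannot exist.

```python
# Suppose someone hands us a perfect halting checker `halts(source, input)`.
def make_contrarian(halts):
    def contrarian(source):
        if halts(source, source):   # "you will halt on yourself"
            while True:             # ...then deliberately loop forever
                pass
        return "halted"             # "you will loop forever" -> halt immediately
    return contrarian

# Running `contrarian` on its own source contradicts whatever `halts` predicts,
# so no such `halts` can exist.
```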

A parity bit can detect a single-bit error. The Hamming code is a generalized parity check with multiple parity bits; each parity bit narrows down where the error is in a divide-and-conquer manner. It can correct a single-bit error. To detect 2-bit errors, an overall parity bit is used.
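
A compact sketch of a Hamming(7,4) encoder and single-error corrector (my own toy implementation, using the common convention of parity bits at positions 1, 2, and 4, each covering the positions whose binary index contains that bit):

```python
def encode(d):                       # d = [d1, d2, d3, d4]
    w = [0] * 8                      # 1-indexed codeword; w[0] unused
    w[3], w[5], w[6], w[7] = d       # data bits go in the non-power-of-two slots
    for p in (1, 2, 4):              # parity bits live at positions 1, 2, 4
        w[p] = sum(w[i] for i in range(1, 8) if i & p) % 2
    return w[1:]

def correct(bits):                   # repair at most one flipped bit
    w = [0] + list(bits)
    syndrome = 0
    for p in (1, 2, 4):
        if sum(w[i] for i in range(1, 8) if i & p) % 2:
            syndrome += p            # each failing check narrows down the error position
    if syndrome:
        w[syndrome] ^= 1             # the syndrome is the position of the bad bit
    return w[1:]

sent = encode([1, 0, 1, 1])
received = sent[:]
received[4] ^= 1                     # corrupt one bit in transit
assert correct(received) == sent
```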

Shannon's theorem confirms that it is possible to communicate messages with arbitrary accuracy, so we can choose to correct up to n bits of errors such that n+1 bit errors are unlikely. The theorem also gives the lower bound on the number of parity bits required to get the probability of error down to, say, 10^-30 or any number we choose. It is interesting that a single-bit error can be detected with just 1 parity bit, but to detect even 2-bit errors the number of parity bits required becomes a function of the message length.
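
To make that last point concrete: for single-error correction, the r parity checks must distinguish "no error" from an error at any of the m + r positions, so 2^r >= m + r + 1. A tiny calculation (mine, not from the book) shows how slowly r grows with the message length m:

```python
# Minimum parity bits r for single-error correction of m data bits (Hamming bound).
def parity_bits_needed(m):
    r = 1
    while 2 ** r < m + r + 1:
        r += 1
    return r

for m in (4, 11, 26, 1000):
    print(m, parity_bits_needed(m))   # 4 -> 3, 11 -> 4, 26 -> 5, 1000 -> 10
```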

Looking at error-correcting schemes in terms of message spaces is revealing. There are two spaces: the smaller, original message space and the bigger, extended message space in which each original message has a number of parity bits inserted. Each point in the original message space maps to a sphere in the extended message space; the radius of the sphere is the number of bits of error we want to be able to correct. Now, if the spheres in the extended message space do not overlap, it is possible to backtrack from a corrupted message to its original version.
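
The sphere picture is easiest to see with the 3-bit repetition code (my own toy example): the two codewords 000 and 111 sit at Hamming distance 3, so radius-1 spheres around them do not overlap, and decoding to the nearest codeword undoes any single flip.

```python
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

codewords = ["000", "111"]

def decode(received):
    # pick the codeword whose sphere the received word fell into
    return min(codewords, key=lambda c: hamming_distance(c, received))

print(decode("010"))   # '000'  (one flip away from 000)
print(decode("110"))   # '111'  (one flip away from 111)
```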

Huffman coding, like Morse code, is an example of a non-uniform coding scheme. It saves space on average by assigning shorter codes to more frequent symbols. It is also uniquely decodable, because no symbol's code appears as a prefix of another symbol's code.
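
A short Huffman sketch using Python's heapq (the frequencies are made up): repeatedly merge the two least frequent subtrees, so rarer symbols end up deeper in the tree and every code stays prefix-free.

```python
import heapq

def huffman_codes(freqs):
    # heap entries: (total frequency, tiebreaker, {symbol: code-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)          # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

print(huffman_codes({"e": 45, "t": 20, "a": 15, "o": 12, "z": 8}))
# 'e' gets the shortest code; the rarer symbols get longer ones.
```
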
Profile Image for Anthony O'Connor.
Author 4 books · 25 followers
March 18, 2020
Superb

A superb introduction to the basics of the theory of computation by one of the greatest minds of the twentieth century. Right down to the basics. Explaining everything beautifully.
The chapters on reversible computation and the thermodynamics of computation have a bit more physics in them than you might be used to if coming from a purely comp sci background but it’s worth it.
Most fascinating of all is his last chapter on quantum computing. These were lectures from the early 80s. Feynman's focus was on how small and how efficient computers could become, right down to scales where they would necessarily be working quantum mechanically. His concern was ‘simply’ to design a machine that could do what a classical machine could do. He sketches the design of such a machine.
It must have been elsewhere that he (and others) were also beginning to suggest that quantum computers could very ... very significantly outperform classical computers for some types of computations. The rest is history.
Profile Image for Skylar.
224 reviews · 2 followers
February 15, 2021
There were some parts of this book that were gripping (thermodynamics of computing, theory of computation and languages), but unfortunately many parts were just dull (symbolic logic, intricacies of logic gates), and other parts were accessible to physicists interested in computer science but not vice versa.
Profile Image for Paige McLoughlin.
597 reviews · 32 followers
April 17, 2021
Feynman does a good job on computation, starting with ones and zeroes and logic gates and building up to the theory of computing, including universal Turing machines, Shannon's communication theory and entropy, reversible computing, quantum computers, and how silicon chips work. A good primer on computer technology that, even though written in the 1980s, is still relevant.
Profile Image for Samuel.
109 reviews
November 15, 2018
In the classic Feynman style, clearer and better than some lecturers in the field!
32 reviews · 1 follower
March 21, 2020
Excellent!
Did not understand the chapter about Quantum Computing.
Excellent explanations of the most fundamental elements in computers.
Profile Image for Carter.
597 reviews
January 11, 2022
There is much nuance in the simple presentation; other than that, I am not sure I can say too much. The additional chapters, perhaps the editor's idea, add nothing.
Profile Image for Mark Moon.
150 reviews · 109 followers
November 25, 2015
A lot of this seems pretty dated. The chapter on coding and information theory was a decent introduction, and the quantum computation material was interesting from a historical point of view, since Feynman seems to have been the first person to seriously think about quantum computers. The stuff on gates and wires and the physics of semiconductors wasn't very interesting to me, personally.
Profile Image for Dax.
72 reviews · 1 follower
September 14, 2007
Feynman's mind leaps across multiple branches of science to bring powerful insights into computing. I love his approach: he delves from programming into the physics of the transistor in one fell swoop - and that's all in a single page!
6 reviews · 11 followers
February 19, 2009
Short pieces on a range of computer science topics with a truly insightful approach.
3 reviews
June 8, 2010
This is a great introduction to the electronic underpinnings of modern microcomputer systems.
Profile Image for Chris B. .
66 reviews · 5 followers
March 9, 2014
Not very usable, and no big insights into computer science concepts.
1 review · 5 followers
November 29, 2016
I really like this book. It explains how computers work with helpful analogies. I recommend this book if you are interested in electronics, computers and computation.