Future Babble: Why Expert Predictions Fail - and Why We Believe Them Anyway

In 2008, as the price of oil surged above $140 a barrel, experts said it would soon hit $200; a few months later it plunged to $30. In 1967, they said the USSR would have one of the fastest-growing economies in the year 2000; in 2000, the USSR did not exist. In 1911, it was pronounced that there would be no more wars in Europe; we all know how that turned out. Face it, experts are about as accurate as dart-throwing monkeys. And yet every day we ask them to predict the future — everything from the weather to the likelihood of a catastrophic terrorist attack. Future Babble is the first book to examine this phenomenon, showing why our brains yearn for certainty about the future, why we are attracted to those who predict it confidently, and why it’s so easy for us to ignore the trail of outrageously wrong forecasts.

In this fast-paced, example-packed, sometimes darkly hilarious book, journalist Dan Gardner shows how seminal research by UC Berkeley professor Philip Tetlock proved that pundits who are more famous are less accurate — and the average expert is no more accurate than a flipped coin. Gardner also draws on current research in cognitive psychology, political science, and behavioral economics to discover something quite reassuring: The future is always uncertain, but the end is not always near.

320 pages, Hardcover

First published January 1, 2010


About the author

Dan Gardner

18 books · 83 followers
Dan Gardner is a journalist, author, and lecturer.

Trained in law (LL.B., Osgoode Hall Law School, class of ’92) and history (M.A., York University, ’95), Dan first worked as a policy advisor to the Premier of Ontario. In 1997, he joined the Ottawa Citizen. In the years that followed, he travelled widely, researching long features about drugs, criminal justice, torture, and other challenging issues. Later, he was a national affairs columnist until he left the newspaper in 2014. Dan’s writing at the Citizen won, or was nominated for, most major prizes in Canadian journalism, including the National Newspaper Award, the Michener Award, the Canadian Association of Journalists award, and the Amnesty International Canada Media Award for reporting on human rights. In January 2015, Dan became editor of Policy Options, Canada’s premier magazine of public policy, published by the Montreal-based Institute for Research on Public Policy.

In 2005, Dan attended a lecture by renowned psychologist Paul Slovic. It was a life-changing encounter. Fascinated by Slovic’s work, Dan immersed himself in the scientific literature. The result was a seminal book on risk perception, Risk: The Science and Politics of Fear. Published in 11 countries and 7 languages, the book was praised by leading researchers, including Slovic, for its scientific accuracy and lucid analysis of how psychology and social processes interact — causing us to fear what we should not and not fear what we should.

Dan’s second book, Future Babble, delved more deeply into psychology to explain why people continue to put so much stock in expert predictions despite the repeated — and sometimes catastrophic — failure of efforts to forecast the future. Again, Dan was delighted that his book garnered the praise of leading researchers, including Harvard’s Steven Pinker, who said it should be “required reading for journalists, politicians, academics, and those who listen to them.”

Both books resulted in invitations to give talks worldwide. Dan has lectured for Google, Siemens, Zurich Insurance, Khosla Ventures, and many more corporations and governments.

Next up: Superforecasting, co-written with Wharton psychologist Philip Tetlock, which examines the remarkable results of Tetlock’s latest research program. Superforecasting will be published in September 2015 by Crown in the United States, Random House in the United Kingdom, and McClelland and Stewart in Canada. There will also be German, Spanish, Russian, Chinese, Japanese, Korean, Portuguese, Dutch, and Finnish editions.

Ratings & Reviews


Community Reviews

5 stars: 171 (23%)
4 stars: 297 (41%)
3 stars: 183 (25%)
2 stars: 50 (7%)
1 star: 13 (1%)
Trevor
1,337 reviews · 22.7k followers
July 31, 2012
No, had to stop before this one even got started. I need to do something to control my blood pressure, and listening to idiots being smartarses isn't it. If you need to find out the meaning of hindsight bias, well, this is as good a book as any other.

The guy 'demolishes' predictions made by experts and thus shows the need to be skeptical - if not actually cynical. The problem is that the predictions he chooses to demolish are all a bit bizarre. There is the right-wing prediction that there would be weapons of mass destruction in Iraq - except that wasn't a prediction, that was a lie. There is a big difference there. There was the prediction that the US would be welcomed as liberators once in Iraq, and who would have predicted that wouldn't come to pass - to which the answer is, oh yeah, just about everyone.

But the failed prediction that made my blood boil was that crazy left-wing one that Reagan's push to grossly accelerate the arms race pushed the world to the brink of nuclear war. Only someone completely blinded by their ideological preferences could argue that the mass stockpiling of nuclear weapons actually makes the world a safer place. That we got through those years without being destroyed in an endless fireball and nuclear winter is down to good luck. It is utterly clear that Reagan was unconvinced by the science of the utter devastation of such a war and was doing all he could to push brinkmanship. We are lucky the Soviet Union chose to implode - that was not the most likely outcome, nor the one they were being pressed into taking by Reagan's policies. To present the very real concerns we had in the 1980s of near-immediate species suicide as a failed prediction is to misrepresent both the role of reasonable prediction and the very real dangers such weapons still present. It was bad enough when Reagan had his hand on the button, but would you choose to have Putin's hand there today? Or Iran's? Israel's? North Korea's? Pakistan's? India's? Excuse me for not feeling out of the woods yet, or feeling terribly certain of any prediction that nuclear war ending in species extinction isn't still on the cards.

If you want a reason to think the problems facing us aren't really problems at all because experts are always wrong, here is the book for you. Personally, I would rather act on the basis of the best available evidence in ways most likely to preserve our precious and all too delicate existence here on earth - even if that does mean being proven too conservative. The consequences of being proven too confident are too frightening to contemplate.
Unwisely
1,451 reviews · 15 followers
August 20, 2018
This was a good book and an engaging (but not too challenging) read. It is only suffering in my rating because I immediately followed it with Superforecasting: The Art and Science of Prediction, which covers the same topic (Dan Gardner co-authored it) and which I thought was better.

Still, I learned things. Like that several countries (including South Korea) did nothing about Y2K. Hunh. This book gets a little more into the cognitive psychology of why we (as a society) still listen to people with a terrible track record: people prefer to be dazzled with BS, and confidence is more impressive than cautious precision. Or, as the intro to the Nine Inch Nails track puts it, "I don't want knowledge, I want certainty!"

Anyway, not bad, but I would recommend the other book.
Aras
434 reviews · 3 followers
March 4, 2011
This was okay, but I only read about a third or so of the way through it; it was basically repeating the same idea over and over, and I can't imagine it adds anything more near the end. Ironically, I think the author makes statements way outside his expertise when he starts talking about how everything can be explained by Darwinian evolution and what life was like for primitive man - which can be generalized to such a degree that it becomes about as accurate as basing your theories of humanity on the horoscope.
Cara
779 reviews · 67 followers
June 4, 2014
I enjoyed this in part because I hate the fact that people (everyone on cable news) make stupid predictions all the time and are never held to account for their massive BS. The first part of the book mostly discussed various major predictions and how they had failed, and this got boring pretty fast. The second part explored why people make bad predictions and why they stick to them even when they're proved wrong (resolving cognitive dissonance), and this was much more interesting. Nothing groundbreaking here, but still worth a read if you're interested in that kind of stuff.
Josh Saleska
160 reviews · 2 followers
March 30, 2019
Dan reels off an exhaustive list of failed expert predictions throughout history. Some are egregiously inaccurate. This book is a more approachable "Black Swan". He also gives us the names of the experts to help us embody the targets of our mockery.

Throughout the book, Dan praises foxes and denounces hedgehogs (terms used to describe two types of thinkers). Foxes are malleable, fluid. They are able to live in uncertainty. They know they can be wrong. Hedgehogs live by a unifying theory of everything. They are unable to see their mistakes even in hindsight. They make arguments that are clear, simple, and wrong.

One of the most interesting bits in this book is the discussion of our profound aversion to uncertainty. People feel worse when something bad might occur than when they know for certain it will occur. Uncertainty spoils our illusion of control.

Hindsight should self-correct our reliance on predictions, but it never does. Experts forget or excuse past failed predictions with regularity, for various reasons. As a people, we crave certainty and are drawn to the experts who are most confident in their predictions of the future. Ironically, the most popular figures in the media are the loudest-voiced hedgehogs, and for this reason they are often the most incorrect. Our desire for certainty is what fuels the talking heads to take hard positions.

Be like the fox. Be fluid. Have humility. Don't be certain about your predictions. Know that you can be wrong.
Zac Scy
54 reviews · 19 followers
June 26, 2016
After having read "Superforecasting" by Tetlock & Gardner I wanted to delve deeper.

While I found this a worthwhile read, there was a lot of repetition. If you're looking for a supplement and some expansion on "foxes & hedgehogs", then I suggest you give it a go.

Otherwise, most of it is already covered in "Superforecasting".
Steven Kopp
133 reviews · 8 followers
June 21, 2017
Good: Convincingly showed how "expert prediction" is often wrong, and why. Called a lot of people out on their bad predictions, and their bad rationalizations for why they got them wrong.
Bad: Once the schadenfreude wore off, Gardner just felt like the same kind of pompous expert he was mocking.
Ugly: In some ways this book was an eye-opener for just how self-defeating modern sociology can be. The basic premise was this: 1) The world is super complex. 2) Our brains didn't evolve to understand and make predictions in this environment. Thus, most people simply have insurmountable biases (even a bias bias) which help them cope. But if Gardner is right (and most modern sociology books for that matter) and our minds aren't suited to this level of abstract and rational thought, why should we believe them? Why believe these experts but not the experts they mock? Do I only believe Gardner because he seems so confident? Etc. I'm not saying he's wrong, just that when you say that our minds aren't suited to this sort of task, you end up undermining your entire argument.
Carmen
210 reviews · 28 followers
March 10, 2018
Interesting and funny, this book delves into the world of expert predictions and human folly. It makes for an introspective view of the people I listen to in regards to current and future world events, and how I probably judge them more for their confidence and charm than their accuracy. Hindsight, confidence bias, memory; it all plays a part in how successful each expert and their adherents believe they are. Best quote: "Not one of them mentions that if they were as accurate as they are confident, they would be billionaires - and billionaires don't do talk shows."
77 reviews
December 6, 2020
Offers important lessons to explain why we consistently fail at predictions, but continue to make them and listen to those who do. The most useful part of the book, alongside its dose of skepticism about anyone claiming to see the future, is its introduction of various heuristics we can use to overcome our in-built biases.
Jessica Scott
Author · 57 books · 1,271 followers
October 14, 2019
Interesting look at the problems with future forecasting and wrong predictions about the future. A bit heavy on the psychology angle, which doesn’t really apply at the aggregate in complex social systems.
Tony
280 reviews · 1 follower
January 29, 2020
If you're interested in forecasting, there's not much useful advice here. If you're not, all it says is "don't trust forecasts", which is too reductive. Either way, skip this and read Superforecasting.
35 reviews
January 9, 2021
Well argued and written in detail - one more confirmation that prediction is hard, especially when it comes to predicting the future.
Perhaps the chapter on successful prediction methods could have been a little longer ...
Ramona Cantaragiu
1,104 reviews · 19 followers
August 4, 2022
I've grown tired of books like this which could easily be summarized in a couple of pages. If you can find the summary somewhere like Blinkist, then go with that. Btw, there is nothing new being said here, and it is also not being said in the form of a worthy story.
Miran
289 reviews
July 18, 2023
Interesting book with too many anecdotes and too much trivia to showcase the point that the future is unpredictable.
Among them there were a few nuggets of knowledge that I really appreciated. The last chapter is the best.
160 reviews · 4 followers
February 20, 2017
Excellent - shows how all predictions can be questioned :) and are almost always wrong
Caleb
51 reviews · 1 follower
May 28, 2019
A fantastic discussion of biases that different types of experts are prone to, and how to avoid them.
Randy Mcdonald
75 reviews · 13 followers
November 14, 2012
Dan Gardner's Future Babble: Why Expert Predictions Fail - and Why We Believe Them Anyway is one of those books that takes something obvious that nonetheless needs explaining--here, the tendency of futurologists of all kinds to make predictions which turn out false but whose opinions and methods are still valued--and explains why this tendency exists.

The central problem Gardner deals with is this. I like to know about what will happen in the future, you like to know what will happen, we all want to know. Will the Earth be deterraformed by climate change and other environmental catastrophes? What fashions will be in vogue in Paris and Moscow and New York City next year? Will nuclear war raze the Northern Hemisphere? Are the French really going to outnumber the Germans by 2050? When will we send a manned mission to Mars? Using ostensibly scientific frameworks, any number of smart people have created systems which aim to explain the future: Arnold Toynbee created a theory of civilizations that claimed to describe the past and predicted the creation of a totalitarian world-state, for instance, and Paul Ehrlich predicted mass famines in the 1970s. Neither prediction came to pass, and any number of other predictions by other people (smart or not) have also failed to come true. Why?

Chaos theory, Gardner points out, makes predictions which go too far out into the future impossible. As the Depeche Mode song goes, "everything counts in small amounts." I zig, here, and the next mayoral election in Toronto goes one way; I zag, there, I get hit by a car and never get elected ward councillor. Accounting for all the variables involved is impossible at the best of times, while the simplified theories used by these futurologists are even less capable. Certain predictions can be made in certain broad contexts--Gardner cites the knowledge that, based on births and migration this year, we know how many people will be 30 years old in 30 years' time, and we can speculate on their marital behaviour and fertility regimes--but that's it. This is not a new fact.

Why do we believe the people who claim to know what will happen? Put it down to our primate brains. We just aren't as perfectly rational as we'd like to think we are, with tendencies to overlook inconvenient facts. Toynbee had to hack his schema to account for the fact that Islamic civilization began--not ended--with a universal empire, while Ehrlich kept postponing his doomsday, saying that it would come. How did these gentlemen get away with this? They had tremendous charisma--with the population at large if not with people with enough knowledge to critique their theories--along with excellent presentation skills, good connections, and the certainty that, in a confusing world full of threats, they knew what would happen. And they themselves believed that they knew, again discounting inconvenient facts, indeed becoming upset if people pointed out their contradictions.

All this is a serious problem for people. Acting on the basis of mistaken theories could cause catastrophe: Ehrlich's suggestion that food-exporting countries stop exporting food to countries "doomed to fail" like Egypt and India would have created horrors where none happened. It is possible, Gardner emphasizes, to learn ways to think critically about the future, particularly by adopting the practice of radical doubt. George Soros did a good job predicting the world financial crisis, but in numerous interviews Soros has emphasized the fact that he looks not for proof that he's right, but rather for proof that he's wrong. (These critical thinking skills would be useful in domains apart from predicting the future, too, but that falls somewhat outside the scope of Future Babble.)

Engagingly written, very well-sourced, and well-argued, Future Babble is a book I'd recommend to anyone who's interested in what we think about the future and how we can do better.
Betty
547 reviews · 54 followers
November 19, 2010
One of the first things I've learned from this fascinating book is that the more you know, the less you know. You cannot base the future on the past. The reason for this is that there are too many variables; the future is not linear. Too many things can cause hiccups in the reasoning. I learned that the economic and political experts who make forecasts for the future are rarely right, which leaves me to wonder: if half of them predict something positive for the future, and the other half predict something negative, does that mean man would never progress? Nothing would ever happen? The section on randomness I thought was particularly interesting. I learned how the subconscious often overrides the conscious in making decisions, but the conscious eventually gets there; it's just slower than the subconscious. In other words, the idea that is your first automatic thought is probably the right one, as in hunches or intuition. I've found that in my own life: if I am writing a letter, a story, or a comment to the newspaper, and I think about it after it's written, I start to overthink it and eventually go back to my original (if I haven't lost it, because that overthinking often messes up my clarity).

Foxes and Hedgehogs are Gardner's names for the experts we place our faith in. The Hedgehogs are the popular ones (right or wrong), confident and probably entertaining; the Foxes are quieter, more careful and technical in their precision and declarations. Yet few predictions of the future based on the past and the present can come to pass. Hedged in wording that is not on a timeline or precise, but delivered with full confidence and great presentation, the Hedgehogs seem to do no wrong in their predictions, yet they are rarely right. When they are, because of their popularity, they are remembered only for their hits, and their misses disappear from the public memory. In Gardner's words, "Be simple, clear, and confident. Be extreme. Be a good storyteller. Think and talk like a hedgehog." The author has chosen a few experts from each type of thinking for his examples. In his findings, though, there appear to be too many examples of misses and too few hits, which is exactly his point. Surprisingly, they almost all think they were right. Somewhat like proofreading your own words, where you see what you thought you wrote, they too see their own predictions the way they think they presented them.

What I enjoyed most about this book is the section on experiments performed to learn how people are influenced in their decisions about what or whom to believe. The wide variety of these experiments will no doubt amaze the reader at how the mind can be manipulated, or simply change sides, based on what it perceives in the first examples. An unusual book for a mostly economic, environmental, and/or political evaluation of future predictions, but the second part of the title tells it all: Why Expert Predictions Fail - and Why We Believe Them Anyway. Based on no more information than what has been predicted by experts, I have decided, thanks to reading this book, I will no longer worry about the world in 2012.
Jonathan Lu
325 reviews · 20 followers
July 31, 2013
This is a book that could have been summed up in a 35-paragraph op-ed, let alone an entire book. The first half of the book is dedicated to the inaccuracy of past predictions about the future (in a highly "I told you so" tone), invoking examples of oil price predictions, superstition (Y2K, 2012), military might, and geopolitics. The second half of the book delves into the human psyche and why we are so attracted to future prediction rather than introspection.

A few scientific studies are cited that sum up how humans are more attuned to bad news (when was the last time you heard a celebrated prediction about future progress? More likely you remember Cassandra and the doomsday soothsayers) and lend credence to confidence over accuracy. As most politicians know well, the public is more apt to believe an air of certainty rather than boring accuracy, has a very short memory for those predictions that are incorrect, and a very long memory for those shots in the dark that come true. It's true, per the dictum, that a known shit is less stressful than an unknown paradise. Citing various examples of futurists who have been correct, the author dismisses their skill as "try enough times and you'll eventually get it right", which is not too far off, considering that even a stopped clock is right twice a day. Future predictions are so incredibly complex and non-linear that they become nearly impossible, yet those who realize the profit to be made by making such predictions are the ones who take advantage (usually in the media).

After indicting basically all types of future prediction, the author leaves little alternative conclusion other than the implied beauty of experiencing the unknown in life. While I agree with this message, I do not agree with his attacks upon George Friedman and STRATFOR predictions. While it is true that Friedman/STRATFOR's accuracy has been dubious at best, I find that this reads his publications in completely the wrong manner. Rather than taking Friedman's predictions as soothsaying, I take them more from the perspective of a historian explaining what mistakes we have made in the past and how, if we continue, a future scenario might unfold. Friedman's warnings likely will not come true, but they serve as a real-life history lesson that it would be remiss to forget.

In summary, this is an interesting book and a quick read for those looking to kill an afternoon. My takeaway was more of an "I told you so" not to believe all the Cassandras out there, without any suggestion for what it is that we should believe.
Alex Jones
244 reviews · 11 followers
February 21, 2017
This was really interesting and I'm glad I read it, but it did become a bit too repetitive. It was written before the shock, unpredicted results of Brexit and Trump, but it has a wealth of examples of 'experts' failing to make accurate predictions and the reasons why doing so is difficult. The book was interesting and compellingly argued, but it was lacking in consideration of when prediction is possible and instead repeated a few examples many, many times. Certainly interesting, but it could have been quite a bit shorter without a real loss of content.
Mat
82 reviews · 31 followers
February 21, 2012
In a longitudinal test, "experts" were found to have the same chance at predicting the future as a dart-throwing chimp. This book explains why - and why we believe them.

Here are a few standout quotes from it:

Shocking terrorist attack? Didn’t see it coming? Let’s imagine more shocking terrorist attacks. Economic disaster? Big surprise, wasn’t it? So let’s imagine more economic disasters.

Tell clients what they and all informed people believe to be true and they will be pleased. We all enjoy having our beliefs confirmed, after all. And it shows that we, too, are informed people. But dispute that belief and the same psychology works against you. You risk saying good-bye to your client and your reputation. Following the herd is usually the safer bet.

“Too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. But we’ve discovered that owning things and consuming things does not satisfy our longing for meaning. We’ve learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.” - Jimmy Carter

If we do not perceive ourselves to have at least a little control of our surroundings, we suffer stress, disease, and early death. Control is such a fundamental psychological need that doing without it can even be torture.

Knowing what will happen in the future is a form of control, even if we cannot change what will happen. When all is uncertain, nothing is predictable, and that is terrifying.

The alternative, after all, is to reject expert prediction as a source of certainty. And if you do that—and you can’t accept superstition, religion, or conspiracy theories—what are you left with? Nothing. And that’s frightening.

“Survival requires urgent attention to possible bad outcomes,” the review noted, “but it is less urgent with regard to good ones.” People whose brains gave priority to bad news were much less likely to be eaten by lions or die some other unpleasant and untimely death than those whose brains did not, and so “negativity bias” became a universal human trait.
Alina
113 reviews
June 10, 2011
This fascinating, extremely valuable book looks at forecasts from economists, historians, social scientists, biologists, and sundry other "experts," mostly from the beginning of the 20th century onward. It explores the track record of these forecasters (laughably bad), how they react when their expectations prove wrong (they spin and rationalize), and why we keep asking them to predict the future regardless (we hate uncertainty).

The author points out that when it comes to complex, chaotic phenomena like human societies, the future is inherently uncertain. Furthermore, quirks of psychology affect the way we evaluate probability, leading us to irrational thinking. This is true even for scientists and academics. The book includes an often amusing tour of predictions that were once accepted by most "smart people," but proved wildly inaccurate (for example, the Population Bomb scenarios of mass famine in the 1970s or 1980s). Various studies show how even the most rigorously trained and knowledgeable people can be influenced by unconscious factors and ideology. For example, when scientists conduct peer reviews to determine whether an article should be accepted for a journal, they are supposed to look only at whether the methodology is good, but they are swayed by whether they like the conclusion. This does not surprise me, as scientists are only human.

A large-scale, long-term study of expert predictions concluded that experts could be divided into two groups -- foxes and hedgehogs -- based on their thinking style. Though nobody did all that well, the most accurate projections came from "foxes," people inclined to take many factors into account, accept uncertainty, and recognize their own limits. The worst came from "hedgehogs," whose thinking revolves around One Big Idea. The best-known forecasters tend to be hedgehogs, since people gravitate toward experts with simple, firm answers. Thus, ironically, the more famous the pundit, the more likely they are to be wrong!
rabbitprincess
844 reviews
August 17, 2012
Dan Gardner is a columnist with the Ottawa Citizen and one of my favourites. He's skeptical, logical and writes well. All of these characteristics are in evidence in this book, which takes a cold hard look at the realm of predictions. Not the fortune-teller kind, but the kind made by experts and talking heads on the various current affairs shows: "What will the unemployment rate be next year?" "What do the climate models suggest our planet will look like two decades from now?" "Where's the price of oil headed?" Put simply, the thesis of this book is that the so-called experts are really no better at making predictions than a monkey throwing darts at a dart board. But if that's the case, why do we keep asking for predictions, and why do we keep thinking that THIS time the experts will be right?

Gardner explores this question in some depth, talking about how our brains evolved to process information from our environment and how what worked in the primitive world does not work quite so well in the more complex modern day. He also talks about two types of experts, which he calls "foxes" and "hedgehogs". Foxes assemble a great deal of information before they make their predictions, consider all sides of the issue, constantly question their assumptions, and are more willing to admit they are wrong when a prediction does not pan out. Hedgehogs, on the other hand, have a more narrow worldview and tend to stamp their predictions in the mould of whatever ideology they follow. They are also much less willing to admit error, which perversely makes them more popular. People prefer to listen to a confident but wrong expert over a more careful, cautious one.

Some parts felt a bit repetitive, but overall this book met my expectations. I would recommend it to news junkies in particular, especially those who enjoy picking apart media coverage or watching satire shows such as the Daily Show.