
Psychology of Intelligence Analysis

With intelligence now getting a front-row seat in governments around the world, this book is especially timely. Intelligence rains in, but without an understanding of its nature, it accumulates in puddles of obscurity. The problems, then, are how to obtain it, how to understand it, and how to sell it to one's bosses. This book deals with how to understand it. Three fundamental points are at the heart of this presentation about the cognitive challenges intelligence analysts face:
• The mind is poorly wired to deal effectively with both inherent uncertainty (the natural fog surrounding complex, indeterminate intelligence issues) and induced uncertainty (the man-made fog fabricated by denial and deception operations).
• Even increased awareness of cognitive and other unmotivated biases, such as the tendency to see information that confirms an already-held judgement more vividly than disconfirming information, does little by itself to help analysts deal effectively with uncertainty.
• Tools and techniques that gear the analyst's mind to apply higher levels of critical thinking can substantially improve analysis on complex issues on which information is incomplete, ambiguous, and often deliberately distorted. Key examples of such intellectual devices include techniques for structuring information, challenging assumptions, and exploring alternative interpretations.
This book was first issued by the CIA.

216 pages, Paperback

First published January 1, 1999

193 people are currently reading
3929 people want to read

About the author

Richards J. Heuer Jr.

10 books, 38 followers
Richards "Dick" J. Heuer, Jr. is a 45-year veteran of the CIA, best known for his work on the Analysis of Competing Hypotheses and his book, Psychology of Intelligence Analysis.

Ratings & Reviews


Community Reviews

5 stars: 476 (47%)
4 stars: 362 (35%)
3 stars: 147 (14%)
2 stars: 17 (1%)
1 star: 6 (<1%)
Displaying 1 - 30 of 103 reviews
Lisa Reads & Reviews
456 reviews, 129 followers
April 16, 2020
Everyone should suck it up and read this, and I don't want to hear about it being dry or boring. Not everything is meant to be entertaining. Learn to think better and we won't be as susceptible to con men and propaganda. Just do it.
Robert
302 reviews
August 15, 2023
A phenomenal resource on rationality and decision-making – perhaps the best in its genre. The title undersells the book. Despite coming out of the CIA, the book contains lessons that apply far beyond intelligence analysis; I think it is a necessary read for anyone who wants to do intelligent analysis. Furthermore, it is not just about psychology – it is about the philosophy, sociology, and practicality of intelligent analysis.

The core premise of the book is that people make analytical judgments by processing sensory information through their cognitive machinery, without understanding the weaknesses of either the sensory processes or their cognitive machinery. Heuer seeks to ameliorate this by giving a concise overview of our perception and memory systems, with an emphasis on their pitfalls in the context of analytical work. After all, we presumably did not evolve on the savannah to piece together the motives of foreign nation-states.

Building from there, Heuer gives a guided tour of the required competencies of an analyst (e.g., creativity and open-mindedness), why we are generally deficient in these areas, and practical tools for improvement. This involves some epistemological detours, of a similar flavour to Sowell's Knowledge and Decisions (which, I should say, I still haven't finished!). I particularly enjoyed the exploration of how exactly analytical judgments can be generated; for example, using historical analogy is philosophically different to drawing on theory, but both can be valid depending on the situation.

One of the highlights of the book is the Analysis of Competing Hypotheses (ACH) framework, a simple (but not simplistic) tool for deciding between various hypotheses, which has been deliberately designed to offset various cognitive biases. Speaking of cognitive biases, the survey herein is par excellence – they are grouped by category (perceiving evidence, judging cause and effect, estimating probabilities) and Heuer finds the perfect balance of psychological background and practical exposition. As one would expect from a handbook aimed at time-constrained decision-makers, the book is exceedingly well structured and crystal clear (making Thinking, Fast and Slow feel clumsy by comparison).

It’s no surprise that Psychology of Intelligence Analysis is highly recommended in trading/investing circles – I can’t find a concept in the book that isn’t relevant to the role, and its influence is clear in other great resources like Geopolitical Alpha (whose Constraints Framework is a modified version of ACH). Really, all one would need on top of this is a similar book about the philosophy and practicality of statistical modelling (a combination of Modelling Mindsets and Regression Modelling Strategies in conjunction could get you most of the way there, but I’m still on the lookout for the definitive text).

I guess after reading all these books on decision-making and rational thinking, I've realised it does just boil down to what the Greeks had chiselled into the temple at Delphi – ”Know Thyself”. Richards Heuer's book is a decisive step towards that goal!

My highlights here.
Ji
175 reviews, 51 followers
January 17, 2023
This could be the ultimate book for data scientists (of those I've read), since it is about analyzing information to form predictions of the future while avoiding the common and uncommon pitfalls that would negatively impact the results.

It took me many weeks to finish reading this book in depth, and I tried to take notes as I went. Part I is eye-opening; it gave me a new way to fundamentally understand the process of analysis. Part II is the core of the book, as informative as it is actionable. Part III is fun, but less surprising to me, since I already knew most, if not all, of the biases it covers.

Overall it's one of my greatest reads: as theoretical as it is practical, as metaphysical as factual, and as educational as fun.
☘Misericordia☘ ⚡ϟ⚡⛈⚡☁ ❇️❤❣
2,520 reviews, 19.2k followers
April 6, 2016
An in-depth analysis of neurological premises for data analysis and misanalysis.
Q:
PART I--OUR MENTAL MACHINERY
Chapter 1: Thinking About Thinking
Chapter 2: Perception: Why Can't We See What Is There to Be Seen?
Chapter 3: Memory: How Do We Remember What We Know?
PART II--TOOLS FOR THINKING
Chapter 4: Strategies for Analytical Judgment: Transcending the Limits of Incomplete Information
Chapter 5: Do You Really Need More Information?
Chapter 6: Keeping an Open Mind
Chapter 7: Structuring Analytical Problems
Chapter 8: Analysis of Competing Hypotheses
PART III--COGNITIVE BIASES
Chapter 9: What Are Cognitive Biases?
Chapter 10: Biases in Evaluation of Evidence
Chapter 11: Biases in Perception of Cause and Effect
Chapter 12: Biases in Estimating Probabilities
Chapter 13: Hindsight Biases in Evaluation of Intelligence Reporting
PART IV--CONCLUSIONS
Chapter 14: Improving Intelligence Analysis
Ci
960 reviews, 6 followers
November 20, 2013
This book summarizes the basic neuroscientific structure of memory and decision-making, with emphasis on the potential biases and blind spots created by cognitive deficiencies as well as sub-optimal mental models. Later parts also touch on how organizations can foster an environment in which intelligence analysts are encouraged to produce unbiased analysis, without undue internal or external pressure toward career-risk avoidance or group-think.

The writing style is succinct and largely consistent with mainstream academic research. Consider this Intelligence 101: useful for reflecting on one's habitual mental models at work, which often carry more bias and deficiency than we realize.

Gerrit G.
90 reviews, 4 followers
February 23, 2018
An interesting application of cognitive psychology and decision analysis to intelligence analysis. The book also illustrates how often biases and faulty information ruin analysis reports. The way these methods are employed and reflected upon is, I think, of interest even to analysts in other areas, such as business. You can find further references into various areas such as cognitive psychology, statistics, politics, and intelligence. Unfortunately, the use of Bayesian statistics is only mentioned, not really applied.
DeAnna Knippling
Author, 172 books, 278 followers
October 30, 2021
An excellent, short book on how intelligence analysts, like the rest of us, screw up their assessments of situations, plus some workarounds.

Everyone has bias, but not everyone is responsible for briefing the top levels of government with as little bias as possible. The author establishes that 1) accepting that more information won't help, 2) being exact with one's assumptions, 3) learning statistics, and 4) using solid numbers, not hindsight, to establish the efficacy of analysis is probably as good a set of workarounds as is possible with the human brain.

I wonder how this will get updated in the age of AI.

Recommended if you like psychology or "spy stuff."
143 reviews, 22 followers
September 9, 2018
Excellent book originally meant for CIA intelligence analysts but extremely useful for anyone perusing real-world information and making decisions in a complex, uncertain world - especially for investment purposes. There are many books out there on behavioural biases but this book is amongst the most clearly written, practical and applicable.
Don
364 reviews
June 29, 2024
This book takes work. Practicing the ideas in the book and implementing the processes will make you better, no matter your vocation.
C.
1,224 reviews, 1,023 followers
September 28, 2022
A useful resource for improving your intelligence analysis skills through better thinking and self-aware combating of cognitive biases. The book is a collection of articles originally used in the CIA Directorate of Intelligence.

I read this because I've seen it recommended for cyber threat intelligence analysts.

You can download the free PDF.

Notes
Foreword
… information and expertise are a necessary but not sufficient means of making intelligence analysis the special product that it needs to be. A comparable effort has to be devoted to the science of analysis. This effort has to start with a clear understanding of the inherent strengths and weaknesses of the primary analytic mechanism—the human mind—and the way it processes information.
Dick Heuer makes clear that the pitfalls the human mental process sets for analysts cannot be eliminated; they are part of us. What can be done is to train people how to look for and recognize these mental obstacles, and how to develop procedures designed to offset them.
Introduction
Intelligence analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves.
Contributors to quality of analysis: Sherman Kent, Robert "Bob" Gates, Douglas MacEachin, Richards "Dick" Heuer.

Don't reject the possibility of deception because you don't see evidence of it; you won't see evidence of properly-executed deception.

Perception: Why Can’t We See What Is There To Be Seen?
"Initial exposure to blurred or ambiguous stimuli interferes with accurate perception even after more and better information becomes available."

Memory: How Do We Remember What We Know?
"Hardening of the categories": If people don't have an appropriate category for something, they're unlikely to perceive it or be able to remember it later. If categories are drawn incorrectly, people are likely to perceive and remember things inaccurately.

Evidence is diagnostic when it influences an analyst's judgment on the relative likelihood of various hypotheses. If an item seems inconsistent with all hypotheses, it may have no diagnostic value. Without a complete set of hypotheses, it's impossible to evaluate the "diagnosticity" of the evidence.

A hypothesis can't be proved even by a large body of evidence consistent with it, because that same body of evidence may be consistent with other hypotheses. A hypothesis can be disproved by a single item of evidence that's incompatible with it.

Do You Really Need More Information?
Once an experienced analyst has the minimum info necessary to make an informed judgment, additional info generally doesn't improve the accuracy of estimates. However, additional info leads the analyst to become more confident in the judgment (to the point of overconfidence).

Keeping an Open Mind
Questioning Assumptions: see how sensitive the judgment is to changes in the major variables; try to disprove assumptions; get alternative interpretations from those who disagree with you; don't assume the other side thinks the same way you do (mirror-imaging).

Seeing Different Perspectives: imagine yourself in the future, explaining how the event could've happened; explain how your assumptions could be wrong; mentally put yourself in someone else's place; find a "devil's advocate" to critique your views.

Creative thinking techniques
• Deferred Judgment: generate all ideas first, then evaluate them
• Quantity Leads to Quality: quantity of ideas eventually leads to quality; 1st ideas are usually most common or usual
• No Self-Imposed Constraints: generate ideas without self-imposed constraints
• Cross-Fertilization of Ideas: combine ideas and interact with other analysts

Structuring Analytical Problems
Multiattribute Utility Analysis
1. List attributes you want to maximize
2. Quantify relative importance of each attribute, to add up to 100%
3. For each option you're considering, rate it on each attribute
4. Calculate which option best fits your preferences
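A minimal Python sketch of the four steps above, for illustration only; the attribute names, weights, and option scores are invented and not from the book.

```python
# Multiattribute utility sketch following the four steps above.
# All attribute names, weights, and option ratings are made-up examples.

attributes = {        # steps 1-2: attributes and their relative importance (sums to 1.0)
    "cost": 0.40,
    "reliability": 0.35,
    "speed": 0.25,
}

options = {           # step 3: rate each option on every attribute (0-10 scale here)
    "Option A": {"cost": 7, "reliability": 5, "speed": 9},
    "Option B": {"cost": 4, "reliability": 9, "speed": 6},
}

def utility(scores):
    # step 4: weighted sum of an option's ratings
    return sum(weight * scores[attr] for attr, weight in attributes.items())

for name, scores in options.items():
    print(f"{name}: {utility(scores):.2f}")
print("Best fit:", max(options, key=lambda name: utility(options[name])))
```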

Analysis of Competing Hypotheses
Analysis of competing hypotheses (ACH) requires an analyst to explicitly identify all the reasonable alternatives and have them compete against each other for the analyst’s favor, rather than evaluating their plausibility one at a time.
ACH steps
1. Identify the possible hypotheses to be considered. Use a group of analysts with different perspectives to brainstorm the possibilities.
2. Make a list of significant evidence and arguments for and against each hypothesis.
3. Prepare a matrix with hypotheses across the top and evidence down the side. Analyze the “diagnosticity” of the evidence and arguments; that is, identify which items are most helpful in judging the relative likelihood of the hypotheses.
4. Refine the matrix. Reconsider the hypotheses and delete evidence and arguments that have no diagnostic value.
5. Draw tentative conclusions about the relative likelihood of each hypothesis. Proceed by trying to disprove the hypotheses rather than prove them.
6. Analyze how sensitive your conclusion is to a few critical items of evidence. Consider the consequences for your analysis if that evidence were wrong, misleading, or subject to a different interpretation.
7. Report conclusions. Discuss the relative likelihood of all the hypotheses, not just the most likely one.
8. Identify milestones for future observation that may indicate events are taking a different course than expected.
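Below is a toy Python sketch of the matrix logic in steps 3-5, assuming a hypothetical set of hypotheses (H1-H3) and evidence items; it discards non-diagnostic evidence and ranks hypotheses by how much evidence is inconsistent with each, in line with the "seek to disprove" approach.

```python
# Toy ACH matrix: hypotheses across the top, evidence down the side.
# "C" = consistent with the hypothesis, "I" = inconsistent.
# The hypotheses and evidence items are hypothetical placeholders.

hypotheses = ["H1", "H2", "H3"]
matrix = {
    "E1": {"H1": "C", "H2": "C", "H3": "I"},
    "E2": {"H1": "C", "H2": "I", "H3": "I"},
    "E3": {"H1": "C", "H2": "C", "H3": "C"},  # consistent with everything: no diagnostic value
}

# Step 4: drop evidence that reads the same under every hypothesis.
diagnostic = {e: row for e, row in matrix.items() if len(set(row.values())) > 1}

# Step 5: rank hypotheses by how much diagnostic evidence argues *against* them.
against = {h: sum(row[h] == "I" for row in diagnostic.values()) for h in hypotheses}
for h, n in sorted(against.items(), key=lambda kv: kv[1]):
    print(f"{h}: {n} item(s) of inconsistent evidence")

# The hypothesis with the least evidence against it is only the tentative
# front-runner; step 6 still asks how sensitive that ranking is to key items.
```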

An unproven hypothesis has no evidence that it's correct. A disproved hypothesis has positive evidence that it's wrong.

When you're tempted to write, "There's no evidence that … ," ask yourself, "If this hypothesis is true, can I realistically expect to see evidence of it?"
This procedure leads you through a rational, systematic process that avoids some common analytical pitfalls. It increases the odds of getting the right answer, and it leaves an audit trail showing the evidence used in your analysis and how this evidence was interpreted. If others disagree with your judgment, the matrix can be used to highlight the precise area of disagreement. Subsequent discussion can then focus productively on the ultimate source of the differences.
What Are Cognitive Biases?
Cognitive biases are mental errors caused by subconscious mental procedures for processing info. They're not caused by emotional or intellectual predisposition toward a certain judgment, unlike cultural bias, organizational bias, or bias from one’s self-interest.

Biases in Evaluation of Evidence
The Vividness Criterion: Give little weight to anecdotes and personal case histories, unless they're known to be typical. Give them no weight if aggregate data based on a more valid sample is available.

Biases in Perception of Cause and Effect
People overestimate the extent to which other countries are pursuing a coherent, coordinated, rational plan, and thus also overestimate their own ability to predict future events in those nations. People also tend to assume that causes are similar to their effects, in the sense that important or large effects must have large causes.
When inferring the causes of behavior, too much weight is accorded to personal qualities and dispositions of the actor and not enough to situational determinants of the actor’s behavior. People also overestimate their own importance as both a cause and a target of the behavior of others. Finally, people often perceive relationships that do not in fact exist, because they do not have an intuitive understanding of the kinds and amount of information needed to prove a relationship.
Bias in Favor of Causal Explanations: Random events often look patterned.

Bias Favoring Perception of Centralized Direction: A country's inconsistent policies may be the result of weak leadership, vacillation, or bargaining among bureaucratic or political interests, rather than duplicity or Machiavellian maneuvers.

Similarity of Cause and Effect: Major effects may be the result of mistakes, accidents, or aberrant behavior of an individual, rather than major causes.

Internal vs. External Causes of Behavior
• Don't overestimate the effect of a person's or government's internal personality or disposition on their behavior, and don't underestimate the effect of their response to external situational constraints.
• When judging your own behavior, don't overestimate the effect of the situation you're responding to, and don't underestimate the effect of your own personality or disposition.

Overestimating Our Own Importance: Don't overestimate the likelihood that actions that hurt you were intentionally directed at you, and don't underestimate the likelihood that those actions were the unintended consequences of decisions not related to you.

Illusory Correlation
• To establish a causal or correlational relationship, you need a full 2 x 2 contingency table, with counts for all four cells (A and B, A and not B, not A and B, not A and not B), not just the cases where A and B occur together (see the sketch after this list).
• There's not enough data to say there's a relationship between deception and high-stakes situations.
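A small Python sketch of that 2 x 2 check, using invented counts for the deception example; the point is that the rate of deception with high stakes has to differ noticeably from the rate without high stakes before a relationship can be claimed.

```python
# 2 x 2 contingency sketch for "deception" vs. "high-stakes situation".
# The counts below are invented; the point is that all four cells are needed,
# not just the deception-AND-high-stakes cases people tend to remember.

counts = {
    ("deception", "high_stakes"): 68,
    ("deception", "low_stakes"): 32,
    ("no_deception", "high_stakes"): 35,
    ("no_deception", "low_stakes"): 17,
}

p_high = counts[("deception", "high_stakes")] / (
    counts[("deception", "high_stakes")] + counts[("no_deception", "high_stakes")]
)
p_low = counts[("deception", "low_stakes")] / (
    counts[("deception", "low_stakes")] + counts[("no_deception", "low_stakes")]
)

print(f"P(deception | high stakes) = {p_high:.2f}")  # ~0.66
print(f"P(deception | low stakes)  = {p_low:.2f}")   # ~0.65
# Similar conditional probabilities mean the apparent correlation is illusory;
# a real relationship requires these two rates to differ markedly.
```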

Biases in Estimating Probabilities
Anchoring: The final estimate lands close to the initial estimate. To combat it, consciously avoid using prior judgments as a starting point, or use formal statistical procedures.

Expression of Uncertainty: After vague expressions ("possible," "probable," "unlikely," "may," "could," etc.), put the estimated odds or percentage range in parentheses.
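A tiny illustration of that advice in Python; the word-to-percentage mapping below is an assumed example for demonstration, not a table from the book.

```python
# Attach an explicit probability range to a vague verbal qualifier.
# The ranges here are illustrative assumptions, not the book's values.

RANGES = {
    "almost certain": (86, 99),
    "probable": (63, 87),
    "chances about even": (40, 60),
    "unlikely": (12, 37),
    "remote": (1, 13),
}

def qualify(term, statement):
    low, high = RANGES[term]
    return f"It is {term} that {statement} ({low}-{high}%)."

print(qualify("probable", "the cease-fire will hold through the month"))
# -> It is probable that the cease-fire will hold through the month (63-87%).
```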

Assessing Probability of a Scenario: multiply the probabilities of each individual event.
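For instance (with made-up numbers, and treating the events as independent), a scenario built from four individually "probable" steps is itself improbable:

```python
from math import prod

# Four steps of a hypothetical scenario, each judged "probable" (p = 0.7).
# Their product, assuming independence, is the probability of the whole scenario.
step_probabilities = [0.7, 0.7, 0.7, 0.7]
print(f"Scenario probability: {prod(step_probabilities):.2f}")  # 0.24
```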

Hindsight Biases in Evaluation of Intelligence Reporting
Hindsight biases: Analysts normally overestimate the accuracy of their past judgments. Postmortems normally judge that events were more foreseeable than they were.

To overcome hindsight biases, remind yourself of the uncertainty prior to a situation by asking yourself, "If the opposite outcome had occurred, would I have been surprised? If this report had told me the opposite, would I have believed it? If the opposite outcome had occurred, would it have been predictable given the info available at the time?"

Improving Intelligence Analysis
Analytical process
1. Defining the problem: be sure to ask the right questions
2. Generating hypotheses: identify all plausible hypotheses, then reduce them to a workable number of reasonable hypotheses
3. Collecting information: collect info to evaluate all reasonable hypotheses
4. Evaluating hypotheses: look for evidence to disprove hypotheses; consider using ACH
5. Selecting the most likely hypothesis: choose the hypothesis with the least evidence against it; list other hypotheses and why they were rejected
6. Ongoing monitoring of new information: specify criteria that would require reevaluation of hypotheses
Faith
19 reviews, 3 followers
February 27, 2022
Worth a read, especially for those involved in Threat Intelligence analysis. Heuer addresses ways of improving intelligence analysis by providing an analysis of how humans think and how perception, bias and memory impact our thinking. He also looks at how we can improve analytical judgement using structured analytical techniques.

Short and sweet.
Wells Benjamin
11 reviews
November 30, 2023
Wonderful examination of the cognition that lies behind any analysis, necessary reading for anyone wanting to go into the field or that just wants to improve their analytical abilities. Heuer is a concise and knowledgeable writer worthy of praise.
78 reviews, 20 followers
April 2, 2021
The author is a former CIA intelligence analyst who observed the pitfalls of doing analysis and came up with some ideas to overcome them.

Analysis can be improved in a number of obvious ways: collecting more and better information, writing more concisely and clearly, asking better questions, or streamlining the analysis process. His book does not discuss any of those, instead addressing what was an underappreciated factor of analysis at the time of writing: how our mind's constraints and biases influence the outcome of our analysis. In recent years this has become a popular topic with books like "Thinking, Fast and Slow" or "Superforecasting", but those are light on practical advice. This book (and others from the intelligence community) is more oriented towards actionable advice.

The author begins by examining mental processes and how they put constraints on our ability to analyze complex and uncertain situations. These constraints create cognitive biases. He describes studies and research to give examples of how cognitive biases factor into analysis. Unfortunately, awareness of these cognitive biases does not by itself help us to mitigate them.

To overcome our limitations we need a process to guide our analysis. He emphasizes the need to come up with perspectives that are outside of our mental model of the world, region or situation.
- To bypass the constraints on mind and memory, we need to externalize complex pieces of analysis.
- Unless we are unfamiliar with the topic, gathering more data is not a good strategy. It increases our confidence in our prevailing hypothesis but adds no signal. It is more valuable to create a number of hypotheses first and let them guide our search for information. In a study with medical students evaluating patients, those who formed hypotheses and tested them along the way performed better than those who tried to gather as much information as possible beforehand.
- Start by generating hypotheses without judgment. It is difficult for people to see alternative perspectives, so consult others in this process. Every analyst has their own set of mental models through which they observe and analyze the world. Challenging our assumptions and beliefs through a process of evaluating alternative beliefs is his most important idea.
- After coming up with hypotheses, try to disprove them rather than looking for confirmatory evidence. The most likely hypothesis is that for which there is the least amount of disconfirming evidence.
- When communicating results, be clear about uncertainty, confidence and process of analysis. Vague language means different things to different people and they tend to fill it in with their assumptions. Analysts also need to be clear about what assumptions are being made. The only way of getting better is to have precise language and transparent analysis allowing for feedback on what was right and wrong.
- Identify milestones that should be tracked to evaluate the performance of the analysis. There are two particularly important types of new information that we need to react to: changes in variables that are used in our model of the situation; and information that tells us that our mental model is not right for the problem being analyzed.

There are many awesome insights and recommendations for organizations and analysts. I can see myself reading through this again at a later time to brush up on ways to improve.

I strongly believe that analysis and decision-making skills will become one of the most sought after skills in the coming years. Many fields are starting to measure decision-making performance but there are no general frameworks for best practice. Great ideas and insights are spread across a wide range of books and fields (intelligence, investing, medicine, science, history & law) that still need to be incorporated into an overarching framework. Heuer's insights into the process of analysis align with what I have learned so far, and help me to develop my framework further. I found his process of analysis with an emphasis on alternative perspectives very compelling and will try to incorporate them into my process going forward.
46 reviews, 1 follower
March 7, 2017
Easy to read and very thought provoking. Lots of the behavioural biases discussed are well understood but few books put them into a practical framework like this.
Sanjay Banerjee
525 reviews, 12 followers
November 22, 2024
I picked this book up because the author of “Making of Kim Jong Un” (herself a former CIA analyst) mentioned that it was a reference handbook for all CIA analysts, who are required to provide forecasts of the future from information that is often ambiguous, inadequate, and disparate, and whose judgement is affected by cognitive biases. The title of this book explains the subject, and it is an easy book to read with plenty of food for thought even for those who are not analysts with intelligence agencies.
Peter
214 reviews, 22 followers
December 28, 2018
Originally published as a series of articles from 1978-86, this book feels extremely contemporary, and makes you wonder why all of these pop science books still seem so surprising!

This book was targeted at CIA intel analysts, but has fairly wide applicability to anyone who is trying to figure out how the world works, and especially for those with the humility to appreciate the limits of their ability to do so. It's a quick read, probably one of the most consumable and practical handbooks about cognitive biases and our natural weakness for statistics.

One of my favorite concepts was the idea of a model as a mechanism for learning: codifying a model that disagrees with your intuition often means that your intuition is pricing some variable you haven't consciously thought about. Thus, model development is framed as a problem-solving technique, rather than a technique to guarantee a solution!
73 reviews, 7 followers
June 14, 2020
5.0/5.0

Psychology of Intelligence Analysis is one of those rare books which over-delivers what its title suggests. A more appropriate title would have been Psychology of Analysis, because so many of the topics discussed in this book transcend the field of national intelligence. IMHO, chapters of this book should be required reading for all freshmen in all colleges. It is an excellent resource to help us think about how we think.

I got introduced to this book from a reference in another much more recent but much more stupid book (Becoming Kim Jong Un) written by another CIA officer. Psychology of Intelligence Analysis is concise, well-researched, well-written, well-referenced, and convincing. I will not do it justice by attempting to summarize it. But I have learned more per word count from this book than most (perhaps any) other books I’ve read.
bumbu
28 reviews, 2 followers
April 27, 2019
Overall interesting but very hard/boring to read.
Even though the book is focused on intelligence analysis, one can adapt the approaches it describes to other domains. Because the book focuses a lot on how humans perceive information and build their own versions of the truth, it can also be used to better understand the judgements made by ourselves and those around us, and could probably even be used to persuade others of certain "truths" by applying exactly the same techniques.
The last chapter summarises the book pretty nicely, so by reading just that, one can grasp its main ideas.
Tom
385 reviews, 33 followers
November 13, 2010
An excellent discussion of how mental models influence individual analyses, along with recommendations for overcoming their limits and reducing their influence. It is well supported by examples and summaries of experiments from many different fields.
46 reviews, 3 followers
July 22, 2019
This book has been translated and published by the Ministry of Foreign Affairs under the title «روانشناسی تحلیل اطلاعات» (Psychology of Intelligence Analysis). It is an excellent book on cognitive errors in intelligence analysis and how to deal with them. It's a book you should take notes from in order to understand it better.
Mike
Author, 8 books, 92 followers
October 19, 2011
This is a very good book about the difficulties associated with accurate intelligence gathering and analysis. The book was a textbook for training CIA analysts.
Henry
79 reviews, 5 followers
February 21, 2012
This is a good tool for the beginning intelligence professional regardless of speciality. It is also good for more experienced professionals to read as a refresher.
Samantha
23 reviews
June 4, 2015
A long, wordy, yet necessary read for anyone who wishes to go any further into the study of "Intelligence analysis".
Todd
145 reviews, 8 followers
Currently reading
July 20, 2009
Thinking about thinking.
156 reviews, 12 followers
January 11, 2021
I was recommended this book by a coworker, who told me it was relevant to our job as investment analysts. He argued you could substitute the word Investment for Intelligence anywhere in the book and the information would still be valid. I agree.

This is an insightful book, worth reading from cover to cover.

However, it bears saying that much of the content of this book has been covered elsewhere in the thirty or so years since it originally came out, and in what would appear to be fifty or so years since the author was doing the research.

Heuer says one thing that I've never seen anywhere else and it's quite valuable: Structured Analysis (Chapter 8). For much of the book Heuer builds up to this chapter (before) or explains pitfalls you will encounter when putting it into place (after).

So what's Structured Analysis?

The basic observation: When you go to prove something, you develop a hypothesis and then gather evidence to support it. As a result, you tend to prove your hypotheses and are surprised when they turn out to be false.

'But I found all this evidence to support it?!' you can almost hear the defensive analyst proclaiming.


The problem: It's easy to gather evidence in support of many credible hypotheses. Even extremely unlikely results can be supported by numerous available facts. If you have one core hypothesis and are looking to support it, you probably will! And as a result, your error rate is too high if you're doing it this way.


The solution: 1) Lay out several competing credible hypotheses; 2) develop factual areas to explore; 3) determine whether these areas can *dis*prove credible hypotheses, rather than proving them; 4) do enough research to *dis*prove most of the hypotheses you have created; 5) the hypothesis left with the least evidence against it is your best guess.

This makes sense because prediction isn't like science. In science, you control your environment and run tests until you isolate the relationship between variables and responses. In social prediction, whether investing or intelligence, you are trying to ascertain the likeliest outcome from a cloud of probabilities. This calls for humility, which we all know, but Heuer tells us what steps are required to put humility into action.

Four stars!


One final note: dear publisher, this book could be more readable than it is. Contact me if there is an opportunity for freelance editing.
Craig Martin
131 reviews, 3 followers
February 25, 2024
This is a fascinating book which serves lots of disciplines. The author comes from a US government intelligence background but covers issues relating to poor decision-making in our daily lives and the impact of cognitive biases.

We tend to think of the human mind as the most significant object in the universe (perhaps the most complex object). The human mind is, however, limited in many ways. Whether through the limits of its initial design or a path of evolution, it can become stuck in our modern daily life when dealing with ambiguity and fluid circumstances.

Heuer provides some tools for extracting the salient information from a situation for establishing and choosing among competing hypotheses. He covers ground familiar to Danny Kahneman and Amos Tversky fans, including anchoring, availability bias and base rate fallacy, but from an efficient perspective. His book won't be on the enticing stack at airport bookshops, but it deserves to be read.

He encourages the reader to explore as many ideas as possible and to keep an open mind when considering 'intelligence and information'. He says: ‘Minds are like parachutes - they only function when they are open.’

The book takes some effort; it isn't silky prose, but the tools and techniques illuminate. Some examples the book gives relate to significant decisions made by the US government based on poor reasoning (or, allowing for hindsight bias, perhaps based on cognitive shortcuts). It raises but doesn't answer whether we can overcome the cognitive biases that impact our decision-making. Even though we know about cognitive biases, they seem to operate at the subconscious level. For governments, the decisions made under these biases can have historical consequences, leading to wars of the cold and warm types. The consequences are often far less far-reaching for each of us in our daily lives. With his book, however, the process of decision-making, weighing evidence, and 'intelligence' can hopefully be improved.

I gave it four stars.
Chris Purdy
2 reviews, 1 follower
July 16, 2025
Honestly, even though this is intended for members of the Intelligence Community, I would gladly recommend this to anybody involved in any kind of analytical profession, period.

At the end of the day, this book is a primer on cognitive psychology and how we, as humans, are inclined to make judgments and decisions in the absence of complete information. How timely!

Even though the original material is a few decades old, the insights are just as relevant as ever. There have certainly been developments in the field of Cognitive Science since then, but the findings and case studies referenced here are still valid and consistent with contemporary thought: namely, that we are not nearly as good at objective analysis as we think we are!

This is especially true for subject-matter experts!

----------

Even though the scientific *depth* of content in this text isn't as strong as what you'd find in academic publications, the *breadth* and *scope* of its discussion more than justifies its existence. In this sense, it completely succeeds at its stated intent: to serve as a handbook for practitioners of intelligence analysis. We are given a broad and widely-applicable set of descriptions, remedies, and suggestions that I really do think would benefit analysts everywhere. (I wish I had something like this when I was in grad school...)

---As an aside, I deeply appreciate the breadth and diversity of citations strewn within the text. There is so much interesting follow-up material that one can rabbit-hole into if they find themselves hankering for more detailed characterizations of cognitive biases, historical case studies, etc.

I will definitely be adding this book to my de facto list of recommendations for those invested in cognitive psychology, metacognition, or heck, even Philosophy of Science.
