
Psychology of Intelligence Analysis

This volume pulls together and republishes, with some editing, updating, and additions, articles written during 1978-86 for internal use within the CIA Directorate of Intelligence. The information is relatively timeless and still relevant to the never-ending quest for better analysis. The articles are based on a review of the cognitive psychology literature on how people process incomplete and ambiguous information to make judgments. Richards Heuer has selected the experiments and findings that seem most relevant to intelligence analysis and most in need of communication to intelligence analysts. He then translates the technical reports into language that intelligence analysts can understand and interprets the relevance of these findings to the problems intelligence analysts face.

210 pages, Paperback

First published January 1, 1999

207 people are currently reading
4479 people want to read

About the author

Richards J. Heuer Jr.

10 books · 39 followers
Richards "Dick" J. Heuer, Jr. is a CIA veteran of 45 years, best known for his work on the analysis of competing hypotheses and his book, Psychology of Intelligence Analysis.

Ratings & Reviews

Community Reviews

5 stars
517 (47%)
4 stars
398 (36%)
3 stars
158 (14%)
2 stars
18 (1%)
1 star
7 (<1%)
Displaying 1 - 30 of 115 reviews
Lisa Reads & Reviews
464 reviews · 130 followers
April 16, 2020
Everyone should suck it up and read this, and I don't want to hear about it being dry or boring. Not everything is meant to be entertaining. If we learn to think better, we won't be as susceptible to con men and propaganda. Just do it.
Robert
302 reviews
August 15, 2023
A phenomenal resource on rationality and decision-making – perhaps the best in its genre. The title undersells the book. Despite coming out of the CIA, the book contains lessons that apply far beyond intelligence analysis; I think it is a necessary read for anyone who wants to do intelligent analysis. Furthermore, it is not just about psychology – it is about the philosophy, sociology, and practicality of intelligent analysis.

The core premise of the book is that people make analytical judgments by processing sensory information through their cognitive machinery, without understanding the weaknesses of either the sensory processes or their cognitive machinery. Heuer seeks to ameliorate this by giving a concise overview of our perception and memory systems, with an emphasis on their pitfalls in the context of analytical work. After all, we presumably did not evolve on the savannah to piece together the motives of foreign nation-states.

Building from there, Heuer gives a guided tour of the required competencies of an analyst (e.g., creativity and open-mindedness), why we are generally deficient in these areas, and practical tools for improvement. This involves some epistemological detours, of a similar flavour to Sowell’s Knowledge and Decisions (which, I should say, I still haven’t finished!). I particularly enjoyed the exploration of how exactly analytical judgments can be generated: for example, using historical analogy is philosophically different from drawing on theory, but both can be valid depending on the situation.

One of the highlights of the book is the Analysis of Competing Hypotheses (ACH) framework, a simple (but not simplistic) tool for deciding between various hypotheses, which has been deliberately designed to offset various cognitive biases. Speaking of cognitive biases, the survey herein is par excellence – they are grouped by category (perceiving evidence, judging cause and effect, estimating probabilities) and Heuer finds the perfect balance of psychological background and practical exposition. As one would expect from a handbook aimed at time-constrained decision-makers, the book is exceedingly well structured and crystal clear (making Thinking, Fast and Slow feel clumsy by comparison).

It’s no surprise that Psychology of Intelligence Analysis is highly recommended in trading/investing circles – I can’t find a concept in the book that isn’t relevant to the role, and its influence is clear in other great resources like Geopolitical Alpha (whose Constraints Framework is a modified version of ACH). Really, all one would need on top of this is a similar book about the philosophy and practicality of statistical modelling (a combination of Modelling Mindsets and Regression Modelling Strategies in conjunction could get you most of the way there, but I’m still on the lookout for the definitive text).

I guess after reading all these books on decision-making and rational thinking, I’ve realised it does just boil down to what the Greeks had chiselled into the temple at Delphi – ”Know Thyself”. Richards Heuer’s book is a decisive step towards that goal!

My highlights here.
Ji
175 reviews · 52 followers
January 17, 2023
This could be the ultimate book for data scientists (that I've read), since it is about analyzing information to form predictions of the future while avoiding common and uncommon pitfalls that would negatively impact the results.

It took me many weeks to finish reading this book in depth, taking notes as I went. Part I is eye-opening: it gave me a new way to fundamentally understand the process of analysis. Part II is the core of the book; it's as informational as it is actionable. Part III is fun, but less surprising to me, as I already knew most, if not all, of those biases well.

Overall it's one of my greatest reads. It's as theoretical as practical, as metaphysical as factual, and as educational as fun.
Ali
472 reviews
April 19, 2026
I picked this up in the heat of debates around AI in the kill chain (Anthropic bailing out of the Maven Project). The content is older than I am and goes way back before AI or behavioral economics books, but it lays out the same basics: processing incomplete (mis/dis)information, decision-making under uncertainty, analysis of alternatives (or competing hypotheses), accounting for cognitive biases, and of course briefing up in a concise format that factors in all of the above. Heuer's analysis addresses the same DSS or decision-science issues that we face in the age of AI, with questions around the reliability of LLMs, the need for a human-in-the-loop, the weights of your neural networks, mixture of experts... The target audience of this text is the intelligence community (it is an old CIA textbook), but the content works in a much wider context, making it a broadly readable clear-thinking book.
☘Misericordia☘ ⚡ϟ⚡⛈⚡☁ ❇️❤❣
2,584 reviews · 19.2k followers
April 6, 2016
An in-depth analysis of neurological premises for data analysis and misanalysis.
Q:
PART I--OUR MENTAL MACHINERY
Chapter 1: Thinking About Thinking
Chapter 2: Perception: Why Can't We See What Is There to Be Seen?
Chapter 3: Memory: How Do We Remember What We Know?
PART II--TOOLS FOR THINKING
Chapter 4: Strategies for Analytical Judgment: Transcending the Limits of Incomplete Information
Chapter 5: Do You Really Need More Information?
Chapter 6: Keeping an Open Mind
Chapter 7: Structuring Analytical Problems
Chapter 8: Analysis of Competing Hypotheses
PART III--COGNITIVE BIASES
Chapter 9: What Are Cognitive Biases?
Chapter 10: Biases in Evaluation of Evidence
Chapter 11: Biases in Perception of Cause and Effect
Chapter 12: Biases in Estimating Probabilities
Chapter 13: Hindsight Biases in Evaluation of Intelligence Reporting
PART IV--CONCLUSIONS
Chapter 14: Improving Intelligence Analysis
Ci
960 reviews · 6 followers
November 20, 2013
This book summarizes the basic neuroscientific structure of memory and decision-making, with emphasis on the potential biases and blind spots created by cognitive deficiencies as well as sub-optimal mental models. Later parts also touch on the organizational structures needed to foster an environment where intelligence analysts are encouraged to produce unbiased analysis, without undue internal or external pressure toward career risk avoidance or group-think.

The writing style is succinct and largely consistent with mainstream academic research. Consider this Intelligence 101, useful for reflecting on one's habitual mental models at work, which often carry more bias and deficiency than we are aware of.

DeAnna Knippling
Author · 176 books · 284 followers
October 30, 2021
An excellent, short book on how intelligence analysts, like the rest of us, screw up their assessments of situations, plus some workarounds.

Everyone has bias, but not everyone is responsible for briefing the top levels of government with as little bias as possible. The author argues that 1) recognizing that more information won't help, 2) being exact with one's assumptions, 3) learning statistics, and 4) using solid numbers, not hindsight, to establish the efficacy of analysis are probably as good a set of workarounds as is possible with the human brain.

I wonder how this will get updated in the age of AI.

Recommended if you like psychology or "spy stuff."
Gerrit G.
90 reviews · 4 followers
February 23, 2018
An interesting application of cognitive psychology and decision analysis to intelligence analysis. The book also illustrates how often biases and faulty information ruin analysis reports. The way these methods are employed and reflected upon is - I think - of interest even for analysts in other areas, such as business. You can find further references to various areas such as cognitive psychology, statistics, politics, and intelligence. Unfortunately, the use of Bayesian statistics is only mentioned, not really applied.
144 reviews · 22 followers
September 9, 2018
Excellent book originally meant for CIA intelligence analysts but extremely useful for anyone perusing real-world information and making decisions in a complex, uncertain world - especially for investment purposes. There are many books out there on behavioural biases but this book is amongst the most clearly written, practical and applicable.
Chad
1,293 reviews · 1,045 followers
September 28, 2022
A useful resource for improving your intelligence analysis skills through better thinking and self-aware combating of cognitive biases. The book is a collection of articles originally used in the CIA Directorate of Intelligence.

I read this because I've seen it recommended for cyber threat intelligence analysts.

You can download the free PDF.

Notes
Foreword
… information and expertise are a necessary but not sufficient means of making intelligence analysis the special product that it needs to be. A comparable effort has to be devoted to the science of analysis. This effort has to start with a clear understanding of the inherent strengths and weaknesses of the primary analytic mechanism—the human mind—and the way it processes information.
Dick Heuer makes clear that the pitfalls the human mental process sets for analysts cannot be eliminated; they are part of us. What can be done is to train people how to look for and recognize these mental obstacles, and how to develop procedures designed to offset them.
Introduction
Intelligence analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves.
Contributors to quality of analysis: Sherman Kent, Robert "Bob" Gates, Douglas MacEachin, Richards "Dick" Heuer.

Don't reject the possibility of deception because you don't see evidence of it; you won't see evidence of properly executed deception.

Perception: Why Can’t We See What Is There To Be Seen?
"Initial exposure to blurred or ambiguous stimuli interferes with accurate perception even after more and better information becomes available."

Memory: How Do We Remember What We Know?
"Hardening of the categories": If people don't have an appropriate category for something, they're unlikely to perceive it or be able to remember it later. If categories are drawn incorrectly, people are likely to perceive and remember things inaccurately.

Evidence is diagnostic when it influences an analyst's judgment on the relative likelihood of various hypotheses. If an item seems inconsistent with all hypotheses, it may have no diagnostic value. Without a complete set of hypotheses, it's impossible to evaluate the "diagnosticity" of the evidence.

A hypothesis can't be proved even by a large body of evidence consistent with it, because that same body of evidence may be consistent with other hypotheses. A hypothesis can be disproved by a single item of evidence that's incompatible with it.

Do You Really Need More Information?
Once an experienced analyst has the minimum info necessary to make an informed judgment, additional info generally doesn't improve the accuracy of estimates. However, additional info leads the analyst to become more confident in the judgment (to the point of overconfidence).

Keeping an Open Mind
Questioning Assumptions: see how sensitive the judgment is to changes in the major variables; try to disprove assumptions; get alternative interpretations from those who disagree with you; don't assume the other side thinks the same way you do (mirror-imaging).

Seeing Different Perspectives: imagine yourself in the future, explaining how the event could've happened; explain how your assumptions could be wrong; mentally put yourself in someone else's place; find a "devil's advocate" to critique your views.

Creative thinking techniques
• Deferred Judgment: generate all ideas first, then evaluate them
• Quantity Leads to Quality: quantity of ideas eventually leads to quality; 1st ideas are usually most common or usual
• No Self-Imposed Constraints: generate ideas without self-imposed constraints
• Cross-Fertilization of Ideas: combine ideas and interact with other analysts

Structuring Analytical Problems
Multiattribute Utility Analysis
1. List attributes you want to maximize
2. Quantify relative importance of each attribute, to add up to 100%
3. For each option you're considering, rate it on each attribute
4. Calculate which option best fits your preferences
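The four steps can be sketched in a few lines of Python; the attribute names, weights, and candidate options below are hypothetical examples, not from the book:

```python
# Minimal sketch of multiattribute utility analysis.
# Step 2: weights must sum to 1.0 (i.e., 100%).
# Step 3: each option is rated on each attribute (0-10 scale here).
# All attribute names, weights, and options are hypothetical.

weights = {"cost": 0.5, "speed": 0.3, "reliability": 0.2}
assert abs(sum(weights.values()) - 1.0) < 1e-9

options = {
    "option_a": {"cost": 7, "speed": 4, "reliability": 9},
    "option_b": {"cost": 5, "speed": 8, "reliability": 6},
}

def utility(ratings, weights):
    """Step 4: weighted sum of an option's attribute ratings."""
    return sum(weights[attr] * ratings[attr] for attr in weights)

scores = {name: utility(ratings, weights) for name, ratings in options.items()}
best = max(scores, key=scores.get)   # the option that best fits the preferences
```

The method's value is less in the arithmetic than in forcing the attributes and their relative importance to be made explicit before an option is chosen.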

Analysis of Competing Hypotheses
Analysis of competing hypotheses (ACH) requires an analyst to explicitly identify all the reasonable alternatives and have them compete against each other for the analyst’s favor, rather than evaluating their plausibility one at a time.
ACH steps
1. Identify the possible hypotheses to be considered. Use a group of analysts with different perspectives to brainstorm the possibilities.
2. Make a list of significant evidence and arguments for and against each hypothesis.
3. Prepare a matrix with hypotheses across the top and evidence down the side. Analyze the “diagnosticity” of the evidence and arguments— that is, identify which items are most helpful in judging the relative likelihood of the hypotheses.
4. Refine the matrix. Reconsider the hypotheses and delete evidence and arguments that have no diagnostic value.
5. Draw tentative conclusions about the relative likelihood of each hypothesis. Proceed by trying to disprove the hypotheses rather than prove them.
6. Analyze how sensitive your conclusion is to a few critical items of evidence. Consider the consequences for your analysis if that evidence were wrong, misleading, or subject to a different interpretation.
7. Report conclusions. Discuss the relative likelihood of all the hypotheses, not just the most likely one.
8. Identify milestones for future observation that may indicate events are taking a different course than expected.
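The matrix mechanics of steps 3-5 can be sketched as follows; the hypothesis and evidence labels are placeholders, and the consistency marks are invented for illustration:

```python
# Sketch of an ACH matrix: hypotheses across the top, evidence down the
# side, each cell marked consistent ("+") or inconsistent ("-") with
# that hypothesis. Labels and marks are hypothetical.

hypotheses = ["H1", "H2", "H3"]
matrix = {
    # evidence item -> mark against each hypothesis, in `hypotheses` order
    "E1": ["+", "+", "+"],   # consistent with everything: no diagnostic value
    "E2": ["+", "-", "-"],
    "E3": ["-", "+", "-"],
    "E4": ["-", "+", "+"],
}

# Step 4: drop evidence with no diagnostic value (same mark everywhere).
diagnostic = {e: marks for e, marks in matrix.items() if len(set(marks)) > 1}

# Step 5: rank hypotheses by how much evidence argues *against* them;
# the working conclusion is the least-disconfirmed hypothesis.
inconsistencies = {
    h: sum(marks[i] == "-" for marks in diagnostic.values())
    for i, h in enumerate(hypotheses)
}
least_disconfirmed = min(inconsistencies, key=inconsistencies.get)
```

Note that the winner is chosen by the least disconfirming evidence, not the most confirming evidence, which is exactly the disprove-rather-than-prove stance of step 5.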

An unproven hypothesis has no evidence that it's correct. A disproved hypothesis has positive evidence that it's wrong.

When you're tempted to write, "There's no evidence that … ," ask yourself, "If this hypothesis is true, can I realistically expect to see evidence of it?"
This procedure leads you through a rational, systematic process that avoids some common analytical pitfalls. It increases the odds of getting the right answer, and it leaves an audit trail showing the evidence used in your analysis and how this evidence was interpreted. If others disagree with your judgment, the matrix can be used to highlight the precise area of disagreement. Subsequent discussion can then focus productively on the ultimate source of the differences.
What Are Cognitive Biases?
Cognitive biases are mental errors caused by subconscious mental procedures for processing info. They're not caused by emotional or intellectual predisposition toward a certain judgment, unlike cultural bias, organizational bias, or bias from one’s self-interest.

Biases in Evaluation of Evidence
The Vividness Criterion: Give little weight to anecdotes and personal case histories, unless they're known to be typical. Give them no weight if aggregate data based on a more valid sample is available.

Biases in Perception of Cause and Effect
People overestimate the extent to which other countries are pursuing a coherent, coordinated, rational plan, and thus also overestimate their own ability to predict future events in those nations. People also tend to assume that causes are similar to their effects, in the sense that important or large effects must have large causes.
When inferring the causes of behavior, too much weight is accorded to personal qualities and dispositions of the actor and not enough to situational determinants of the actor’s behavior. People also overestimate their own importance as both a cause and a target of the behavior of others. Finally, people often perceive relationships that do not in fact exist, because they do not have an intuitive understanding of the kinds and amount of information needed to prove a relationship.
Bias in Favor of Causal Explanations: Random events often look patterned.

Bias Favoring Perception of Centralized Direction: A country's inconsistent policies may be the result of weak leadership, vacillation, or bargaining among bureaucratic or political interests, rather than duplicity or Machiavellian maneuvers.

Similarity of Cause and Effect: Major effects may be the result of mistakes, accidents, or aberrant behavior of an individual, rather than major causes.

Internal vs. External Causes of Behavior
• Don't overestimate the effect of a person's or government's internal personality or disposition on their behavior, and don't underestimate the effect of their response to external situational constraints.
• Don't overestimate the effect of your response to your situation on your behavior, and don't underestimate the effect of your personality.

Overestimating Our Own Importance: Don't overestimate the likelihood that actions that hurt you were intentionally directed at you, and don't underestimate the likelihood that those actions were the unintended consequences of decisions not related to you.

Illusory Correlation
• To test a claimed causal relationship, you must build a 2 x 2 contingency table recording the frequencies of all four combinations: A and B, A and not B, not A and B, not A and not B.
• There's not enough data to say there's a relationship between deception and high-stakes situations.
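The point can be made concrete with a small sketch; the counts below are invented for illustration:

```python
# Illusory correlation: judging a relationship from the "A and B" cell
# alone is misleading. All four cells of the 2 x 2 table are needed.
# Counts are hypothetical.

table = {
    ("A", "B"): 68,          # e.g. high stakes, deception present
    ("A", "not B"): 32,      # high stakes, no deception
    ("not A", "B"): 20,      # low stakes, deception present
    ("not A", "not B"): 10,  # low stakes, no deception
}

def rate(table, row):
    """P(B | row): share of cases in `row` where B occurred."""
    b, not_b = table[(row, "B")], table[(row, "not B")]
    return b / (b + not_b)

p_b_given_a = rate(table, "A")          # 0.68
p_b_given_not_a = rate(table, "not A")  # ~0.67

# The two rates are nearly equal, so the 68 "A and B" cases alone were
# misleading: this data shows no real relationship between A and B.
```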

Biases in Estimating Probabilities
Anchoring: The final estimate lands close to the initial estimate. To combat it, consciously avoid using prior judgments as a starting point, or use formal statistical procedures.

Expression of Uncertainty: After vague expressions ("possible," "probable," "unlikely," "may," "could," etc.), put the estimated odds or percentage range in parentheses.
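One way to operationalize this advice is to attach an explicit numeric range to each vague term; the word-to-odds ranges below are illustrative, not Heuer's:

```python
# Sketch: append an explicit probability range to a vague estimative
# term, so readers can't fill in their own interpretation.
# The ranges in ODDS are hypothetical, chosen only for illustration.

ODDS = {
    "almost certain": (0.90, 0.99),
    "probable": (0.60, 0.90),
    "possible": (0.25, 0.60),
    "unlikely": (0.05, 0.25),
}

def hedge(statement, term):
    """Render a statement with its vague term made numerically explicit."""
    lo, hi = ODDS[term]
    return f"{statement} is {term} ({lo:.0%}-{hi:.0%})"

line = hedge("An attack this month", "unlikely")
# -> "An attack this month is unlikely (5%-25%)"
```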

Assessing Probability of a Scenario: multiply the probabilities of each individual event.
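A quick sketch of why this matters: a scenario built from individually probable steps can still be improbable overall (assuming the step probabilities are independent, or already conditioned on the preceding steps):

```python
# A scenario's probability is the product of its event probabilities.
# Four "probable" steps of 0.8 each compound to a scenario that is
# less likely than not, even though every step looked likely.

from math import prod

steps = [0.8, 0.8, 0.8, 0.8]  # hypothetical per-event probabilities
scenario_p = prod(steps)       # ~0.41
```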

Hindsight Biases in Evaluation of Intelligence Reporting
Hindsight biases: Analysts normally overestimate the accuracy of their past judgments. Postmortems normally judge that events were more foreseeable than they were.

To overcome hindsight biases, remind yourself of the uncertainty prior to a situation by asking yourself, "If the opposite outcome had occurred, would I have been surprised? If this report had told me the opposite, would I have believed it? If the opposite outcome had occurred, would it have been predictable given the info available at the time?"

Improving Intelligence Analysis
Analytical process
1. Defining the problem: be sure to ask the right questions
2. Generating hypotheses: identify all plausible hypotheses, then reduce them to a workable number of reasonable hypotheses
3. Collecting information: collect info to evaluate all reasonable hypotheses
4. Evaluating hypotheses: look for evidence to disprove hypotheses; consider using ACH
5. Selecting the most likely hypothesis: choose the hypothesis with the least evidence against it; list other hypotheses and why they were rejected
6. Ongoing monitoring of new information: specify criteria that would require reevaluation of hypotheses
Faith
19 reviews · 3 followers
February 27, 2022
Worth a read, especially for those involved in threat intelligence analysis. Heuer addresses ways of improving intelligence analysis by providing an analysis of how humans think and how perception, bias, and memory impact our thinking. He also looks at how we can improve analytical judgement using structured analytical techniques.

Short and sweet.
Arthur Auskern
18 reviews · 1 follower
April 24, 2026
In almost all of the "how not to do it" examples, the problem was not the intelligence community's inability to assess the likelihood of alternative hypotheses, but a culture, imposed by the political leadership, in which an unbiased assessment is harmful to one's career if it displeases one's superiors; yet this question is never even raised in the book.
And there is a convincing hypothesis as to why.
Wells Benjamin
13 reviews
November 30, 2023
Wonderful examination of the cognition that lies behind any analysis, necessary reading for anyone wanting to go into the field or that just wants to improve their analytical abilities. Heuer is a concise and knowledgeable writer worthy of praise.
78 reviews · 21 followers
April 2, 2021
The author is a former CIA intelligence analyst who observed the pitfalls of doing analysis and came up with some ideas to overcome them.

Analysis can be improved in a number of obvious ways: collecting more and better information, writing more concisely and clearly, asking better questions, or streamlining the analysis process. His book does not discuss any of those; instead it addresses what was an underappreciated factor of analysis at the time of writing. He investigates how our mind's constraints and biases influence the outcome of our analysis. In recent years this has become a popular topic with books like Thinking, Fast and Slow or Superforecasting, but those are light on practical advice. This book (and others from the intelligence community) is more oriented towards actionable advice.

The author begins by examining mental processes and how they put constraints on our ability to analyze complex and uncertain situations. These constraints create cognitive biases. He describes studies and research to give examples of how cognitive biases factor into analysis. Unfortunately, awareness of these cognitive biases does not help us to mitigate them.

To overcome our limitations we need a process to guide our analysis. He emphasizes the need to come up with perspectives that are outside of our mental model of the world, region or situation.
- To bypass the constraints on mind and memory, we need to externalize complex pieces of analysis.
- Unless we are unfamiliar with the topic, gathering more data is not a good strategy. It increases our confidence in our prevailing hypothesis but adds no signal. It is more valuable to create a number of hypotheses first and let them guide our search for information. In a study with medical students evaluating patients, those who formed hypotheses and tested them along the way performed better than those who tried to gather as much information as possible beforehand.
- Start by generating hypotheses without judgment. It is difficult for people to see alternative perspectives, so consult others in this process. Every analyst has their own set of mental models through which they observe and analyze the world. Challenging our assumptions and beliefs through a process of evaluating alternative beliefs is his most important idea.
- After coming up with hypotheses, try to disprove them rather than looking for confirmatory evidence. The most likely hypothesis is that for which there is the least amount of disconfirming evidence.
- When communicating results, be clear about uncertainty, confidence and process of analysis. Vague language means different things to different people and they tend to fill it in with their assumptions. Analysts also need to be clear about what assumptions are being made. The only way of getting better is to have precise language and transparent analysis allowing for feedback on what was right and wrong.
- Identify milestones that should be tracked to evaluate the performance of the analysis. There are two particularly important types of new information that we need to react to: changes in variables that are used in our model of the situation; and information that tells us that our mental model is not right for the problem being analyzed.

There are many awesome insights and recommendations for organizations and analysts. I can see myself reading through this again at a later time to brush up on ways to improve.

I strongly believe that analysis and decision-making skills will become some of the most sought-after skills in the coming years. Many fields are starting to measure decision-making performance, but there are no general frameworks for best practice. Great ideas and insights are spread across a wide range of books and fields (intelligence, investing, medicine, science, history & law) that still need to be incorporated into an overarching framework. Heuer's insights into the process of analysis align with what I have learned so far and help me develop my framework further. I found his process of analysis, with its emphasis on alternative perspectives, very compelling and will try to incorporate it into my own process going forward.
1 review
April 26, 2026
Author: [[Richards Heuer]]
Type: #source #book
Link:
Topics: Psychology

---

## Chapter 1:
General introduction.
Perception is active not passive.
Perception is reconstruction of reality, not merely recording it. Perception is influenced by
- Past experience
- Cultural values
- education and 2 other factors I cannot recollect
---
## Chapter 2:
Demonstrates how the trajectory of information matters.
A more ambiguous start leads to less favourable outcomes, as more information is needed to directly contradict the wrong perspective.
Longer exposure is detrimental in a similar way, as it leads to longer confirmatory reasoning by the analyst.
An analyst may be at an advantage when starting late, due to this cognitive bias. Thus, delay giving a verdict for as long as possible, because going against a verdict is resisted by both the organisation and the individual.

One of the pitfalls is that analysts see what they "expect to see". This encompasses the notion that they see what they want to see.
In the image below: if you start from the picture of a man, you are much more likely to keep classifying the ambiguous middle drawings as a man.
![[ImpressionsResist Change.png]]

Key takeaways:
- When new information comes in, reassess the information in its entirety, not just in increments
- A new analyst may find insights a seasoned analyst did not, simply because of incremental vs. full information assimilation
- It is delusional to think we can analyse with an empty mind. To counter the effects of biases:
- Be as explicit as possible in your assumptions and reasoning. Explicit chain of thought invites questions by self and other analysts to catch errors/fallacies.
---
## Chapter 3 - How to remember what you know

This chapter talks about 3 stages of information flow:
- Sensory Information Storage
- Short term memory
- Long Term Memory

SIS stores information for about a fraction of a second. It's the raw sounds or sights we take in.
STM interprets and stores that information for a few minutes at best, unless we keep it alive by repeating it. It is very finite in capacity and generally can't be controlled.
LTM is potentially boundless. There are 3 main ways of adding things to LTM: rote, assimilation, or mnemonics. For mnemonics, read Tushar Chetwani's Memory book from high school. Assimilation is the most interesting: it puts forward the notion that the brain has schemata of sorts to relate data. If incoming information can easily be fitted into one of these models, it is easy to assimilate. An example is given of chess pieces set up from a real game, shown for recall to a master and a normal player. The master recalls very well. With a random arrangement of pieces, not much difference in recall is seen between the average player and the master.

Unlearning a habit is also very difficult, more so than forming a new one. The same goes for throwing away an old schema and creating a new one.

Factors that influence how well we remember:
- Being the first stored information on a given topic.
- Amount of attention focused on it
- Credibility
- Amount of importance given to it at time of consumption.

Another problem with memory is highlighted. We can at best keep 5-9 items in memory for retrieval at a time. So externalising a problem - breaking it down into components and explicitly marking how each part relates to the others - is extremely helpful in maintaining sight of the whole picture.

 These are listed in order of the [[Depth Of processing Information]] required:
 - say how many letters there are in each word on the list,
 - give a word that rhymes with each word,
 - make a mental image of each word,
 - make up a story that incorporates each word.


---
## Chapter 4

The most productive uses of comparative analysis are to suggest hypotheses and to highlight differences, not to draw conclusions.

There are a few ways of generating and evaluating hypotheses:

- Applying theory: apply a general proposition to the case at hand. Example: when autocratic countries have an influx of foreign ideals, the autocracy falls. Then test whether this hypothesis applies to the current situation
- Situational logic: what is happening now should lead to a given outcome, based on the decision maker's information, misconceptions, and state of mind
- Historical analogy: drawing a similarity between today's situation and an old one to theorise what will happen next and form optimal policy accordingly
- Data immersion: a data-snooping approach for when you simply can't form an opinion or hypothesis up front, so you immerse yourself in the data and let a hypothesis emerge, rather than forming the hypothesis first and trying to prove it wrong

Tips: It is generally a good idea to actually try disproving your theory, as in the scientific method, rather than focusing on proving it. This is because of the diagnosticity of results: a high body temperature tells you that you are ill, not what you have, and carries very little information otherwise.
One piece of contrary data is often enough to discredit a hypothesis, so seek such data. Example:
what is the sequence rule for 2-4-6? You might say "increasing even numbers", but it can actually be increasing numbers in general. You can't know unless 1-2-3 is tested. Search for the Wason experiment for more info.
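The 2-4-6 task can be sketched in code; the `hidden_rule` and `my_hypothesis` functions below are illustrative stand-ins for the experimenter's rule and the subject's guess:

```python
# Wason's 2-4-6 task in miniature: the hidden rule is "strictly
# increasing", but the subject guesses "increasing even numbers".
# Every confirming test agrees with both, so only a probe chosen to
# falsify the guess (e.g. 1-2-3) is diagnostic.

def hidden_rule(seq):
    """The experimenter's actual rule: strictly increasing."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def my_hypothesis(seq):
    """The subject's guess: strictly increasing even numbers."""
    return hidden_rule(seq) and all(n % 2 == 0 for n in seq)

confirming = [(2, 4, 6), (4, 8, 12), (10, 20, 30)]
# Confirming tests agree with both hypotheses, so they teach us nothing:
all_agree = all(hidden_rule(s) == my_hypothesis(s) for s in confirming)

# A probe chosen to falsify the guess separates the two rules:
probe = (1, 2, 3)
disagrees = hidden_rule(probe) != my_hypothesis(probe)
```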

Ideally, testing the full set of hypotheses is beneficial but generally unrealistic.
### Satisficing: the first good-enough hypothesis
Thus, analysts use satisficing to cull the herd. Generally, analysts then organise data looking to prove this thesis. This is very dangerous, as discussed above, and has 3 weaknesses:
- Selective Perception: You see what you expect
- Failure to generate good hypothesis
- Failure to consider the diagnosticity of evidence

The essence of this is that disconfirming information is discarded more frequently with this approach, even though it often carries more information than the confirming kind.
> "no conforming instance of a law is a verifying instance, but that any disconforming instance is a falsifying instance."

---
## Chapter 5: Do You Really Need More Information?

The author lays out the idea that, beyond a certain threshold, more information "rarely" improves the accuracy of analysis.
More information mostly gives the analyst more confidence and reduces second-guessing, even when the judgment is wrong.

There are a few cases where new information actually changes analysis:
- Numerical change to one of the independent variables
- Additional detail about a variable already in analysis
- Identification of additional variables
- Information about which variables actually matter (example: when analysing US policy, Kamala Harris's state of mind vs. Trump's; Harris's doesn't matter if Trump is president)


People tend to think of their reasoning process as far more complex than it actually is, overestimating the weight they give to minor variables and underestimating how much of the judgment rests on a small subset of them.
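A hedged illustration of this point, with factor names and weights invented entirely for the sketch: a judgment the "analyst" believes rests on five factors is reproduced almost exactly by just two of them.

```python
# Invented illustration: the analyst reports weighing five factors, but two
# of them carry nearly all the weight, so a two-variable model reproduces
# the judgment closely. None of these numbers come from the book.

def analyst_judgment(factors):
    # The full "complex" judgment over five factors.
    w = {"troop_movements": 0.45, "leader_rhetoric": 0.40,
         "economy": 0.06, "weather": 0.05, "press_tone": 0.04}
    return sum(w[k] * factors[k] for k in w)

def simple_model(factors):
    # A stripped-down model using only the two dominant factors.
    return 0.45 * factors["troop_movements"] + 0.40 * factors["leader_rhetoric"]

case = {"troop_movements": 0.8, "leader_rhetoric": 0.6,
        "economy": 0.3, "weather": 0.9, "press_tone": 0.2}

# The two outputs are nearly identical despite the "complex" process.
print(analyst_judgment(case), simple_model(case))
```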

---
## Chapter 6 - Keeping an Open Mind:

Ideas presented include:
- How to break out of a mental rut: state the problem out loud (speaking and thinking engage different parts of the brain), or disengage for a while
- Do not build psychological barriers: we often impose imaginary constraints on ourselves
- Adopt the role of devil's advocate to find flaws in an analysis
- True diversity is diversity of thought processes. Groups are good for ideation: they quickly exhaust the obvious ideas and move on to more creative ones, and a diverse group can critique from more angles.


- QUANTITY leads to QUALITY.

- Organisational environment is shown to matter more for innovation than individual creativity does.

## Chapter 7: Structuring analytical problems

- Always helpful to write things down
- For example, compare multiplying 46 × 78 in your head with doing it on paper
- Ben Franklin even described to Joseph Priestley how decision making is aided by writing out pros and cons, keeping the full picture available at once
- A matrix example is given for deciding on a car purchase:
- First, weight the factors (such as price, style, operating costs) by how important each feels, dividing a pool of 100 points among them
- Then score each car on each factor (e.g. out of 10)
- Multiply each score by its factor weight and sum per car
- Choose the car with the highest total
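The steps above can be sketched in a few lines of Python; the weights and per-car scores below are invented purely for illustration.

```python
# A minimal sketch of the weighted-attribute matrix from Chapter 7.

weights = {"price": 40, "style": 25, "operating_costs": 35}  # pool of 100

scores = {  # each car scored out of 10 on each factor
    "Car A": {"price": 8, "style": 5, "operating_costs": 7},
    "Car B": {"price": 6, "style": 9, "operating_costs": 6},
}

# Multiply each score by its factor weight and sum per car.
totals = {car: sum(weights[f] * s[f] for f in weights)
          for car, s in scores.items()}

best = max(totals, key=totals.get)
print(totals)            # {'Car A': 690, 'Car B': 675}
print("choose:", best)   # choose: Car A
```

Writing the matrix down keeps every factor in view at once, which is the chapter's whole point.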

## Chapter 8: Analysis of Competing Hypotheses

This chapter presents a tested method for analysing competing hypotheses:

- Create all plausible hypotheses
- Gather evidence and arguments
- Arrange them in a matrix with hypotheses as columns and evidence as rows
- Mark +/-/NA in each cell to indicate whether the evidence is consistent, inconsistent, or not applicable for that hypothesis
- Strike out evidence with no diagnostic value (marked the same for every hypothesis) and note which hypotheses now look probable
- Try to disprove them rather than prove them
- Report conclusions as relative likelihoods of the full set of hypotheses (more likely, less likely, etc.), together with the analytical assumptions about the evidence
- Set milestones for future observation and track them to see whether events unfold as expected
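The ACH procedure above can be sketched in code. The hypotheses, evidence items, and marks below are invented; the scoring step (rank hypotheses by how little evidence is inconsistent with them, rather than how much supports them) is the part the method prescribes.

```python
# A minimal sketch of the ACH matrix: hypotheses as columns, evidence as
# rows, cells marked "+" (consistent), "-" (inconsistent), or "NA".

matrix = {                                      # evidence -> mark per hypothesis
    "E1": {"H1": "+", "H2": "+", "H3": "+"},    # same mark everywhere
    "E2": {"H1": "+", "H2": "-", "H3": "NA"},
    "E3": {"H1": "-", "H2": "-", "H3": "+"},
}

# Strike out evidence with no diagnostic value (identical across hypotheses).
diagnostic = {e: row for e, row in matrix.items()
              if len(set(row.values())) > 1}

# Score each hypothesis by its count of inconsistent ("-") evidence;
# trying to disprove means the fewest "-" marks wins.
hypotheses = ["H1", "H2", "H3"]
inconsistency = {h: sum(1 for row in diagnostic.values() if row[h] == "-")
                 for h in hypotheses}

ranking = sorted(hypotheses, key=inconsistency.get)
print(inconsistency)              # {'H1': 1, 'H2': 2, 'H3': 0}
print("most likely:", ranking[0]) # most likely: H3
```

Note how E1, which "supports" every hypothesis, drops out entirely: confirming evidence that fits everything tells you nothing.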


Part III, on cognitive biases, is skipped here since most of it expands on pitfalls already covered, such as confirmation bias and seeing what you expect.
46 reviews · 1 follower
March 7, 2017
Easy to read and very thought provoking. Lots of the behavioural biases discussed are well understood but few books put them into a practical framework like this.
Profile Image for Peter.
230 reviews · 23 followers
December 28, 2018
Originally published as a series of articles from 1978-86, this book feels extremely contemporary, and makes you wonder... why are all of these pop science books so surprising?

This book was targeted at CIA intel analysts, but has fairly wide applicability to anyone who is trying to figure out how the world works, and especially for those with the humility to appreciate the limits of their ability to do so. It's a quick read, probably one of the most consumable and practical handbooks about cognitive biases and our natural weakness for statistics.

One of my favorite concepts was the idea of a model as a mechanism for learning: codifying a model that disagrees with your intuition often means that your intuition is pricing some variable you haven't consciously thought about. Thus, model development is framed as a problem-solving technique, rather than a technique to guarantee a solution!
73 reviews · 8 followers
June 14, 2020
5.0/5.0

Psychology of Intelligence Analysis is one of those rare books which over-delivers on what its title suggests. A more appropriate title would have been Psychology of Analysis, because so many of the topics discussed in this book transcend the field of national intelligence. IMHO, chapters of this book should be required reading for all freshmen in all colleges. It is an excellent resource to help us think about how we think.

I got introduced to this book from a reference in another much more recent but much more stupid book (Becoming Kim Jong Un) written by another CIA officer. Psychology of Intelligence Analysis is concise, well-researched, well-written, well-referenced, and convincing. I will not do it justice by attempting to summarize it. But I have learned more per word count from this book than most (perhaps any) other books I’ve read.
Profile Image for Andrew Carr.
481 reviews · 121 followers
June 17, 2021
As the name suggests, this is a textbook for intelligence analysts, but it is also a very useful guide to trying to think clearly about the world and make judgements.

Published long before books like 'Thinking Fast and Slow' and the emphasis on cognitive biases, this book directly looks at the way analysts think about and judge the world, from their memory to habits of mirroring or lacking clarity in meaning.

What's more, while the recent trend seems to be to just highlight the fallacy of human thinking, Heuer has a practitioner's concern with how to overcome the problems. Again the context is specifically intelligence analysts, but there are many useful ideas for how, particularly in cooperation with others and with careful review, we can attempt to see how these issues may be shaping our analysis.

Recommended.
Profile Image for bumbu.
28 reviews · 2 followers
April 27, 2019
Overall interesting but very hard/boring to read.
Even though the book is focused on intelligence analysis, one can adapt the same approaches described in the book to other domains. As the book focuses a lot on how humans perceive information and build their own versions of truth, this information can even be used to better understand judgements made by ourselves and those around us. It could also probably be used to persuade others to believe certain truths by using exactly the same techniques.
The last chapter of the book summarises the book pretty nicely - so by just reading that, one can grasp the main idea from the book.
Profile Image for Scott Holstad.
Author 132 books · 103 followers
January 26, 2020
Excellent text. I think it's foundationally solid, but because things constantly remain fluid, I wish this were updated on a more consistent basis; otherwise the book risks becoming little more than a history text. Still, definitely recommended for those studying intel analysis. Required reading, and much respect for the author and this resource.
Profile Image for Tom.
388 reviews · 33 followers
November 13, 2010
An excellent discussion of how mental models influence individual analyses, along with recommendations for overcoming them and reducing their influence. It is well supported by examples and summaries of experiments from many different fields.
46 reviews · 3 followers
July 22, 2019
This book has been translated and published by the Ministry of Foreign Affairs under the title "Psychology of Intelligence Analysis". It is a wonderful book on cognitive errors in intelligence analysis and how to deal with them. It's a book best read while taking notes, so that you understand it better.
Profile Image for Mike.
Author 8 books · 91 followers
October 19, 2011
This is a very good book about the difficulties associated with accurate intelligence gathering and analysis. The book was a textbook for training CIA agents.
Profile Image for Henry.
79 reviews · 5 followers
February 21, 2012
This is a good tool for the beginning intelligence professional regardless of speciality. It is also good for more experienced professionals to read as a refresher.
Profile Image for Samantha.
23 reviews
June 4, 2015
A long, wordy, yet necessary read for anyone who wishes to pursue the study of "intelligence analysis" any further.
Profile Image for Don.
389 reviews
June 29, 2024
This book takes work. Practicing the ideas in the book and implementing the processes will make you better, no matter your vocation.