A phenomenal resource on rationality and decision-making – perhaps the best in its genre. The title undersells the book. Despite coming out of the CIA, the book contains lessons that apply far beyond intelligence analysis; I think it is a necessary read for anyone who wants to do intelligent analysis. Furthermore, it is not just about psychology – it is about the philosophy, sociology, and practicality of intelligent analysis.
The core premise of the book is that people make analytical judgments by processing sensory information through their cognitive machinery, without understanding the weaknesses of either the sensory processes or their cognitive machinery. Heuer seeks to ameliorate this by giving a concise overview of our perception and memory systems, with an emphasis on their pitfalls in the context of analytical work. After all, we presumably did not evolve on the savannah to piece together the motives of foreign nation-states.
Building from there, Heuer gives a guided tour of the required competencies of an analyst (e.g., creativity and open-mindedness), why we are generally deficient in these areas, and practical tools for improvement. This involves some epistemological detours, of a similar flavour to Sowell’s Knowledge and Decisions (which, I should say, I still haven’t finished!). I particularly enjoyed the exploration of how exactly analytical judgments can be generated: for example, reasoning by historical analogy is philosophically different from drawing on theory, but both can be valid depending on the situation.
One of the highlights of the book is the Analysis of Competing Hypotheses (ACH) framework, a simple (but not simplistic) tool for deciding between various hypotheses, which has been deliberately designed to offset various cognitive biases. Speaking of cognitive biases, the survey herein is second to none – they are grouped by category (perceiving evidence, judging cause and effect, estimating probabilities) and Heuer finds the perfect balance of psychological background and practical exposition. As one would expect from a handbook aimed at time-constrained decision-makers, the book is exceedingly well structured and crystal clear (making Thinking, Fast and Slow feel clumsy by comparison).
It’s no surprise that Psychology of Intelligence Analysis is highly recommended in trading/investing circles – I can’t find a concept in the book that isn’t relevant to the role, and its influence is clear in other great resources like Geopolitical Alpha (whose Constraints Framework is a modified version of ACH). Really, all one would need on top of this is a similar book about the philosophy and practicality of statistical modelling (a combination of Modelling Mindsets and Regression Modelling Strategies in conjunction could get you most of the way there, but I’m still on the lookout for the definitive text).
I guess after reading all these books on decision-making and rational thinking, I’ve realised it does just boil down to what the Greeks had chiselled into the temple at Delphi – “Know Thyself”. Richard Heuer’s book is a decisive step towards that goal!
Key ideas
- Core thesis:
- We do not spend enough time questioning our thought processes
- Our minds are naturally ill-equipped to deal with analysis
- Sensory input is mediated by cognitive processes
- Perception:
- People perceive what they want to perceive
- More information is required to perceive an unexpected phenomenon
- Incremental change is harder to notice
- Initial uncertainty interferes with accurate perception even after better information becomes available, e.g. if an image starts off blurred and is gradually sharpened, we recognise it later than if we had first seen it at that same level of clarity.
- Memory:
- Sensory information storage (SIS):
- Extremely short-term snapshots held for O(0.1 seconds). This is why 24fps looks smooth to us.
- Gives the brain time to process the stimulus
- Short-term memory (STM):
- Stores the interpretation of sensory information for 1-100 seconds.
- Severely limited capacity: e.g. hard to both take notes and think
- Long-term memory (LTM): the body of knowledge and experience we can draw on.
- Control mechanism: decides which information goes to which memory. Little is known about it.
- Retrievability depends on the number of locations where the information is stored and the strength of its connections to other memories.
- Methods of learning:
- By rote: forms a separate schema, unrelated to existing schemata
- Assimilation: new information links to existing schemata
- Biases in memory:
- Easier to remember something if we have a category for it
- Hardening categories: different information can get stored under the same category, leading to analytical errors
- Memory rarely changes retroactively: memories seldom become overwritten in response to new information.
- Improving creativity:
- Defer judgment: separate idea-generation from idea-evaluation.
- Quantity leads to quality: the first ideas will be first-order; later ideas may be more nuanced.
- Cross-fertilisation
- Alternate between individual and team thinking.
- Strategies for analytical judgment:
- Situational logic: try to understand the unique facts and logic of the particular situation.
- Tracing cause-effect or means-ends relationships
- Weaknesses: can be hard to understand the mental processes of other institutions; fails to exploit theoretical knowledge derived from other cases.
- Theory: apply generalisations that were formed from the study of many examples of a given phenomenon.
- Historical comparison: “Unknown elements of the present are assumed to be the same as known elements of the historical precedent”
- Analogy often suffers from the availability bias: we use the first analogy that comes to mind. Need to test the analogy’s fitness and actively try to disprove its validity.
- Comparison is useful because it can highlight other variables that are not apparent in the current situation
- Must identify if the problem calls for data-driven or conceptually-driven analysis:
- Data-driven analysis: if the analytical model is correct (and being correctly applied), then accuracy depends purely on the quality of the available data. e.g. analysing military combat-readiness
- Conceptually-driven analysis: questions are more poorly specified, more unknowns and unknown unknowns. Many variables could be relevant; poorly understood relationships.
- Antipatterns for analytical judgment:
- Satisficing: selecting the first hypothesis that is good enough
- Incrementalism: focusing on a narrow range of alternative hypotheses
- Consensus: choosing the thing that will appease everyone
- Fitting to the past: choosing the alternative that would have best avoided the previous mistake
- How we use information:
- We don’t have a good idea of which variables we are using to make a decision. Models fit to analyst decisions post-hoc can be better at explaining the decisions than verbal explanations from the analyst!
- We overestimate the amount of information used to make a decision
- After getting the minimum amount, more information doesn’t improve accuracy, but it increases confidence.
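The post-hoc-modelling finding above can be illustrated with a toy simulation (my own construction, not from the book): an “analyst” who really uses only two of five evidence cues, plus noise. A simple regression fit to their decisions recovers which cues mattered, regardless of what the analyst would say verbally.

```python
# Toy sketch (hypothetical data, not from the book): fitting a linear model
# to an analyst's judgments after the fact.
import numpy as np

rng = np.random.default_rng(0)
cues = rng.normal(size=(500, 5))               # 5 evidence variables
true_w = np.array([2.0, 0.0, 1.0, 0.0, 0.0])   # analyst really uses cues 0 and 2
judgments = cues @ true_w + rng.normal(scale=0.5, size=500)

# Ordinary least squares fit to the analyst's judgments.
w_hat, *_ = np.linalg.lstsq(cues, judgments, rcond=None)
print(np.round(w_hat, 2))  # weights on cues 1, 3, 4 come out near zero
```

The fitted weights make explicit which variables actually drove the decisions – which is exactly why such “paramorphic” models can explain an analyst’s judgments better than the analyst’s own verbal account.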
- Open-mindedness
- In historical intelligence failures, people have overweighted strategic assumptions (e.g. in WW2 the assumption that Japan would not invade) over tactical indicators.
- Need to break out of mental ruts to spin new paths in memory
- Suggestions: work on something different; talk out loud to activate a different part of the brain.
- Premortems
- Analysis of competing hypotheses (ACH):
- Identify all hypotheses to be considered:
- Brainstorm widely, with a group of analysts
- Be careful when screening out hypotheses: maintain the distinction between disproved and merely unproven.
- Make a list of evidence/arguments/factors/judgments relevant to the situation. Put the evidence in rows and the hypotheses in columns.
- For each piece of evidence, analyse the diagnosticity across the hypotheses (i.e. how it changes the relative likelihood).
- Delete nondiagnostic evidence.
- Draw tentative conclusions about the relative likelihood of each hypothesis, then proceed to try and disprove them.
- Analyse sensitivity to key pieces of evidence: what if that evidence were wrong or misinterpreted?
- Report conclusions: relative likelihood of hypotheses, and future datapoints that affect the interpretation.
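The matrix steps above can be sketched in a few lines of Python. The hypotheses, evidence, and consistency ratings below are invented purely for illustration; the key moves from the book are deleting nondiagnostic evidence and scoring hypotheses by counting inconsistencies (trying to disprove, not confirm).

```python
# Toy ACH matrix (hypothetical scenario, not from the book).
# Ratings per hypothesis: "C" consistent, "I" inconsistent, "N" neutral.

hypotheses = ["H1: routine exercise", "H2: preparing attack", "H3: deception"]

matrix = {
    "troops moved to border":    ["C", "C", "C"],  # fits every hypothesis
    "no logistics build-up":     ["C", "I", "C"],
    "state media silent":        ["N", "I", "C"],
    "leaked mobilisation order": ["I", "C", "C"],
}

# Delete nondiagnostic evidence: same rating across all hypotheses,
# so it cannot change their relative likelihood.
diagnostic = {e: r for e, r in matrix.items() if len(set(r)) > 1}

# Heuer scores by attempted disproof: count inconsistencies per hypothesis.
scores = {h: sum(row[i] == "I" for row in diagnostic.values())
          for i, h in enumerate(hypotheses)}

for h, s in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{s} inconsistencies  {h}")
```

Note that the hypothesis with the fewest inconsistencies wins, rather than the one with the most confirmations – a deliberate guard against the satisficing and consistency biases listed elsewhere in the notes.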
- Cognitive biases in interpreting evidence:
- Vivid/concrete information is given more weight over statistical/abstract information
- We don’t take absence of evidence into account (“the dog that didn’t bark”)
- Oversensitive to consistency, undersensitive to reliability. E.g. we prefer hypotheses that explain more of the facts, even if all those facts are correlated.
- Impressions remain even when evidence has been completely discredited
- Cognitive biases in perceiving cause and effect:
- We fit patterns to random noise
- We assume that causes are similar to effects (large effects must have had large causes; economic causes have economic effects).
- We tend to ascribe too much weight to the behaviour of individuals, vs the situational determinants of their behaviour. (see Papic’s constraints framework)
- We don’t have an intuitive understanding of the amount of evidence required to prove a relationship