Intelligence analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves. - Richards Heuer
The CIA published this and made it openly available so that everyone could understand the basic skills that go into intelligence analysis, whether it's analyzing open source information or human intelligence (HUMINT). It's essentially a guide on how to structure your thinking when dealing with difficult questions that must be answered.
The word tradecraft refers to the trade and the craft of collecting intelligence, starting with the intelligence collector (the spy) and ending with the intelligence analysts who interpret the information in the context of what they know about the culture, the sources, and the current situation (political, economic, and in terms of current events). This guide is an introduction to what the analysts do.
Unlike other fields of inquiry such as science, intelligence analysis faces additional challenges, like intentional deception (from either the collector or the source of the information). There is always potential for extra noise, cultural misreadings, bluffing, and underestimating or overestimating threats. The field is full of landmines, which is why you need to know how to grapple with sensitive and sometimes contradictory information to arrive at a set of predictions. Often it comes down not to one conclusion, but to several possibilities (the most likely ones) after the less likely ones have been thoroughly analyzed and discounted.
I love tradecraft strategies because they are applicable to so many fields and to life in general, but surprisingly there are not that many books, articles, or videos out there about it. Even the tradecraft Reddit page is a ghost town. However, one active source of tradecraft knowledge that I like is this tradecraft Substack newsletter: https://substack.com/profile/45324145...
So, the tradecraft primer outlines strategies dealing with three main ideas: Diagnostic techniques involve making analytic arguments and gaps more transparent. Contrarian techniques involve challenging your assumptions. Imaginative thinking is about developing new insights. All of these techniques should improve the clarity and credibility of intelligence assessments.
Some highlights of the techniques I find most useful:
Diagnostic techniques:
Key Assumptions Check - identify all the assumptions you have about a situation and assess them. How much confidence exists that this assumption is correct? What explains the degree of confidence in the assumption? What circumstances or information might undermine this assumption?
Example: finding an active shooter. You assume it's a male, white, acting alone, driving a van. Once you know each key assumption, you will be sensitive to new info that contradicts it. Keep only the strongest assumptions.
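To make the bookkeeping concrete, here is a minimal sketch of how you might record assumptions, the confidence in each, and what would undermine them; the assumptions, numbers, and the 0.6 cutoff below are invented for illustration, not taken from the primer:

```python
from dataclasses import dataclass, field

@dataclass
class Assumption:
    statement: str        # the assumption itself
    confidence: float     # 0.0-1.0: how much confidence exists that it is correct
    basis: str            # what explains that degree of confidence
    undermined_by: list = field(default_factory=list)  # info that would contradict it

# Hypothetical entries for the active-shooter example above
assumptions = [
    Assumption("Shooter is acting alone", 0.7, "most prior cases",
               undermined_by=["reports of a second weapon type"]),
    Assumption("Shooter is driving a van", 0.3, "a single eyewitness",
               undermined_by=["vehicle found abandoned elsewhere"]),
]

# Keep only the strongest assumptions, but stay sensitive to what would undermine them
for a in (a for a in assumptions if a.confidence >= 0.6):
    print(f"KEEP: {a.statement} ({a.confidence:.0%}, basis: {a.basis}); watch for: {a.undermined_by}")
```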
Quality of information check: check the validity of information, including the circumstances under which it was collected and, in the case of HUMINT, the background of the source as well as their motivation for providing information. Check if info is corroborated, re-examine dismissed info, caveat ambiguous info properly, and indicate a level of confidence for all sources of info.
This way you can detect deception and intelligence gaps, and gauge the confidence level of the overall analysis. Checking the quality of info is an ongoing process, and quality matters far more than quantity when it comes to collecting intel.
Indicators or signposts of change: Watch how events develop, and know in advance which events would be a sign that things are moving along the expected path or an unexpected one. Basically, analysts always consider the many possible paths the future can take (e.g., the Soviet Union will get stronger or weaker, will expand or will collapse). Different likelihoods are assigned to these paths, but it can be that the unlikely path is the one that ends up happening. You have to know what to look for, what signs are indicative of any given path occurring, and watch them, note them, reassess your predictions as needed, and re-assign likelihoods when enough signposts occur.
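One way to picture "re-assigning likelihoods when enough signposts occur" is a simple Bayesian update over the candidate paths. This is only a sketch; the paths, priors, and signpost likelihoods below are invented for illustration:

```python
# The analyst's initial likelihoods for each candidate path (must sum to 1)
priors = {"grows stronger": 0.50, "muddles through": 0.35, "collapses": 0.15}

# For each signpost, how likely that observation would be under each path
# (purely illustrative numbers)
signpost_likelihoods = {
    "food shortages reported": {"grows stronger": 0.1, "muddles through": 0.5, "collapses": 0.8},
    "military budget cut":     {"grows stronger": 0.2, "muddles through": 0.4, "collapses": 0.7},
}

def update(beliefs, signpost):
    """Re-weight each path by how well it predicted the observed signpost."""
    weighted = {path: p * signpost_likelihoods[signpost][path] for path, p in beliefs.items()}
    total = sum(weighted.values())
    return {path: w / total for path, w in weighted.items()}

beliefs = priors
for seen in ["food shortages reported", "military budget cut"]:
    beliefs = update(beliefs, seen)
    print(f"after '{seen}':", {k: round(v, 2) for k, v in beliefs.items()})
```

In this made-up run, the initially least likely path ("collapses") only becomes the leading one after the second of its signposts occurs, which matches the idea of watching, noting, and reassessing.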
Analysis of competing hypotheses:
People are sometimes biased toward first impressions, cherry-picking evidence to fit a pre-existing hypothesis, and the like. This strategy aims to overcome that: create a matrix of all competing hypotheses, load the already collected information into the matrix, then work toward a likelihood score for each hypothesis.
Focus on disproving hypotheses rather than proving one. Evidence is more diagnostic if it supports only one hypothesis and not several. However, also consider: if an individual piece of evidence turns out to be wrong, how much does that shift the likelihood of a hypothesis being true?
Ask what evidence is not being seen but would be expected for a given hypothesis to be true. Is denial and deception a possibility?
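As a toy illustration of the matrix idea (the hypotheses, evidence, and ratings here are all made up), each piece of evidence is rated consistent (C), inconsistent (I), or neutral (N) against every hypothesis, and hypotheses are then ranked by how much evidence argues against them, in line with the focus on disproving rather than proving:

```python
hypotheses = ["H1: insider leak", "H2: external hack", "H3: accidental disclosure"]

# Each row: a piece of evidence rated against every hypothesis.
# "C" = consistent, "I" = inconsistent, "N" = neutral/ambiguous.
matrix = {
    "no signs of network intrusion":      {"H1: insider leak": "C", "H2: external hack": "I", "H3: accidental disclosure": "C"},
    "document appeared on foreign forum": {"H1: insider leak": "C", "H2: external hack": "C", "H3: accidental disclosure": "I"},
    "employee recently denied promotion": {"H1: insider leak": "C", "H2: external hack": "N", "H3: accidental disclosure": "N"},
}

def rank(evidence_matrix):
    """Rank hypotheses by how much evidence argues AGAINST them (fewest 'I' ratings first)."""
    def inconsistencies(h):
        return sum(1 for ratings in evidence_matrix.values() if ratings[h] == "I")
    return sorted(hypotheses, key=inconsistencies), {h: inconsistencies(h) for h in hypotheses}

order, scores = rank(matrix)
print("Ranking with all evidence:", order, scores)

# Sensitivity check from the point above: if one piece of evidence turned out to be wrong,
# how much does the ranking shift? Re-rank with that row dropped.
dropped = {e: r for e, r in matrix.items() if e != "no signs of network intrusion"}
print("Ranking without the intrusion evidence:", rank(dropped)[0])
```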
Contrarian techniques:
High Impact - Low Probability Analysis - consider events that are low probability but would be seriously impactful. This happens in real life: world events no one expected to occur do occur and shake the world (the fall of the USSR, German unification, the fall of the Shah, and the 9/11 attacks were all considered low probability at the time they occurred). Define the outcome and then the possible pathways that would lead to it, including possible triggers (like natural disasters, the death of a leader, or economic and political shocks). For each pathway, identify a set of "observables" that you can track in the future as events unfold. Identify factors that would deflect these outcomes, reduce their likelihood, or promote a positive outcome instead.
This way you will notice signposts of an unlikely event occurring, even if it goes against expectations. You need to be able to counter your own prevailing mindset.
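A worksheet for this kind of exercise can be as simple as a structured record of the pathways, their triggers, the observables to track, and the deflecting factors. The outcome and every entry below are hypothetical, just to show the shape of it:

```python
from dataclasses import dataclass

@dataclass
class Pathway:
    description: str
    triggers: list        # shocks that could set this pathway in motion
    observables: list     # signs to track as events unfold
    deflectors: list      # factors that would reduce the likelihood or soften the outcome

# Hypothetical high-impact, low-probability outcome and two pathways to it
outcome = "Sudden collapse of the ruling coalition"
pathways = [
    Pathway("Leader dies with no clear successor",
            triggers=["health crisis"],
            observables=["unexplained public absences", "rival factions courting the military"],
            deflectors=["an orderly succession plan is announced"]),
    Pathway("Economic shock triggers mass protests",
            triggers=["currency crash", "food price spike"],
            observables=["strikes spreading to the capital"],
            deflectors=["emergency subsidies", "external financial support"]),
]

for p in pathways:
    print(f"{outcome} via '{p.description}': watch {p.observables}; deflected by {p.deflectors}")
```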
Very short introduction to the craft of intelligence work via the CIA.
I highly recommend this for any writer looking to write mysteries. Parts of it read like a roadmap for Sherlock Holmes or Agatha Christie (or Batman), but there were lots of great hints on how to mess with the reader, too, like first suggesting a wrong hypothesis based on incomplete data, because it will take the reader longer to find the right one, even if they never agreed with the wrong one!
Recommended if you like mysteries. Plus you can do a search on the title and get it for free from the CIA website.
An excellent book that reveals common flaws in analysis and, unlike many similar works, actually provides actionable methods for reducing the frequency of those flaws.
Richards Heuer was a CIA officer for c.45 years and this book is the distillation of a lifetime's experience in intelligence analysis. It is clear, simply written, and provides ample evidence for its main claims. More experienced students of behavioural bias will be familiar with several examples provided in the latter half of the book, where Heuer details availability bias, anchoring and more. With that said, Heuer's more contemporary approach and genuine insight into how to prevent these biases make reading this worthwhile for any aspiring analyst.