In Streetlights and Shadows, Klein argues that the traditional claims about thinking and making decisions only apply in well-ordered situations (under "streetlights"). In ambiguous, complex and unpredictable situations (in the "shadows"), however, these processes are at best useless and at worst actively harmful. He identifies what he considers the ten most worrying of the common pieces of advice given by researchers, organisational developers and management specialists on decision-making, then systematically unpacks why these claims are ineffective and what we might do instead.
I'd just read Steven Johnson's Farsighted: How We Make Decisions That Matter the Most before picking up Streetlights and Shadows (they were on the same shelf in the library). Klein's book, apart from being more tightly and convincingly written, pretty much tells you that Johnson's advice, while not unhelpful, has its limits. In a complex situation, Klein would argue that things like full spectrum analysis and influence diagrams can be helpful but we shouldn't hold on to them too tightly.
Claim #1: Teaching people procedures helps them perform tasks more skilfully.
Klein argues that procedures are only useful in well-ordered situations when they can substitute for skill, not augment it. Procedures are also useful as training tools to help get novices started in learning a task, as memory aids, or to impose consistency for teams, particularly ad hoc teams. In complex situations, however, procedures are not a substitute for experience. For these, Klein suggests that "people need judgement skills to follow procedures effectively and go beyond them when necessary", e.g. senior pilots overriding standard procedures when they see fit. Indeed, emphasising procedures over skill might lead to a situation where you get consistently mediocre performance, as there is no incentive for people to try something different and potentially better.
Claim #2: Decision biases distort our thinking.
Klein notes that people use heuristics that are generally effective but aren't perfect. His sense is that these biases are not as problematic in real-world settings as experimental findings suggest. Rather than discouraging people from using heuristics, we should "help them build expertise so they can use their heuristics more effectively". Putting judgements into perspective, such as by using the premortem technique, and representing data in ways that support intuition (e.g. using frequency data rather than probabilities), can also help.
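The point about frequency data can be made concrete with a small worked example. The numbers below (a 1% base rate, 80% sensitivity, 9.6% false-positive rate for a hypothetical screening test) are invented for illustration, not taken from the book; they show how the same Bayesian calculation that feels opaque in probability format becomes intuitive when restated as counts of people.

```python
# Probability format (hard to intuit): illustrative numbers only.
base_rate = 0.01      # 1% of people have the condition
sensitivity = 0.80    # test catches 80% of true cases
false_pos = 0.096     # 9.6% of healthy people test positive anyway

# P(positive) via the law of total probability, then Bayes' rule.
p_pos = base_rate * sensitivity + (1 - base_rate) * false_pos
posterior = base_rate * sensitivity / p_pos

# Frequency format (easy to intuit): imagine 1000 people.
with_condition = 10   # 1% of 1000
true_positives = 8    # 80% of those 10
false_positives = 95  # 9.6% of the other 990, rounded
posterior_freq = true_positives / (true_positives + false_positives)

print(round(posterior, 3))       # ~0.078
print(round(posterior_freq, 3))  # ~0.078
```

Both routes give roughly an 8% chance that a positive result means having the condition, but the frequency version ("8 of the 103 people who test positive actually have it") supports intuition in the way the probability version does not.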
Claim #2a: Successful decision makers rely on logic and statistics instead of intuition.
In a similar vein, Klein argues that it's not an either/or situation where we should only use logic and statistics or intuition. Rather, we need to "blend systematic analysis and intuition".
Claim #3: To make a decision, generate several options and compare them to pick the best one.
Klein argues that in reality we rarely use formal methods of decision making, comparing and weighing different options; it requires too much time and effort. Rather, good decision makers use their experience to recognise effective options and evaluate them through mental simulation.
Claim #4: We can reduce uncertainty by gathering more information.
Klein makes a distinction here between puzzles and mysteries. While gathering more data can help us solve puzzles, mysteries require sense making more than data. In complex situations, more data actually increases uncertainty: the marginal value of each additional data point falls while complexity grows. In cases like Pearl Harbor, 9/11 and Enron, decision makers ignored weak signals, explaining them away when they didn't fit their mental models. "The data was there but the sense making broke down." Instead of devoting resources and energy to collecting more information, we should pay attention to how we sort, share and integrate that information for sense making.
Claim #5: It's bad to jump to conclusions - wait to see all the evidence.
Klein argues that keeping an open mind makes us passive: we wait to see what other options and information might surface rather than engaging in anticipatory thinking. This claim makes us "slaves to the flow of information" and, together with Claim #4, can paralyse the decision maker. Instead, Klein argues, we should speculate and test our speculations rather than committing to them.
Claim #6: To get people to learn, give them feedback on the consequences of their actions.
Klein says that feedback is difficult to understand in complex situations and doesn't help if the learner doesn't understand its implications. What happens if the feedback we receive isn't helpful, for instance outcome feedback when we need process feedback? Or if people don't notice and attend to the feedback? Or if giving feedback is challenging because it pertains to tacit knowledge? It is not sufficient to give feedback; we also have to find ways to make it understandable to the learner.
Claim #7: To make sense of a situation, we draw inferences from the data.
Klein points out that drawing inferences from data is only meaningful if (a) we know what to look out for and what counts as a meaningful data point; and (b) our inferences allow us to form a coherent story to explain events. Our frames determine what counts as data; unlike a novice, an experienced expert can draw useful inferences because he knows what to look out for in building his data set.
Claim #8: The starting point for any project is a clear description of the goal.
Again, Klein points out that this claim only holds water in stable situations. In a complex situation with many unknowns, rather than try to have absolute clarity at the start and suffer from "goal fixation", we should redefine goals as we go along. For complex undertakings, tools like Gantt charts are a waste of resources, misleading us by promising that adherence to a predetermined schedule is possible.
Claim #9: Our plans will succeed more often if we identify the biggest risks and then find ways to eliminate them.
Klein's research indicates that executives don't bother to conduct formal risk analyses; instead, they envision specific scenarios and consider whether the worst possible outcomes would be tolerable. Rather than formal risk analysis, Klein advocates resilience engineering: organising to anticipate, learn and adapt.
Claim #10: Leaders can create common ground by assigning roles and setting ground rules in advance.
Klein points out that while you can create common ground at the start, this common ground rapidly erodes as the situation changes. Common ground therefore has to be repeatedly re-established and all team members should be responsible for continually monitoring the common ground for breakdowns and repairing it when necessary.
In his book, Klein draws on a wide range of research and cases - from firefighters to weather forecasters, healthcare professionals to pilots and military personnel - to illustrate his points. A fascinating and thought provoking read.