Human error is cited over and over as a cause of incidents and accidents. The result is a widespread perception of a 'human error problem', and solutions are thought to lie in changing the people or their role in the system: for example, reducing the human role with more automation, or regimenting human behavior through stricter monitoring, rules, or procedures. But in practice, things have proved not to be so simple. The label 'human error' is prejudicial and hides much more than it reveals about how a system functions or malfunctions. This book takes you behind the human error label. Divided into five parts, it begins by summarising the most significant research results. Part 2 explores how systems thinking has radically changed our understanding of how accidents occur. Part 3 explains the role of cognitive system factors - bringing knowledge to bear, changing mindset as situations and priorities change, and managing goal conflicts - in operating safely at the sharp end of systems. Part 4 studies how the clumsy use of computer technology can increase the potential for erroneous actions and assessments in many different fields of practice. And Part 5 shows how hindsight bias always enters into attributions of error, so that what we label 'human error' is actually the result of a social and psychological judgment process by which stakeholders in the system focus on only one facet of a set of interacting contributors. If you think you have a human error problem, recognize that the label itself is no explanation and no guide to countermeasures. The potential for constructive change, for progress on safety, lies behind the human error label.
I haven’t learned much about human factors, but as someone who works on devices and the software that runs on them, I often consider how I should be thinking about the people who use what I make and what outcomes their interactions might lead to. I thought this book’s reframing of “human error” as a mere attribution was really powerful, and I enjoyed the book’s logical organization and powerful examples. Even though the examples were generally more life-and-death than anything I ever deal with, I can still appreciate the applications to my work and life.
That being said, the book was pretty dense and I don’t think I’d recommend the whole thing to anyone unless they really liked very specific examples, so I’m giving it four stars.
Most of it is already well known to the HF community, but I still find tidbits of new info here and there. This book is mainly written for practitioners outside the HF community. The message is important, although the barriers to overcome are overwhelming.
The ideas in the book are very interesting, and the content is clearly well researched, but the writing style is very dense. This is not the most approachable book on the topic, but if you stay with it you will learn quite a lot about how failure and safety work in complex systems.
A must for all 'so-called' human factors experts. Clear, succinct, and communicated with such ease it makes you wonder why everyone doesn't just think this way in the first place!