Drift into Failure: From Hunting Broken Components to Understanding Complex Systems

What does the collapse of sub-prime lending have in common with a broken jackscrew in an airliner's tailplane? Or the oil spill disaster in the Gulf of Mexico with the burn-up of Space Shuttle Columbia? These were systems that drifted into failure. While pursuing success in a dynamic, complex environment with limited resources and multiple goal conflicts, a succession of small, everyday decisions eventually produced breakdowns on a massive scale. We have trouble grasping the complexity and normality that gives rise to such large events. We hunt for broken parts, fixable properties, people we can hold accountable. Our analyses of complex system breakdowns remain depressingly linear, depressingly componential - imprisoned in the space of ideas once defined by Newton and Descartes. The growth of complexity in society has outpaced our understanding of how complex systems work and fail. Our technologies have gotten ahead of our theories. We are able to build things - deep-sea oil rigs, jackscrews, collateralized debt obligations - whose properties we understand in isolation. But in competitive, regulated societies, their connections proliferate, their interactions and interdependencies multiply, their complexities mushroom. This book explores complexity theory and systems thinking to understand better how complex systems drift into failure. It studies sensitive dependence on initial conditions, unruly technology, tipping points, diversity - and finds that failure emerges opportunistically, non-randomly, from the very webs of relationships that breed success and that are supposed to protect organizations from disaster. It develops a vocabulary that allows us to harness complexity and find new ways of managing drift.

234 pages, Paperback

First published January 1, 2011

121 people are currently reading
1843 people want to read

About the author

Sidney Dekker

40 books · 55 followers
Sidney W. A. Dekker (born 1969, "near Amsterdam") is a Professor at Griffith University in Brisbane, Australia, where he founded the Safety Science Innovation Lab. He is also Honorary Professor of Psychology at the University of Queensland.

Previously, Dekker was Professor of human factors and system safety at Lund University in Sweden, where he founded the Leonardo da Vinci Laboratory for Complexity and Systems Thinking, and flew as First Officer on Boeing 737s for Sterling and later Cimber Airlines out of Copenhagen. Dekker is a high-profile scholar and is known for his work in the fields of human factors and safety.

Ratings & Reviews



Community Reviews

5 stars: 121 (36%)
4 stars: 123 (36%)
3 stars: 70 (20%)
2 stars: 18 (5%)
1 star: 4 (1%)
Displaying 1 - 30 of 41 reviews
Gregor Hohpe
Author of 7 books · 209 followers
October 2, 2021
An engaging introduction to complex systems thinking with a particular focus on failures and systemic drift. It's a great topic, but the book doesn't seem to know whether it wants to be academic (for which it lacks the necessary structure and depth) or populist (for which its wording is too cumbersome and its price too high).

Although I took many useful examples and anecdotes from the book, it is highly repetitive. For example, on page 62 we read that "Richard Feynman, physicist and maverick member of the of the Presidential Commission that investigated the 1986 Challenger accident, wrote a separate, dissenting appendix to the report [...] One [sic] that disturbed him particularly was the extent of structural disconnects across Nasa...". You'll find the exact same sentences on page 104, ironically with the word "issue" inserted to make the second sentence read correctly. For USD 50 we should expect a book that isn't copy-pasted and is thoroughly edited.

What I liked:
- quite accessible, an engaging read
- many real-life scenarios and examples
- good source of quote-worthy thoughts that are well expressed

What I didn't like:
- repetitive and overly drawn out. Chapters appear to follow a proven recipe with the author feeling the need to bring everything back to "drift" and bashing Newton.
- it's pricey: at less than 200 pages, US$50 is too much. I might have given 4 stars for half the price.
- Despite good flow, the author loves to inject fancy vocabulary that doesn't add much.
- More illustrations would have greatly helped the explanations, e.g. for the aircraft stabilizer problem. It's surprising to me that a book about complex systems and understanding relationships contains only 3 illustrations, one of which is a formula, and the other two showing the evolution of the same image.
Maria
4,599 reviews · 117 followers
July 11, 2019
Why do complex systems, large companies and industries seemingly ignore the signs of failure until it's too late, resulting in large disasters and crashes?

Why I started this book: Working my way through my Professional Reading list, and this book was newly released in an audio format.

Why I finished it: This is only my second book about industrial safety and preventing industrial accidents. My first was Field Guide to Understanding Human Error, also by Dekker. This was the better book, and I was fascinated with the topic. How safe can we make a large organization that operates in a complex and dynamic world over decades with multiple layers of safety regulations, legal jurisdictions and shareholders? I liked the analogy of drifting safety margins... when the system is designed or first started, engineers, managers and such demand safety/error margins, but as it is used these margins erode. But how can you tell the difference between a critical safety breach and increased efficiency... before it's too late?
Deiwin Sarjas
78 reviews · 9 followers
August 16, 2020
Most of the book is a criticism of the common approach to understanding accidents and attributing blame. It describes the problems with the approach from many different angles using well-known human-made disasters as examples. The criticism centers around the ignorance of complexity in looking for root causes, human errors, and broken components.

While one can't help but agree with the criticism, the book offers little to replace the common approach. Its subtitular goal of "understanding complex systems" is limited to introducing terminology and common features of complex systems. The practical suggestions of the book are limited to 1) using complexity-aware stories to talk about accidents, 2) increasing diversity, and 3) making reversible decisions or making decisions reversible.

[edit 2020-08-16]: I think I was too critical in my original review. I've noticed that my thinking about complex systems, and especially failure therein, is much clearer now since reading this book. In conversation, I can now hear and distinguish reasoning that is ignorant of the behavior of complex systems.
Alex
44 reviews · 3 followers
May 31, 2015
For a long time, I was looking forward to reading this book. Based on recommendations from my peers and several online sources, I was hoping to learn about how systems drift into failure.

Now, having read the book, I got some of what I was expecting: analysis of complex systems, new framework of looking at the problems without attributing cause, ways of analyzing safety. However, the book didn't quite meet my expectations:

- Just because it's harder to find the cause for failures in complex systems, does it really mean there are multiple narratives? Just because there's an emergent behavior that we don't fully understand, does it mean nobody can understand it, even if we had the ability to trace it step-by-step? Maybe it's the better understanding we should strive for, not shrugging our shoulders and saying how hard systems can be.
- I'm not convinced that the Newtonian way of looking at being able to root-cause anything in our world goes as deep as the author portrays it. Is it really that Newton was saying that every result is proportional in strength to the action? Or was he just saying that there is a cause, no matter how sophisticated?
- The language in the book is overly complicated. I had to constantly remind myself to stay focused due to the author's way of writing. Also, the book could have been 1/3 of the size if the author didn't go over the same things again and again.
Mary
195 reviews · 9 followers
April 27, 2014
Really interesting ideas in this book, though it was a tad too scholarly for me to fully enjoy. It sure did give me a lot to think about, though.
Alex
845 reviews · 15 followers
August 21, 2024
'Drift into Failure' is the first book on industrial psychology I've ever read. If I'd read it in high school, I'd probably have gone to a college that offered a major in industrial psychology. That's how fascinating this book is.

'Drift into Failure' investigates how high-stakes, high-precision organizations relax, then relax a bit more, then relax a bit more until catastrophe strikes. To illustrate its points, it uses case studies that are close to my heart: Alaska Airlines Flight 261, the Challenger and Columbia disasters, and many more. I'm well-studied on the aforementioned aerospace disasters, and 'Drift into Failure' gained credibility in my eyes by hewing closely to known fact patterns and drawing interesting new conclusions from there.

To be clear, 'Drift into Failure' isn't simply an exercise in finger pointing. It's an investigation into how systems of systems contribute to catastrophes, and how our desire to find one scapegoat, one point of failure, can blind us to the environments and constraints in which these disasters occur.

'Drift into Failure' is written with authority and clarity. As read by author Sidney Dekker, it's clear and easy to understand at double speed. This is an excellent book.
Joe Kopacz
70 reviews
August 5, 2025
This may be my favorite Sidney Dekker book that I've read so far. Granted, I've only read "The Field Guide to Understanding Human Error" and "Just Culture", so I'm not familiar with his full body of work.

This book has elements of the other two and yet becomes unique through its discussions of Complexity Theory. The idea that small changes to portions of a complex, complicated system can cause eventual perturbations in other portions or even outright failure of the whole is the central message of the book. Equally important, however, is how Dekker discussed Accountability and how linear, cause-and-effect logic ends up leading many of us to construct blame in discrete places, often on the part of the system that was found "broken".

I found the perspective relevant to my own work in nuclear power. My station frequently defaults to assigning blame and the accompanying "performance management" to the people found responsible for events, even those of more "minor" significance. It's always seemed a bit unfair and unjust to me. After reading this book and reflecting on the lack of recovery and growth at my station, it seems doubly so.

For anyone in safety or incident/accident investigation, I'd recommend this book. It's a relatively quick read and Dekker gets his points across with relative clarity. If just for the introduction to systems theory and complexity as a principle, this is worth the read.
Alexandra
1,043 reviews · 42 followers
April 5, 2025
I feel like I need a book club for this one - I was left feeling like some of my favorite tools had been debunked but in their place I had gotten very little. I should note that I also use systems thinking for evaluating my systems - in 'wai' or accident. But this was such an indictment of any linear thinking.

“Drifting into failure is a gradual incremental decline into disaster driven by environmental pressure, unruly technology, and social processes that normalize growing risk. No organization is exempt from drifting into failure. The reason is that the routes to failure trace through the structures, processes, and tasks that are necessary to make an organization successful. Failure doesn't come from the occasional abnormal dysfunction or breakdown of the structures and processes and tasks. It is the inevitable byproduct of their normal functioning. The same characteristics that guarantee the fulfillment of the organization’s mandate will turn out to be responsible for undermining that very mandate. Drifting into failure is a slow, incremental process. An organization using all of its resources in pursuit of its mandate…gradually borrows more and more from the margins that once buffered it from assumed boundaries of failure. The very pursuit of the mandate over time and under the pressure of various environmental factors like competition and scarcity dictates that it does this borrowing, that it does things more efficiently, that it does more with less, perhaps takes greater risks. Thus it is the very pursuit of the mandate that creates the conditions for its eventual collapse. The bright side inexorably brews the dark side, given enough time, enough uncertainty, enough pressure… This reading of how organizations fail contradicts traditional and some would say simplistic ideas about how component failures are necessary to explain accidents. The traditional model would claim that for accidents to happen something must break, something must give, something must malfunction. This may be a component part or a person. But in stories of drift into failure organizations fail precisely because they're doing well on a narrow range of performance criteria; that is, the ones they get rewarded on in their current political or economic or commercial configuration. In drift into failure accidents can happen without anything breaking, without anybody erring, without anybody violating the rules they consider relevant. I believe that our conceptual apparatus for understanding drift into failure is not yet well developed. In fact most of our understanding is held hostage by a Newtonian-Cartesian vision of how the world works. This makes particular and often entirely taken for granted assumptions about decomposability and the relationships between cause and effect. These assumptions may be appropriate for understanding simpler systems. But they are becoming increasingly inadequate for examining how formal, bureaucratically organized risk management in a tightly interconnected complex world contributes to the incubation of failure. The growth of complexity in society has outpaced our understanding of how complex systems work and fail. Our technologies have gotten ahead of our theories. We are able to build things whose properties we understand in isolation. But in competitive, regulated societies their connections proliferate, their interactions and interdependencies multiply, their complexities mushroom.”

“In courts we argue that people could reasonably have foreseen harm and that harm was indeed caused by their action or omission. We couple assessments of the extent of negligence or the depth of the moral depravity of people's decisions with the size of the outcome. If the outcome was worse… then the actions that led up to it must be really really bad.”

“Complexity is the defining characteristic of society and many of its technologies today. And yet simplicity and linearity remain the defining characteristics of the stories and the theories that we use to explain bad events that emerged from this complexity. Our language and logic remain imprisoned in the space of linear interactions and component failures that was once defined by Newton and Descartes.”

“...we may be overconfident that we can foresee the effects because we apply Newtonian folk science to our understanding of how the world works. With this we make risk assessments and calculate failure probabilities. But in complex systems we can never predict results; we can only predict probabilities.”

“Direct unmediated or objective knowledge of how a whole complex system works is impossible to get. Knowledge is merely an imperfect tool used by an intelligent agent to help it achieve its personal goals. An intelligent agent not only doesn't need an objective representation of reality in its head or in its computer model, it could never achieve one. The world is too large, too uncertain, too unknowable.”

“Thus the harmful outcome is not reducible to the actual decisions by individuals in the system; it is a routine byproduct of the characteristics of the complex system itself.”

“Drift occurs in small steps.”

“Post-structuralism stresses the relationship between the reader and the text as the engine of truth. Reading in post-structuralism is not seen as the passive consumption of what is already there provided by somebody who already possessed the truth and is just passing it on. Rather reading is a creative act, a constitutive act in which readers generate meanings out of their own experience and history with the text and with what it points to. As post-structuralism sees it, the author and reader aren't very different at all. Even authors write within a context of other texts… in which choices are made about what to look at and what not to, choices that are governed by the author's own background and institutional arrangements and expectations. The author is no longer a voice of truth in this. Text or any other language available about events or incidents has thereby lost its stability. Nietzsche for example would have been very distrustful of the suggestion that while everybody has different interpretations it is still the same text. No, he (and post-structuralism in general) does not believe that it is a single world that we are all interpreting differently and that we could in principle reach agreement on when we put all the different pictures together. More perspectives don't mean a greater representation of some underlying truth. How does this work in a complex system? More perspectives typically mean more contradictions. Of course there might be some partial overlap but different perspectives on an event will create different stories that are going to be contradictory, guaranteed. The reason, says complexity science, is that complex systems can never be wholly understood or exhaustively described.”

“Systems thinking is about relationships, not parts… Systems thinking is about accidents that are more than the sum of their broken parts. It is about understanding how accidents can happen when no parts are broken or no parts are seen as broken. Which produces a second question perhaps even more fascinating: Why did none of these deficiencies which are now so obvious strike anybody as deficiencies at the time? Or if somebody did note the missed deficiencies then why was that voice apparently not sufficiently persuasive? If things really were as bad as we can make them look post-mortem then why was everybody, including the regulator tasked with public money to protect safety, happy with what was going on?”

“Jens Rasmussen suggested that work in complex systems is bounded by three types of constraints: there's an economic boundary beyond which the system cannot sustain itself financially. Then there's a workload boundary beyond which people and technologies can no longer perform the tasks they are supposed to be doing. And there's a safety boundary beyond which the system will functionally fail.”

“Few regulators will ever claim that they have adequate time and personnel resources to fully carry out their mandates. Yet the fact that resource pressure is normal doesn't mean that it has no consequences.”

“As a system is taken into use it learns. And as it learns it adapts. A critical ingredient of this learning is the apparent insensitivity to mounting evidence that from the perspective of a retrospective outsider could have shown how bad judgments and decisions actually are. This is how it looks from the position of the retrospective outsider. The retrospective outsider sees a failure of foresight. From the inside however the abnormal is pretty normal and making trade-offs in the direction of greater efficiency is nothing unusual. In making these trade-offs however there's a feedback imbalance. Information on whether a decision is cost effective or efficient can be relatively easy to get… how much is or was borrowed from safety to achieve that goal however is much more difficult to quantify and compare. If it was followed by a safe [outcome] apparently it must have been a safe decision.”

“Empirical success in other words is no proof of safety. Past success does not guarantee future safety. Borrowing more and more from safety may go well for a while but we never know when we're going to hit.”

“There's something complex and organic about the [system], something ecological that is lost when we model them as a layer of defense with a hole in it, when we see them as a mere deficiency or a latent failure. When we see systems instead as internally plastic, as flexible, as organic, their functioning is controlled by dynamic relationships and ecological adaptation rather than by rigid mechanical structures.”

“Self transcendence - the ability to reach out beyond currently known boundaries and learn and develop and perhaps improve.”

“The banality of accidents thesis... Incidents do not precede accidents - normal work does.” --> accidents are a normal byproduct of everyday activities and decision-making within organizations, rather than being caused by isolated failures or mistakes.

“Safety culture for example breaks culture down into what employees experience or believe… as well as the presence or absence of particular components of [safety investments]. It measures those, adds them up, and gets an outcome for safety culture. Together these components are assumed to constitute a culture. Yet it is never made clear how the parts become the whole.”

[simplistic thought says] “There is only one true story of what happened. Not just because there is only one pre-existing order to be discovered but also because the knowledge of that story is a mental representation or mirror of that order. The truest story is the one where the gap between external events and internal representation is the smallest. The true story is the one in which there is no gap at all.”

“Zvi Lanir used the term Fundamental Surprise to capture the sudden revelation that one’s perception of the world is entirely incompatible with what turned out to be the case.”

“According to the Sequence of Events idea, events preceding the accident happen linearly in a fixed order and the accident itself is the last event in the sequence.”

“The Swiss cheese model got its name from the image of multiple layers of defense or cheese with holes in them. Only a particular relationship between those holes however (when they all line up) will allow the hazard to reach the object that was supposed to be protected.”

High reliability theory: Leadership safety initiatives (others will never be enticed to find safety more compelling than their leadership), need for redundancy (duplication or overlap), decentralization, culture and continuity, organizational learning

“They trust the procedures to keep them apprised of the developing problems and the belief that these procedures focus on the most important events and ignore the least significant ones. Success narrows perceptions. It changes attitudes. It reinforces a single way of doing business. It breeds overconfidence in the adequacy of current practices. And it reduces the acceptance of opposing points of view.”

“Thus even though experts may be well educated and motivated, even to them a warning of an incomprehensible and unimaginable event cannot be seen because it cannot be believed. This places severe limits on the rationality that can be brought to bear in any decision-making situation. Seeing what one believes and not seeing that for which one has no beliefs are essential to sense making, as Weick says.”

“Mechanistic thinking about failures, that is the Newtonian-Cartesian approach, means going down and in. Understanding where things went wrong comes from breaking open the system and diving down, finding the parts, identifying which ones are broken. This approach is taken even if the parts are located in different areas of the system… In contrast systems thinking about failures means going up and out. Understanding comes from seeing how the system is configured in a larger network of other systems.”

“It is known as recursion or fractals when speaking in geometric terms, where stochastically self-similar processes operate at different relative sizes or orders of magnitude. Such patterns, recognizable at different levels of resolution when studying complex systems, are one way of finding some order, some pattern in how complexity works, some way of mapping what goes on there.”

“And then proclaiming that such an organization or crime syndicate has been dealt with by chopping off its head reduces the problem to what it isn't. Chopping off the head of a complex system doesn't work. It doesn't work at all. It is logically impossible. After all, its executive intelligence, its central nervous system is distributed throughout the entire system. It is the system that is complex and smart and adaptive, not some omniscient government that can be dealt with in isolation.”

“Complex is not the same as complicated. A complicated system can have a huge number of parts and interactions between parts but it is in principle exhaustively describable.”

“If accidents are emergent properties then the accident proneness of the organization cannot be reduced to the accident proneness of the people who make up the organization. You can suffer an organizational accident in an organization where people themselves have no little accidents or incidents, in which everything looks normal, and everybody is abiding by their rules.”

“Indeed we can ask whether our analyses and theories of past accidents and disasters tell us anything useful at all for designing institutions with better future performance or whether we are merely left with the observation that complex organizations faced with turbulent environments will repeatedly fail us in unpredictable ways and the only practical advice to risk managers is to stay fully alert to this possibility.”

“Complexity theory has no answers as to who is accountable for drift into failure. Just as the Newtonian view has only oversimplified, extremely limited, and probably unjust answers. What complexity theory allows us to do however is to dispense with the notion that there are easy answers supposedly within reach of the one with the best method or most objective viewpoint. Complexity allows us to invite more voices into the conversation and to celebrate the diversity of their contributions. Truth, if there is such a concept, lies in diversity, not singularity.”
Jana Rađa
361 reviews · 13 followers
September 11, 2025
Drift Into Failure by Sidney Dekker was another spot-on algorithmic recommendation from Scribd, perfectly aligned with my interest in complexity and systems thinking. I read widely across genres, but this book stands out, offering just the right mix of new information and insight. First published in 2011, it clearly presents Dekker’s ideas in a way that is especially accessible for newcomers like me, while distinguishing itself through its strong emphasis on real-world examples and practical applications.

I knew nothing about Sidney Dekker before opening the book, which turned out to be an advantage. I could engage with his concepts without preloaded dogma and see how they resonated with my own experiences and knowledge. Dekker, a professor at Griffith University in Brisbane and former Boeing 737 pilot, is internationally recognised for pioneering ‘Safety Differently’ and ‘Restorative Just Culture’ movements promoting resilience, compassion, and learning in organisational safety. In some safety and human factors circles, he is half-seriously described as a kind of ‘safety anarchist’—a reflection of his deliberate challenge to bureaucratic, compliance-driven, blame-oriented approaches.

Why did I enjoy the book so much? As a newcomer to the field, it challenged many of my assumptions about organisational safety and systems thinking, limited though they were. By the midpoint, I realised that I often approached problems through the lens of complexity and systems without consciously recognising it. This insight deepened my appreciation of Dekker’s work.

Drift Into Failure argues that our understanding of safety is still shaped by a Newtonian–Cartesian worldview: a mechanistic, linear, and predictable model of the world. In contrast, complexity and systems thinking reject a single, definitive truth. As Dekker writes, ‘Truth, if there is such a concept, lies in diversity, not singularity.’ Traditional safety models focus on components and linear cause-effect relationships, whereas complex systems behave more like living organisms—they can fail even when no part is technically broken.

Although there is nothing inherently wrong with using Newton and Descartes as guides for how to think, like any model, their framework can limit our perspective, preventing us from discovering aspects of the world we didn’t even know to question.

The concept of ‘drift into failure’ describes how a system gradually erodes its safety through a series of small, locally plausible adaptations—often unnoticed (Donald Rumsfeld’s ‘unknown unknowns’)—until a harmful or catastrophic outcome occurs. While reading, I realised that drift into failure quietly builds the hidden vulnerabilities that can later erupt as ‘Black Swan’ events, prompting me to revisit Taleb’s masterpiece with a fresh perspective.

Dekker identifies five key concepts underlying drift into failure:
— Environmental pressure: Scarcity, competition, cost-cutting, or contradictory goals (e.g. speed versus safety) create pressures that slowly erode previously established safety margins. For example, NASA’s 1990s motto ‘faster, better, cheaper’ embodied contradictory objectives that made failure more likely. While outsiders may judge this as poor foresight, insiders at the time saw it as normal.
— Small steps (that is, decrementalism): Failures are not caused by dramatic single events but by tiny, cumulative adaptations in how an organisation operates—changes so small they feel normal at each step.
— Sensitive dependence on initial conditions: Borrowed from chaos theory (the ‘butterfly effect’), this means that tiny differences at the starting point can lead to very different long-term outcomes (a small numerical sketch of this idea follows this list). It is not about incremental erosion over time as in decrementalism, but about how initial conditions set trajectories. Safety certification, for instance, should anticipate such futures, not merely confirm present compliance. Aircraft certification rarely considers lifetime wear of components, even though such wear can eventually make an aircraft unsafe. Nor do certification processes account for the socio-technical adaptations that occur when new equipment is introduced, which can contribute to drift into failure. Adaptation, wear, and fine-tuning are not treated as criteria in certification, nor is there a requirement to establish organisational measures to anticipate or manage them.
— Unruly technology: Borrowing Brian Wynne’s term, ‘unruly technology’ describes the gap between what technology is imagined to be and how it behaves in practice. Drift can occur even when everyone complies with rules and regulations.
— Ambiguous protective structures: Safety systems (rules, checks, regulations) that are meant to guard against failure can become part of drift themselves, as boundaries blur and practices evolve in ways not captured by official procedures. As Wynne observed: ‘Practices do not follow rules; rather, rules follow evolving practices.’
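
To make the 'sensitive dependence on initial conditions' point above concrete, here is a minimal numerical sketch of my own (it is not from Dekker's book); it uses the logistic map, a standard chaos-theory toy model, and the function name and parameter values are arbitrary choices made only for illustration.

```python
# Minimal sketch (not from the book): sensitive dependence on initial conditions,
# shown with the logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime
# (r = 4.0). Two runs that start almost identically end up far apart.

def logistic_trajectory(x0, r=4.0, steps=30):
    """Iterate the logistic map from x0 and return all visited values."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)   # baseline starting condition
b = logistic_trajectory(0.200001)   # differs by one part in a million

for step in (0, 10, 20, 30):
    print(f"step {step:2d}: a={a[step]:.6f}  b={b[step]:.6f}  gap={abs(a[step] - b[step]):.6f}")
```

After a couple of dozen iterations the two trajectories are effectively unrelated, which is the point being made above: a vanishingly small difference in starting conditions, not any broken part, can put a system on a very different long-term path.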

In short, drift is driven by incremental shifts in what people—including regulators and safety overseers—consider normal, shaped by competitive pressures, scarce resources, and routine work. No organisation is immune; success itself can quietly create pathways to failure. ‘The bright side inexorably brews the dark side—given enough time, enough uncertainty, enough pressure,’ Dekker notes.

Dekker emphasizes that complexity is emergent, arising from interconnections within organisations and between organisations and their environment. He cautions that ‘the growth of complexity in society has outpaced our understanding of how complex systems work and fail. Our technologies have got ahead of our theories,’ highlighting the persistent gap between what systems can do and what we are able to comprehend or manage safely. This idea resonates with Carl Sagan’s warning in The Demon-Haunted World, published in 1995, about the risks of our growing reliance on complex systems without sufficient understanding: ‘We’ve arranged a global civilisation in which most crucial elements—transportation, communications, and all other industries; agriculture, medicine, education, entertainment, protecting the environment; and even the key democratic institution of voting—profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.’ Yet thirty years later, it feels as though we are still barreling forward with the same blind reliance on systems we scarcely understand.

Dekker, naturally, offers guidance on managing drift:
— Cultivate diverse perspectives to challenge assumptions and reveal blind spots. Bring in outsiders, use people from different operational areas or different backgrounds to provide fresh views. Diversity helps challenge ‘what we think is normal’ and makes it harder for drift to go unnoticed.
— Make small steps visible and reflect critically on incremental changes. Since drift often occurs in small, incremental changes that seem harmless at the time, create mechanisms to notice, talk about, and possibly reverse those small steps. Reflection on what you are normalizing is key.
— Avoid over-reliance on rules or compliance, or counting errors, which can give a false sense of safety. Dekker argues these methods are not sufficient in complex systems because they miss interactions, adaptations, and emergent behaviour. Moreover, traditional protective measures (specialisation, policies, protocols, redundancy) add reliability but also increase complexity. This paradox can make systems less safe.
— Foster honesty, openness, learning, and speaking up. Organisations should create cultures where people can speak up—even dissent—in safety and risk conversations. Openness about what is going wrong (or could go wrong) helps ensure early warning before things escalate.
— Design oversight that evolves with complex systems. Oversight and regulatory structures should not be static or purely compliance-based, especially as complex systems cannot be regulated. Instead, they should ‘co-evolve’ with the organisation—becoming sensitive to complexity, interconnections, interdependencies, changing practices, and emerging risks.
— Build resilience into the system. Emphasize the organisation’s capacity to absorb disturbances and adapt without large performance losses. That means planning for variability, for uncertain initial conditions, and having slack or buffers rather than optimizing every part for efficiency only.
— Focus on successes and near misses to learn from what works. Instead of only analysing failures, look at what goes right and why—what decisions or practices help avoid failure, even under pressure. Near misses and successes can reveal the same tensions and adaptations that lead toward drift, and thus act as early indicators.
— Monitor feedback imbalances and hidden trade-offs, tracking the long-term impact of short-term decisions. Pay attention to what feedback loops are being ignored—especially the cost paid in safety when organisations focus on productivity, speed, cost. Track ‘sacrificing decisions’ (explicit or implicit) that give short-term gain at potential long-term risk.

Two talks I watched reinforced these lessons: ‘Learning from mistakes: No-blame management and accountable teams’ (YouTube, DevOps Talks 2018, https://www.youtube.com/watch?v=5SsFO...) and ‘The human factor: Pursuing success and averting drift into failure’ (YouTube, DDD Europe 2018, https://www.youtube.com/watch?v=9fwJ9...). Dekker’s emphasis on dissent—speaking up to challenge assumptions—is particularly compelling. Even kings, he reminds us, relied on jesters to tell them the uncomfortable truth. Failures, Dekker suggests, are less instructive than successes, because success often hinges on someone having the courage to raise a dissenting voice. Another idea that stood out was his distinction between retributive and restorative accountability. Instead of focusing on punishment after something goes wrong, Dekker advocates for approaches that foster learning, repair, and growth—ways of responding that strengthen teams rather than divide them.

On Sidney Dekker’s website (https://sidneydekker.com/), three films explain his approach to safety using real-life examples. The second film, which shares ‘the stories of three organizations that had the courage to devolve, declutter, and decentralize their safety bureaucracy’, is particularly illustrative of the variety of systems to which Dekker’s principles can be applied.

In conclusion, I’m really glad I came across this book. Dekker mentions in one of his talks that he wrote it in just six weeks, which perhaps explains a few of its rough edges. There is some repetition, which worked fine for me as a newcomer—repetitio est mater studiorum, after all—but I can see how it might test the patience of someone already familiar with the ideas. There are also occasional missing words or leftover revisions, reminders that, in publishing, haste is rarely a friend.

None of this, however, diminishes the impact of the book. Dekker’s exploration of complexity and systems thinking is genuinely eye-opening, and everything he presents resonates with me. Moreover, it has clearly mapped the direction for my further reading.
Kai Evans
169 reviews · 6 followers
April 13, 2019
I think his attack on science is guided more by a need to create cheap drama than by actual analysis. Most "Cartesian/Newtonian" facts we know in science are actually statistical aggregates based on observing complex systems. Scientific models _are_ based on systems thinking; there is no great lie in the world. But, I guess, cheap debunking drama generates more book sales.
Shhhhh Ahhhhh
846 reviews · 24 followers
December 27, 2018
Overall good book. Heavy, heavy on theory. Super light on application, but I think I understand why. I appreciate this book if the intent was merely to lay out a framework for understanding emergent phenomena as they present in business organizations and dealing with the problems caused by them. It's mostly just a ward against the newtonian-cartesian 'broken parts' view of processes and systems.

So, the short summary is that irreducible complexity does exist. When systems develop in such a way that novel interactions between parts and outside of the system can occur, the results cease to be orderly and predictable. A great example of this from the text was salt. Table salt has none of the properties of either of the chemicals composing it. It won't violently react if you put it in water (sodium) and it won't kill you to consume it (chlorine). It's inert and tasty. It will, however, unlike either parent element, give you hypertension if you consume enough of it (a novel effect from an unpredictable interaction). When looking for causes of problems, or disturbances in the performance of a system, typically the approach used is to find the broken part. For the process to break down, a part in the process must have broken down. The author insists on the naivete of this reductionist view. It ignores the imperfections of systems that typically cause ad-hoc rules, procedures and interactions to form. For this reason, in concert with chance interactions and outcomes, no process or system that is complex can be exhaustively described. It cannot be described or understood by any system less complex than itself, and since humans are less complex than the systems we build, it is impossible for any human to exhaustively describe or understand any given complex system. Therefore, the cartesian/newtonian rules don't apply. We can't predict outcomes in a system we can't understand. We also, generally, have difficulty predicting in situations of non-linearity, which is how complex systems behave. Attempts at controlling outcomes in these systems, in business organizations for example, fail to grasp this and, as such, often fail to prevent negative outcomes that are generated solely from novel interactions.
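
The 'no broken parts' point can be made concrete with a small simulation. The following is a minimal sketch of my own (it is not from the book, and the numbers are invented): a client that retries slow requests and a server that slows down under load each behave exactly as designed, yet their interaction can tip the whole system into overload.

```python
# Minimal sketch (not from the book; parameters are invented): failure emerging
# from the interaction of two components that each work "as designed".
# The server slows down as utilization rises; the client retries slow requests.
# Neither rule is a broken part, but together they can tip the system into overload.

def steady_state(base_demand, capacity=100.0, retry_latency=0.5, steps=50):
    """Iterate load -> latency -> retries -> load and return where it settles."""
    load = base_demand
    latency = 0.0
    for _ in range(steps):
        utilization = min(load / capacity, 0.999)   # cap to avoid division by zero
        latency = 0.1 / (1.0 - utilization)         # simple queueing-style slowdown
        retries = 1.0 if latency > retry_latency else 0.0
        load = base_demand * (1.0 + retries)        # each retried request adds load
    return load, latency

for demand in (70, 80, 90):   # all below the nominal capacity of 100
    load, latency = steady_state(demand)
    print(f"demand={demand:3d}: offered load={load:6.1f}, latency={latency:7.2f}s")
```

In this toy model the 90-unit case collapses even though demand never exceeds capacity and no component misbehaves; the feedback loop between latency and retries does the damage, which is the kind of emergent, interaction-driven breakdown described above.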

The author also describes "drift to low performance" structures without ever actually saying "drift to low performance".

Would not recommend to anyone but systems thinking enthusiasts heavy into abstract theory.

John
Author of 3 books · 7 followers
September 20, 2020
This is in some ways a better book than 3 stars might indicate, and I'd suggest it's a good book to read for understanding many problems that afflict organizations (and countries). So if you're into the subject probably you should read it.

_But_ it meanders a good bit. He's pointing to an important concern, which is the spaces outside our mental filters, but it seems like he doesn't know about enough tools to get at it more deeply, and to fill space, he sidetracks a fair bit on topics that seem tacked on. Those sidetracks aren't unrelated exactly, sometimes there are interesting points to them, but they always feel like a strange visitor who butted into a related discussion because of their interest in a minor point.

Also, a general trend here with some systemic thinkers is to outright reject other modes. Now he points out the value of those modes, and he is pitching a new method, but sometimes the solution isn't people mining Coltan in Africa, but something far more immediate and simple. And sometimes people are just bad (although systemically we should think how those people got into those positions and got promoted -- the immune system as it were, and from that the auto-immune system). A good discussion of how these two balance in healthy fashion would have found a good place here -- frankly, some discussions about systems can end up going nowhere due to the unbounded nature of a complex system, when at times the solution is far simpler. Except when it's not.

The ethical note he ends with is a good one, and one not resolved in philosophy, but it can also be viewed as meaningless, depending on the philosophical assumptions one makes. That's probably a book in itself. In fact, while I haven't read it, I believe he's written a book on Just Culture related to that.

So, read it if you're into complex systems and organizational behavior, but it could have been tighter and sharper.
Piet van Dongen
71 reviews · 1 follower
September 21, 2021
I read this after I saw Adrian Cockcroft recommend it, and I wanted to learn more about resilience engineering.

I really appreciate the stories and theory, and especially the kind of open-endedness or academic viewpoint of the book. I was looking for a kind of rigid theoretical framework or maybe some more definite practical pointers, but I guess the fields of safety science and related ones are still more rooted in science than in engineering. I got the overall message though, and it certainly made me more knowledgeable, which is a plus for any book.

What I liked less, is the way this book is written. It’s quite uneven, and I get the feeling the author needs way too many words to make a point. The last chapter is a bit out of place in terms of style and content as well. No big problems, but it would have been a much better book if it was balanced and structured better. I have the feeling there hasn’t been too much editing and it was very much a one man effort.

One last point: the examples / metaphor of the resilience of the international drug trade was great and in my case made the theory much easier to digest. Great stuff.

Anyways, I would still recommend this to anyone wanting to know more about safety science, complex systems, socio-technical systems and resilience engineering. Until a better book comes along ;)
29 reviews · 1 follower
December 16, 2019
Professor Sidney Dekker’s “Drift into failure” looks like an easy read. Don’t be deceived. It was a very dense read and took some time for me to refresh and learn many related concepts along the way - rational choice theory, cybernetics, exegesis and eisegesis, reductionism, dualism, epistemology (to name a few :))

The author makes a great case for using systems thinking and complexity theory to analyze large failures instead of leveraging the typical "examine the failed component and fix it" approach (the Newton-Descartes approach).

It was a very interesting read, and though the failures mentioned in the book focused on airplane accidents, oil rig explosions and the NASA shuttle tragedy, the concepts inspired me to find a parallel in any corporation as well as in enterprise IT.

An enterprise is definitely a complex system with all its processes, capabilities, people & technology interacting with each other and different external forces defined by Porter over a period spanning decades.

And you can use the concepts in the book to analyze how it drifts into any behavior. That behavior could be a catastrophic accident if the drift is away from safety, or irrelevance if the drift is away from delivering business value.

Very thought provoking read.
Gordon
642 reviews
April 22, 2019
4.5 stars for content, 4 stars for style and prose. Sidney Dekker is one of a few new social scientists who specialize in studying complexity or complex systems and how humans attempt to interact with them, understand them, predict their reaction to change...or in actuality...fail to do so. Dekker explains how complex systems mask variables and the interrelations between them. Hidden or poorly understood connections proliferate, change with time, scale and intensity, and thus cause unexpected reactions...phase changes, tipping points, reverse trends, etc. He does a great job explaining why our Newtonian/Cartesian ideas of cause and effect and logic do not apply to complex systems, despite our valiant attempts to make them apply. As the title of the book intimates, he uses multiple well known disasters to explain how companies and organizations slowly drift towards failure as they overlook the changes they inadvertently induce to the norm, creating greater risk imperceptibly as they "normalize" it.
Emanuele Gemelli
659 reviews · 17 followers
April 18, 2021
Another fundamental book to read for every safety specialist. A bit of warning: do not expect to find the answer to the meaning of the universe (which is 42, anyway); however, it can open up your mind to explore a more encompassing way to analyse organisations and their emergent relationships. Because there is no direct cause-and-effect explanation of why and how socio-technical organisations work, the book leaves you with the feeling that there is not that much to do. My take on this is that we just have to reckon that we cannot always find a unique answer to problems (parts are not broken, so it doesn't make sense just to change them and expect the problem to disappear) and that we have to be ready to revisit our own worldview constantly, because relationships within an organisation are ever changing and evolving. It's complex, pun intended, stuff to read and digest, if you are anchored in the Safety I vision, like almost 90 of my esteemed colleagues in the safety field.
Tara Brabazon
Author of 37 books · 478 followers
October 17, 2024
This is an unbelievably magnificent book. Published in 2011 (!!!), it demonstrates - ahead of time - what has gone wrong in the last 13 years.

Leadership theorists should have used this book. Organizational theorists should have used this book. Higher education researchers should have used this book.

I am a changed human after reading this book.

Dekker explores what occurs when systems are rendered complex. The complexity in and of itself creates the failure. Not individuals. Not a bad decision. So as systems increase in their layers and structures, there is a drift to failure.

Absolutely brilliant.

What makes the book amazing is that the solution to this drift to failure through systems-based complexity is ... diversity. When a workforce is diverse - so a movement away from 'organizational fit' and homology and the Peter Principle - there is a greater chance to diagnose issues and errors. The 'drift' to failure is stopped.

Absolutely amazing book.
34 reviews
July 9, 2023
Not an easy read but some really important concepts and ideas explored and explained.

Plenty to learn about complexity theory and systems thinking as the basis of the phenomenon the author covers of drifting into failure.

Some interesting anecdotal stories as well such as the NASA incidents with the Challenger and Columbia space shuttles, oil leaks and aeroplane crashes.

While the author's background is as an aviation safety professor, I found lots of practical takeaways for any business.

If you're interested in learning more about this area (which is what drew me) I would recommend it. I found it opened new ways of thinking for me and enhanced my understanding of complexity.

Be ready to see the term Newtonian-Cartesian about 700 times though...
16 reviews
May 15, 2019
Thought-provoking book about modern accident analysis and how the theory of failure has evolved in the last few decades. He also reviews high-reliability organizations and some of the counter-intuitive strategies that support best practice. The book drags a bit in the final third, but what I appreciated most about the discussion was the philosophical underpinnings that he lays out and the diverse examples pulled from airline, medical, military, and mining industries. It's much clearer to me now how "blameless" retrospectives became best practice and how restorative justice is necessary for a learning organization. Looking forward to reading more by Professor Dekker.
Kay
146 reviews · 3 followers
February 12, 2024
This book is short, and full of seriously helpful information for understanding complexity theory. However, I'm downgrading it two stars because, while short, the book is incredibly dense and difficult to read. I don't often find myself reaching for a dictionary or re-reading sentences to make sure I grasp the author's intent, but I had to do it every time I picked this book up. Which is why it's taken me almost a year to finish it. Great content, but a difficult read.
Oleksiy Kovyrin
94 reviews · 22 followers
January 28, 2019
Great introduction to the theory of complex systems, though I'd like to better understand the practical implications of it, and the author only covered those questions in the last 5% of the book. Great read nevertheless, highly recommended to anybody responsible for building or operating large and complex systems.
8 reviews · 1 follower
April 22, 2019
This is a great read with one caveat: the pacing seems off. Some parts of the book feel too long and repetitive (like Complexity Theory) and some parts feel half-baked (drifting into success) or too short (Diversity as a safety feature).

Despite that, the book remains inspiring and thought-provoking. If you have read other books from Sidney Dekker, you'll feel right at home.
4 reviews · 1 follower
July 25, 2019
Gave up after about 80 pages. Main complaint: the quality of the printed book. It looks like the publisher wanted to save some trees by pushing as many letters as possible onto the page. It really pushed me to my limit of focus when trying to jump from one line to another. And the selected font and the quality (fuzziness) of the print are not helping.
23 reviews · 5 followers
February 19, 2020
This book introduces some profound philosophical ideas alongside complexity and systems theory. Somewhat repetitive though; maybe it's possible to get the same ideas from some other books and papers that are not so long.
Diego Pacheco
163 reviews · 10 followers
February 3, 2021
Amazing Book. Love it. It's great for Software Architects, SRE/DevOps engineers, EMs, and honestly all managers should be aware of systems thinking, Cynefin, and this book. This book is also amazing for Chaos Engineering. Lots of good examples and rationale for why simple Newtonian thinking is broken.
2 reviews
June 17, 2021
Thought provoking but repetitive

This is a great book for anyone looking to understand complex system failures. The book tends towards deep analysis and repetition of concepts over actionable wisdom for your day-to-day job.
Denis Romanovsky
215 reviews
December 19, 2021
"How a system fails" is a very interesting topic and I did not see much literature on it. The book is not perfect, but it will definitely make you start thinking differently. It does not give just enough answers on how to avoid, but will let you recognize a drift into a wrong direction.
Dennis Cahillane
115 reviews · 8 followers
March 5, 2019
A book that gives us no answers, but points out that many common ideas for answers are wrong. Highly recommend.
