
Normal Accidents: Living with High-Risk Technologies

Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.


The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.

464 pages, Paperback

First published January 1, 1984


About the author

Charles Perrow

20 books, 27 followers

Ratings & Reviews


Community Reviews

5 stars: 249 (35%)
4 stars: 266 (38%)
3 stars: 145 (20%)
2 stars: 29 (4%)
1 star: 6 (<1%)
Displaying 1 - 30 of 79 reviews
Andy
2,079 reviews, 607 followers
October 10, 2021
This is a classic in its field and an exemplar of an entire genre that seems to be endangered now: serious non-fiction (as opposed to the oxymoronic creative non-fiction). Perrow demonstrates why the usual suspect of "operator error" is not a good explanation of what causes major accidents. His exposition of Normal Accident Theory is too detailed for most readers to wade through in full, but the general points make sense (e.g. think about systems) and some of the specifics about how safety features can increase danger are fascinating. He also gets a bit into exploring how and why the powers that be maintain both the dangers they expose us to and the bogus operator-error story. Subsequent books on the same topic are better written and more up-to-date, but this is the real McCoy, and so it's worthwhile if you want to get a feel for the thought processes of someone coming up with an original idea.
Michael Burnam-Fink
1,702 reviews, 303 followers
October 7, 2014
Normal Accidents is a monument in the field of research into sociotechnical systems, but while eminently readable and enjoyable, it is dangerously under-theorized. Perrow introduces the idea of the 'Normal Accident', the idea that within complex and tightly coupled sociotechnical systems, catastrophe is inevitable. The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. Perrow provides numerous examples from areas of technology like nuclear power, maritime transport, chemical processing, spaceflight and mining. What he does not do is adequately explain why some systems are to be regarded as inherently unsafe (nuclear power) while others have achieved such dramatic increases in safety (air travel).

Perrow defines complexity as the ability of a single component in a system to affect many other components, and tight coupling as a characteristic of having close and rapid associations between changes in one part of the system and changes in another part. The basic idea is that errors in a single component cascade to other parts of the system faster than operators can detect and correct them, leading to disaster. In some cases, this is incontrovertible: a nuclear reactor has millisecond relationships between pressure, temperature, and activity in the core, all controlled by a plumber's nightmare of coolant pipes, and there's little operators can do in an emergency that doesn't potentially vent radioactive material to the environment. However, it seems to me that complexity and tight coupling are a matter of analytic frames rather than facts: complexity and coupling can be increased or reduced by zooming in or out. The choice of where the boundaries of a system exist can always be debated. My STS reading group is looking at alternative axes for analyzing systems, but I'd note that the systems that seem particularly accident prone are distinctly high energy (usually thermal or kinetic, or the potential forms of either). When something that's heated to several hundred degrees, could catch fire and explode, is moving at hundreds of miles per hour, or is the size of a city block does something unexpected, it's no surprise that the results are disastrous. And whatever Perrow might recommend, there is no industrial civilization without high energy systems.
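
As a toy illustration of the cascade dynamic described in the paragraph above (a single failure spreading through coupled components faster than operators can intervene), here is a minimal, hypothetical Python sketch; the component graph, spread probabilities, and detection delay are invented for illustration and are not taken from Perrow:

import random

def simulate(n_components=20, links_per_component=3, spread_per_step=0.9,
             detect_delay=3, seed=0):
    """Return how many components have failed by the time operators notice."""
    rng = random.Random(seed)
    # Each component can affect a few randomly chosen others (complex interactions).
    affects = {i: rng.sample([j for j in range(n_components) if j != i],
                             links_per_component)
               for i in range(n_components)}
    failed = {0}                      # a single initiating component failure
    for _ in range(detect_delay):     # steps that pass before anyone intervenes
        newly = set()
        for f in failed:
            for neighbor in affects[f]:
                if rng.random() < spread_per_step:  # tight coupling: little slack
                    newly.add(neighbor)
        failed |= newly
    return len(failed)

if __name__ == "__main__":
    print("tightly coupled:", simulate(spread_per_step=0.9), "of 20 components failed")
    print("loosely coupled:", simulate(spread_per_step=0.2), "of 20 components failed")

Under these made-up numbers the tightly coupled run typically loses far more of the system before detection than the loosely coupled one, which is the contrast the reviewer is drawing.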

One major problem is that Perrow's predictions of many nuclear disasters simply haven't come true. Three Mile Island aside, there hasn't been another major American nuclear disaster. Chernobyl could be fairly described as management error: while an inherently unsafe design, the reactor was pushed beyond its limits by an untrained crew as part of an ill-planned experiment before the disaster. Fukushima was hardly 'normal', in that it took a major earthquake, tsunami, and a series of hydrogen explosions to destroy the plant. The 1999 afterword on Y2K is mostly hilarious in retrospect.

Perrow rightly rails against blaming operator error as the most frequent cause of accidents. Blaming operators shields the more powerful and wealthy owners and designers of technological systems from responsibility, while operators are often conveniently dead and unable to defend themselves in court. The problem is that his alternative is the normal accident: a paralyzing realization that we must simply live with an incredible amount of danger and risk all around us. Normal Accidents offers some useful, if frequently impractical, advice for creating systems that are not dangerous, but more often it tends to encourage apathy and complacency.

After all, if accidents are "normal", we should get used to glowing in the dark.
Eric_W
1,954 reviews, 428 followers
August 14, 2011
8/14/2011 I keep recommending this book and with the BP disaster, it continues to be very, very timely. One of the points made by Perrow is that when complex technology "meets large corporate and government hierarchies, lack of accountability will lead inexorably to destructive failures of systems that might have operated safely." (From a review of A Sea in Flames: The Deepwater Horizon Oil Blowout by Gregg Easterbrook in the NY Times April 23, 2011.)

Note added 3/2/09: Perrow's discussion of the problems inherent in tightly coupled systems is certainly timely given the intricacies of the recent financial disaster. It is certainly an example of a tightly coupled system in which the failure of only one component causes the entire system to collapse.
**
This is a totally mesmerizing book. Perrow explains how human reliance on technology and over-design will inevitably lead to failure precisely because of inherent safety design. Good companion book for those who enjoy Henry Petroski.

Some quotes: "Above all, I will argue, sensible living with risky systems means keeping the controversies alive, listening to the public, and recognizing the essentially political nature of risk assessment. Unfortunately, the issue is not risk, but power; the power to impose risks on the many for the benefit of the few (p. 306)," and further on, "Risks from risky technologies are not borne equally by the different social classes [and I would add, countries]; risk assessments ignore the social class distribution of risk (p. 310)," and "The risks that made our country great were not industrial risks such as unsafe coal mines or chemical pollution, but social and political risks associated with democratic institutions, decentralized political structures, religious freedom and plurality, and universal suffrage (p. 311)."

Patricia
73 reviews, 15 followers
December 12, 2007
Sea Story: I worked as a shipboard Radio Officer for Exxon Shipping Company on their tanker fleet, and I spent 30 days aboard the Exxon Valdez shortly after it came out of the shipyard as a brand new tanker. When I was home on vacation, a neighbor called and told me about the ship grounding in Prince William Sound. I happened to be with a former Captain of the Valdez. The first thing out of his mouth was, "I hope it was _______" (the last name of one of the two Captains who rotated tours on that ship). When he heard it was Hazelwood, he was shocked and heartbroken, as they had been good friends. That accident was a recipe for disaster. Everything that could go wrong, did go wrong, and at precisely the wrong ("right" for the accident) time. The conclusion I drew after the accident was this: no matter how thoroughly and carefully you set up a system to be foolproof, there is no way, no how, you can anticipate all the weaknesses left in your system until an accident actually occurs. This book is not exactly about that, but it was somewhat relevant because it addresses "loosely-coupled" vs. "tightly-coupled" systems and how the more tightly-coupled a system is (the more every aspect is controlled), the more accident-prone it tends to be. The Exxon Valdez was definitely a tightly-coupled system. One of the fourteen reasons for the Valdez accident, according to the final 50-page NTSB report, was "Exxon's work rules and overtime policies contributed to employee fatigue and poor morale". Enough said.
Brok3n
1,451 reviews, 114 followers
July 25, 2025
Chicken Little was not entirely wrong

I'm struggling to write a fair review of Normal Accidents: Living with High-Risk Technologies. It's what I sometimes call an "I have a new hammer -- look at all these nails!" book. That is, someone has a new idea and writes a book explaining all the ways it applies. These can be very good books; Plagues and Peoples, Private Truths, Public Lies, and The Strategy of Conflict come to mind. Normal Accidents had the potential to be one of these, and if I'm fair, it probably IS one. The problem, though, is that, although it starts off very well, every successive chapter is weaker than the previous one, and the last three are dreadfully tedious. By the time I emerged from the end of the last, I was yelling at Charles Perrow in my mind (as Biden said to Trump), "Would you just shut up, man?" The temptation, given my state of annoyance with the author, is to nitpick the entire book to death, which would not be fair, because it really is based on a very good idea.

That good idea is "normal accident theory" (NAT). It started when Perrow was called to participate in the analysis of the accident at the nuclear power plant at Three Mile Island (TMI). Experts had previously argued that the chance of a serious nuclear reactor accident was very small. Perrow argued that, contrary to being improbable, it was almost inevitable. Not because of the specific combination of things that went wrong at TMI, but because the potential for such accidents was embedded in the nature of the system at a nuclear reactor. He said in 1984, when Normal Accidents was published, that we would have more TMIs. Since then, by my not-well-informed count, we have had worse accidents at Chernobyl and Fukushima, so two points for Perrow.

Chapter 1 is about TMI. Chapter 2 discusses nuclear power plants more generally. Chapter 3 is the heart of the theory the book presents. In it he describes the characteristics of systems that have "system accidents" or "normal accidents" -- complexity and coupling. The next five chapters deal with accidents in different types of systems: petrochemical plants (4), air travel (5), marine shipping (6), dams and mines (7), and space, weapons, and recombinant DNA (8). At first this was fascinating to a detail-oriented technical type like me. (YMMV, of course.) It was absorbing to read the technical details of how accidents in these different systems came about. They are all different, yet they are all the same, which is Perrow's point. But the examples become weaker as we go along, because we are venturing into systems to which NAT barely applies, or areas where Perrow doesn't know what he's talking about, either because the information is not available to him (nuclear weapons) or because he is poorly informed (recombinant DNA: "The system appears to be complex in its interactions and tightly coupled, but I caution the reader that I know even less about it than I do about nuclear weapon systems" -- it shows).

Now, I am going to succumb to that temptation to nitpick that I complained of. Feel free to stop reading here. I want to be clear that I am not casting doubt on NAT, but only complaining about the way this book is written. In fact, I'm going (with superhuman restraint -- I hope you appreciate this) to restrict my criticisms to the last three chapters:

9. Living with High-Risk systems. This final chapter contains Perrow's recommendations. He begins by saying, "I have a most modest proposal, but even though modest and, I think, realistic, it is not likely to be followed." This is tantamount to saying he is not going to take his own recommendations seriously. The chapter then devolves into a long, "Why everyone who disagrees with me is wrong" screed. That never goes well.

Afterword. Here he explains that Normal Accidents went out of print, but that Princeton University Press offered him the opportunity to publish a new edition in 1999. The Afterword is mostly an update on normal accident theory. As written, it is a terrible missed opportunity. 1984 was almost 40 years ago, and as you read the first nine chapters you will constantly note to yourself incidents that have happened since to which NAT might apply. Perrow barely discusses those. Instead, it is clear that he has become the Grand Old Man of an academic field, and he reviews the literature like an academic, not a person who is genuinely interested in understanding accidents. (That's a bit unfair, but it's how it felt to me as I read.)

Postscript: The Y2K problem. The updated edition was published in 1999, when there was a lot of worry about computer systems failing as the date ticked over to 2000. In this postscript, Perrow seems to have recovered his genuine interest in accidents. In fact, the entire chapter has the feel of him chortling in anticipation of the delicious new disasters that will become available to study. As the world now knows, Y2K was almost a nonevent -- there were minor disruptions only. Folks in information technology would like it to be known that this is not because there was no potential for a disaster, but because they worked their tails off to prevent one, successfully. Fair enough!

In summary, Normal Accidents is a good book about an important idea, but it could have been a better book. I'm going to make a recommendation I rarely make: if you read it, read through chapter 8 and skip the final three chapters.

Blog review.
Sarah
69 reviews
December 12, 2015
Man was this book ever a slog. I had high hopes for it - I have a morbid habit of reading accounts of failure analysis. Plus, I thought I might learn something useful, since I work on a complex system, where (at least we'd like to think) we've done a pretty good job of planning for single component failures, but we do still have potential for unanticipated system interactions. I was aware this book was written by a sociologist, but I thought it might be even better for that reason - could be good to get a perspective from outside my field. Sadly, I just found myself wishing the author had consulted a few more engineers. In the early chapters on Three Mile Island (which, by the way, he led off by saying "It is not necessary to understand the technology in any depth"), he'd give a confusing halfway explanation of some aspect of the situation and then just throw up his hands and basically say "it was all just too complicated for anyone to understand." I mean, just because YOU don't understand it, or even if most of the people who talked about it didn't understand it, that doesn't mean that NOBODY could understand it. And don't even get me started on the space section; Tom Wolfe's The Right Stuff is immensely entertaining, but not exactly the best resource to draw on as a technical reference. Plus, Perrow refers to the Houston flight controllers as "middle managers", which misses the point sort of hilariously - there is plenty of middle management at NASA, but that's a poor descriptor for most of the people actually on console (especially back in the days of the space race).

I think part of the issue is that this book just hasn't aged that well. Maybe it was revolutionary in 1984 to suggest that operator error was a convenient scapegoat but not necessarily the whole story, but by now that feels like an obvious baseline assumption. Plus, I'm looking back from the other side of over 30 years of exponential growth in computing power. (Speaking of computers, there's a blast from the past in the 1999 reprinted edition I read; there's a whole postscript about the Y2K problem. Yep.)

I have to admit I don't actually disagree with some of the conclusions Perrow draws; if a technology has a high potential for catastrophic disasters, we should consider whether it's worth pursuing because its benefits are so great and there's nothing else comparable (or because we could make it significantly safer with reasonable effort), or whether we should give it up because it's not that much better than safer alternatives. I even more or less agree with his classifications of various technologies based on those criteria. But I can't help feeling like he got to those conclusions almost in spite of himself. In most of the accidents he described in the original book, as well as the notable ones he touches on in the afterword of the updated edition (Bhopal, Chernobyl, Challenger), the root cause didn't seem to be that the system was just too complicated for anybody to understand. Instead, there were plenty of more prosaic reasons: poor maintenance, insufficient training, management pressure, schedule pressure, etc. It wasn't that we just couldn't anticipate all the dangers; it's that somewhere along the line, people made choices that made things less safe, usually because of external pressures which seemed to have more certain and immediate negative consequences, and maybe sometimes because the people making the decisions weren't directly in the line of fire of the catastrophic failure. Perrow gets there himself! He walks right up to the idea of "externalities," and then he unfortunately drifts right back into waving his hands about how complicated all this technology is. Sigh.

(Also, this book could have stood a better copy editor. There was a significant number of typos, and one lengthy section where Perrow kept referring back to the position of "flying" on a certain chart, when the word didn't actually appear anywhere on the chart at all. Came to find out he'd used "aircraft" on the chart in question, and "flying" on a table showing a different arrangement of the same set of activities, SIX CHAPTERS LATER. Consistency isn't that hard, people.)

I sat down to write this review expecting I'd rate the book two stars, and instead I managed to talk myself into giving it just one instead. Maybe I should write about the books I actually liked more often, just for a change of pace.
Skyler Jordan
29 reviews, 1 follower
August 29, 2024
“Normal Accidents” is an established classic for good reason. If I’m not mistaken, it pioneered the sociological study of industrial safety. This work is highly readable, with well-defined terms embedded in a clear and logical structure. Perhaps the greatest contribution of this study is the incredibly important demystification of “operator error” and how weak an excuse for accidents it is. Of possibly equal value is the less mentioned discussion of safety devices and the risk of their implementation in complex and tightly coupled systems. An understandable limitation of the work is the narrow selection of concrete systems studied. A great deal of space is taken up with discussion of nuclear energy and, with hindsight, the extensive attention reads as excessive. As a whole, the theory of normal accidents is intuitive and important but at times feels regrettably under-theorized. The new afterword in the latest edition does fine work analyzing and applying normal accident theory to more contemporary accidents, but the final predictions about Y2K, similar to the dire warnings regarding nuclear energy, fall flat and may leave one feeling doubtful about the legitimacy of the rest of the work.
Jan
6 reviews, 7 followers
August 13, 2013
The book discusses various systems and their tendency to fail. It is full of evidence of various failures in the past, but it's very dry and boring to crunch through all of the stories. Also, I was very furious about the author's attempt to draw conclusions and generalizations from what was anecdotal evidence at most.
70 reviews, 2 followers
May 1, 2022
It's easy to take for granted how a lot of the major large-scale systems we interact with "just work", from international banking to properly coordinated traffic lights. A failure of even a few seconds in any such critical system can have devastating consequences. How, then, are they best built to both reduce the likelihood of failure and limit the effect of any inevitable failures?

At the time this was first published (1984), accident theory was at best an emerging field, which goes to show how much work had to be put in to create such a relevant body of work.

Here, Perrow analyzes some of the major and minor accidents that come hand in hand with industrialization, examines why, given their design, such occurrences are inevitable, and asks what can be done to mitigate them. The examples strongly support the idea that tight coupling and/or increased complexity are bound to lead to significant accidents.

It also touches on the role organizational structure plays in influencing accidents, as well as the deprioritization of safety in the pursuit of profit, which raises some questions as to whether privatizing certain industries will sooner or later lead to catastrophes that affect innocent third parties, and how regulation tries, with little success, to limit these.

After reading this, it's easy to come away with a sense that the shaky systems we implicitly trust are only accidents waiting to happen, but in the right hands the book goes a long way toward preventing these accidents starting from the design stage. I think it's also useful in other fields, e.g. software design or engineering. Highly recommend.

Fields touched on are: nuclear energy, air travel, mining and drilling, dams, marine accidents, petrochemicals, space exploration, DNA recombination and our ecosystem, and weapon systems.
Joao Neto
33 reviews
March 16, 2023
Very interesting book presenting a thesis for Normal Accident Theory, which proposes that accidents are unavoidable and a natural consequence of tightly coupled and complex systems.
It presents very thorough examples covering the TMI accident, the nuclear and chemical industries, airplane and airway systems, naval accidents and space exploration.

The high points are the TMI accident analysis (and other nuclear plant accidents), the chemical plant accidents, which are not so present in the zeitgeist, and the analysis of production pressures in the airline and airport industries.

The book has several issues however:
- Some chapters have a lot more detail than others. The TMI accident and the nuclear power part are very heavily discussed and analyzed, to a degree that is not matched in other topics. That being the first chapter of the book, we never get back to the same level of quality and research after that.
- The chapter on naval accidents is overly long and very tedious, as it delves into pseudo-psychological theory on human errors by the captains of ships. It also goes into an inordinate amount of detail on multiple naval collision accidents.
- The chapter on space exploration is severely lacking in detail.
- This so-called updated edition has two additional chapters that would have had a lot of potential if the author had gone into the same detail as the rest of the book on the Chernobyl, Bhopal and Challenger accidents; however, he only skims the surface and dumps a ton of references to books by other authors.
- I found it despicable that the cover uses the Challenger explosion for marketing purposes only, since this accident is not even covered in the book.
Ali
300 reviews, 1 follower
January 19, 2025
Really interesting read, though it took me some time to get through. I don't agree with all of Perrow's points but I love the way he thinks, and it was the rare book that I could disagree with, even vehemently at points, but still come away from knowing I'm better off having read it. Stylistically I wish he gave his examples more room to breathe, but otherwise found it very smooth and enjoyed the personality of his voice just as much as the information in context.
278 reviews, 3 followers
August 14, 2025
Two stars seemed low for a book with meaningful insights, but the writing style made it difficult to get through. Many arguments were made without support, or with support that is mainly qualitative, but the main takeaway, "the dangerous accidents lie in the system, not in the components," is even more important today. Complex systems can be chaotic, and the tendency to solve all problems by adding a feature can get us in trouble.
Abby Durrant
16 reviews
September 6, 2023
even better on the second read. does spoil most nuclear power documentaries though
10 reviews, 1 follower
May 9, 2024
The analysis of various catastrophes was very interesting. However, I think this book is significantly hamstrung by the author’s claim that nuclear power should be banned - which in my opinion is not justified by his arguments. An especially egregious example of motivated reasoning was his argument that “the public” is afraid of nuclear power, and their feeling of dread should be included in our calculations - outweighing the benefits! I find it extremely unlikely that the author would apply the same argument to any other issue.
Jane Ryder
37 reviews, 38 followers
March 21, 2017
I first learned about this book from the bibliography of a Michael Crichton novel (I think -- Airframe, maybe?) more than 20 years ago, and I've read it three times since. I'm sure not everyone would find it as riveting as I did, and it can get a little dry and/or repetitive at times, but it's a fascinating exploration of complex systems.

Many negative reviews focus on the book's failings, such as the fact that Perrow's "doomsday" predictions haven't come to pass or that he doesn't offer workable solutions, but to me those are unimportant details. Taken on balance, the book is a solid introduction to concepts most of us don't give any thought to at all: what's involved in the dangerous technologies that allow us to go about our daily lives, and whether or not we can ever properly safeguard the systems we create to run those technologies.

The real takeaway for me is a better understanding of how the industrialized world actually *works*, and the role human psychology plays in it all (spoiler: sometimes we strengthen the systems and sometimes we weaken them). It's a reminder that we're part of the complex system of life on earth, not separate from it.

The "problems" Perrow identifies are often insoluble, unless humanity undergoes a global neurological evolution or the laws of physics change, but awareness can help us at least improve our ability to mange them.
AJ Armstrong
43 reviews
October 3, 2014
I had high hopes for this oft-cited work, but it unfortunately continues the perfect record sociologists have for disappointing me. While there is certainly good primary research in evidence, and the discussion of coupling in complex systems would have been valuable at the time of initial publication (it scans a bit trite now), the author clearly has an inexplicably luddite agenda. It is obvious, almost from the outset of the work, that rather than analyzing complex systems with an interest in identifying systematic risk factors and their proper mitigation, he simply advocates abandoning anything that can't be made completely risk-free (an obviously absurd requirement). In the end, the whole exercise converts a valuable system analysis into a pseudoscientific justification for ivory tower liberal mores. The snide asides he casually levels at corporations or competing theories just reinforce the desire to put it aside and read one of the many much better and more balanced treatments of the topic by Perrow's abler heirs.
Kevin J. Rogers
57 reviews, 13 followers
February 5, 2008
Dr. Perrow makes a striking point in this excellent analysis of the risks of complex technology: the very engineering safeguards that are intended to make high-risk technologies safer may in fact make them riskier by adding an additional level of complexity, one that has more to do with the perceptions and interactions of the people intending to manage the system than with the system itself. The solution, according to Dr. Perrow, is a two-dimensional analytical framework combining complex vs. linear interactions with tight vs. loose coupling. It sounds more complex than it is; Dr. Perrow makes the point that complex systems react and interact in often unanticipated ways, and the solution to managing those systems depends on the recognition of that fact. Clearly written, this book takes a potentially difficult engineering subject and brings it forward as a human question. Well done, and written for the lay reader.
Geoffry
8 reviews
May 22, 2016
I had heard quite a bit about this book, and it mainly delivered. I enjoyed the attempts to quantify system accidents. I appreciated the invention of a framework to discuss accidents, work systems, and victims. I also enjoyed the numerous case studies presented in the book, because it is in the specific cases that we can glean ideas about prevention.

Unfortunately, I found myself disagreeing with some of the main conclusions of his analysis, particularly relating to abandoning certain kinds of technology, and his assertion that cost-benefit analysis should include how people feel about a technology or a particular industrial activity.

In particular, I don't think we should abandon technology because people dread accidents or overestimate the danger of a technology. That is a PR and education issue, not an accident prevention issue.
Evelyn
692 reviews, 63 followers
April 15, 2013
(Read for ES module)
Although this looks like a dense academic textbook on high-tech systems, it's actually very accessible and absolutely fascinating to read. It has some excellent points about how over-complicating technology often leads to accidents which could have easily been prevented had a simpler method been implemented instead. Covers nuclear accidents such as Chernobyl and Three Mile Island, etc.
Tagnahoor
27 reviews, 2 followers
February 23, 2021
Wow. Thought provoking. This deeply changed the way I do my work (performance auditing). If what you do has a material effect on the way things are done, this is a must read. It's dry, it's dense, lots of it is over my head, but the moral of the fable is there and within your grasp.
Greg Stoll
356 reviews, 13 followers
June 2, 2019
As I've mentioned before, I have a bit of a fascination with airplane crashes, and several books I've read mentioned this one as a seminal work in describing how accidents in complex systems happen.

The main part of the book is setting up a system for categorizing systems. One dimension is "loosely-coupled" versus "tightly-coupled" - this roughly corresponds to how much slack there is in the system. A good example of a tightly-coupled system is an assembly line where parts are going down a conveyor belt or something - if something goes wrong to mess up a widget at one station, that widget will quickly be at the next station, where it can cause other problems.

The other dimension is "linear" versus "complex", which roughly describes the interactions between parts of the system. An assembly line with a conveyor belt is a good example of a "linear" system because the interactions between the different stations are pretty predictable. Usually the more compact in space a system is, the more "complex" it is because lots of different parts of it are close together.

Tightly-coupled complex systems are prone to what the author calls "normal" accidents which aren't really preventable. Basically, when a system is tightly-coupled you need to have a pretty strict plan for how to deal with things when something goes wrong, because you don't have a lot of time for analysis or debate. (a military-like structure can help, although obviously this can have bad consequences for organizations that are not the military) But complex systems require more deliberation to figure out what's actually going on and possibly more ingenuity to find a solution.
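
For readers who think in code, here is a minimal, hypothetical Python sketch of the two-axis classification this review describes (it is not Perrow's own chart); the quadrant placements follow the review's assembly-line example and the book's treatment of nuclear plants, while the remaining example is an illustrative guess:

from dataclasses import dataclass
from enum import Enum

class Interaction(Enum):
    LINEAR = "linear"
    COMPLEX = "complex"

class Coupling(Enum):
    LOOSE = "loose"
    TIGHT = "tight"

@dataclass
class System:
    name: str
    interaction: Interaction
    coupling: Coupling

    def normal_accident_prone(self) -> bool:
        # Perrow's claim: systems that are both complex and tightly coupled
        # are the ones subject to unpreventable "normal" (system) accidents.
        return self.interaction is Interaction.COMPLEX and self.coupling is Coupling.TIGHT

examples = [
    System("assembly line (conveyor belt)", Interaction.LINEAR, Coupling.TIGHT),
    System("nuclear power plant", Interaction.COMPLEX, Coupling.TIGHT),
    System("university administration", Interaction.COMPLEX, Coupling.LOOSE),  # illustrative guess
]

for s in examples:
    print(f"{s.name:32} -> normal-accident prone: {s.normal_accident_prone()}")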

It's interesting because in retrospect, for each particular accident, it's usually easy to see what went wrong and what the people involved did wrong (or what the organization did wrong before that point). The author's point is that most of the time blaming the people involved is missing the point - these sorts of accidents are inevitable.

Most of the book is looking at specific systems (nuclear power plants, chemical plants, airplanes, marine shipping, dams, spacecraft, etc.), trying to categorize them, and looking at examples of accidents.

(I should point out that I'm grossly oversimplifying here...)

I think I mostly agree with his points, but I really don't have the depth of experience to know how reasonable his approach is. The book was written just before Chernobyl (so the part about nuclear power plants seems prescient), but there's also an afterword written in the late 90s about the Y2K problem and how maybe everything will be fine but there will likely be unpredictable serious problems, which didn't pan out. So I dunno.

The book itself is pretty academic and was kind of a slog to get through even though I am interested in the topic.
Bill Conrad
Author, 4 books, 10 followers
February 9, 2020
We think of “accidents” as tragedies that plague our lives. A car crash where a beloved family member dies. A plane crash in bad weather kills hundreds. Normal Accidents takes a high-level view and shows us that incidents should be expected and they can be predicted.
First off, this book is not a statistical analysis (i.e., car crashes are X% likely). Rather, what Charles attempts to point out is that the more complex a system gets, the more likely an accident will occur. In addition, humans have many flaws that play a large part in causing and preventing accidents. Specifically, they attain a mindset that lulls them into a false sense of security. Normal Accidents provides a framework to recognize complex systems, and it raises awareness of the prospect of preventing problems. It also makes us consider how systems are designed, how they internally interact, how they connect with other systems and how human operators use them.
There is a lot going on in this book. It begins by taking a deep dive into the Three Mile Island nuclear incident. The initial conclusion listed the primary issue as operator error. Charles argues that the complex system had many inherent flaws and complex interactions. These factors made an accident of this type inevitable. Why? He asserts that nuclear technology is relatively new and there are only a few plants around the world of that size. Therefore, the flaws inherent in its design had yet to be discovered.
This is a great book; a powerful book. It is important for us as a species to understand what we have built, who we are and where potential problems could be. I recommended it to an Engineering friend of mine.
15 reviews
June 28, 2020
This book describes a history of industrial accidents in a variety of industries. While the descriptions and analyses of specific accidents are very interesting, the greatest value of this book is the development of a novel theory of accidents. Indeed it appears that in the years and decades that followed the publication of this book, the author's theory has become a standard against which other theories are gauged. In the subsequent literature about safety and risk, the Normal Accidents Theory (NAT) is often discussed along with its competitor, High Reliability Theory (HRT).
Perrow submits convincing arguments that complex systems are bound to fail catastrophically sooner or later - and this would include not just technological systems, but other complex systems as well (e.g. our world economy). Complex systems are analyzed with respect to the number of interconnected parts, redundant components, additional fail-safe components, how quickly anomalies will propagate within the system, system fault tolerance, the potential for damage, etc. The book concludes with a discussion about risk tolerance, and what constitutes acceptable risk. The author recommends abandoning nuclear power, figuring the risks are not worth the benefits - this is amazing, as he practically predicts the Chernobyl disaster, which came to pass only a few years after publication of the book. If you read this book in combination with Ellsberg's Doomsday Machine, you'll be seriously concerned about the survival of our species beyond the present atomic power/weapons era.
Andrew Breza
509 reviews, 31 followers
February 16, 2023
I first learned about the theory of normal accidents in grad school from my professor, Howard McCurdy. The fundamental idea is that some systems are so complex and their components are so tightly coupled that major failures are inevitable. Not only that, but many of these failures will be incomprehensible to the system's designers and operators. Investigators often blame "human error" instead of recognizing that the system itself can cause errors through its design.

I recently found myself working on a complex software system when I remembered this theory. I decided to read Normal Accidents in hopes that it would be relevant. Despite being published in 1984, it remains eerily prescient as automation plays an increasingly important role in society. The increasing use of machine learning and artificial intelligence poses a real risk of introducing normal accidents into heretofore non-complex aspects of life. As I write this review, Google and Microsoft are racing to see who can be the first to integrate a Large Language Model (LLM, a type of AI) into a general release of a search engine. This type of model is difficult to control. Both companies' recent demos featured factual errors from their LLMs. What will the consequences be when companies push immature or poorly understood models into production to satisfy demands from management? In short, the answer is "normal accidents."

I highly recommend this book to anyone working with computer systems, especially if you're a manager.
Artur
244 reviews
February 25, 2023
As a software engineer, I admire this book for the framework it puts forward for thinking about systemic complexity and tight coupling, which is originally applied mostly to physical systems like petrochemical and nuclear plants, aviation, marine transport, dams and so forth, but can be adapted and is quite well suited to my own field. In one sentence, a normal accident is something that will happen to any complex enough system that has dynamic interactions between its components, given enough time. We as a civilisation long ago reached the stage where those kinds of systems comprise the foundation of our economy and way of living, and they are not getting much simpler in the foreseeable future. Thus, the idea posed in this book is evergreen, never to lose relevance. The book itself has some minor issues, like generalisations from anecdotal evidence at times and speculation about certain systems, like the nuclear warning computer network, that does not seem correct based on what I know about computer systems. Otherwise, though, in the theoretical sections and in the review of accidents, where there is a wealth of evidence and prior research that Perrow could use, the book is sound and profoundly thoughtful. I liked the chapters on TMI, aviation and marine transport, along with the conclusion, the most; so if you don't want to spend much time on the volume, you can read those and skim through the other pages, stopping when you see something curious, and still get good value from this book.
Kevin Mccormick
8 reviews, 2 followers
February 22, 2020
Overall, I found this book interesting, but not particularly compelling. Normal Accidents starts with a thought-provoking first chapter, suggesting the central thesis of the book. However, as the chapters wear on, it becomes more apparent that his theory best applies to nuclear energy. For subsequent chapters, it starts feeling more like a square peg in a round hole. There are still plenty of individually interesting stories about catastrophic accidents in a variety of industries, which I did find educational enough to finish this book.

The central idea that Perrow suggested is still incredibly applicable today - I work in distributed software systems, and the concepts presented - system accidents, component failure accidents, tight coupling, complex interactivity, and error-inducing systems - seemed aptly descriptive of the challenges present in building such systems today. However, I probably wouldn't recommend this book for anyone who is looking to gain deep insight beyond the first chapter - it simply isn't worth the time to read the rather lengthy middle part and anti-climactic conclusion. If you're simply looking for a loosely knit collection of well-researched stories about industrial accidents, this is worth reading.
Jimbo
26 reviews, 3 followers
February 15, 2021
This is an excellent read and extremely relevant today, given the pandemic we're experiencing. I was pointed to this book by Zeynep Tufekci as a way to think about 'systems', how things are related, and how failures can cascade. The original book is from 1984, and as I was reading I thought, 'what about Bhopal and Chernobyl?' Then I get to the afterword, and there it is! This is definitely a book for which retrospect helps provide context. The Y2K treatment in the afterword is good. It is ironic that the dot-com crash would be the event that caused a recession, not Y2K. The book is somewhat dense, with lots of acronyms. I read it on Hoopla and it was not easy to reference the acronym helper at the end of the book. I'd love to see this book revisited, bringing in the 737 MAX failures as well as the overall coronavirus response. I would have liked to have kept better notes as I read, but was reading and discussing with a friend. The book can be terrifying at times, and if you have fears of nuclear war, airplane crashes, or other disasters, I would not read this book. I thoroughly enjoyed this and learned a lot from it.
João Paiva
44 reviews, 6 followers
November 6, 2022
This is a book focused on accidents. It subscribes to the theory that when a system is complex and highly coupled, system accidents are unavoidable. To show this theory in effect, the book compares accidents across a wide range of fields, including nuclear power and nuclear weapons, marine and air transport, chemical plants, dams, mines, space exploration, DNA manipulation, and others.
I really appreciated the diversity and depth of the accidents that are explored in this book. I also like how the author clearly distinguishes system accidents (caused by high complexity and coupling) from component failures. The book also focuses very strongly on nuclear power (and the Three Mile Island accident in particular). This is good and bad: I learned more about the dangers associated with nuclear power (and about what happened at TMI), but at a certain point it almost starts to feel like nuclear power is the prime single example of highly dangerous technology. Overall, I enjoyed this book even though at times (particularly towards the end) it felt a bit repetitive and a bit like a long rant against nuclear power.
8 reviews
August 23, 2025
First of all, you have to take into account that this book was released in 1984, before more prominent accidents like Chernobyl or Fukushima, which would lend themselves very well to this topic. Technology has changed quite a bit since then, so sometimes reading the book feels a little like picking it apart to identify the parts that aged well.

This was not a short or an easy read; there are a lot of detailed analyses of (by now very old) accidents to chew through. Still, I was sometimes impressed to read about major accidents that I had never heard of.
I still find the book interesting and worthwhile reading, though. I was definitely reading this with my background in software engineering, where an industry strives to build distributed systems but ends up with constant "system accidents" due to complexity and tight coupling, two points that continuously show up in Perrow's analysis too.
44 reviews
January 1, 2021
Overall Impressions
It’s a solid theoretical work on risk and organizational behavior, but a difficult read unless you enjoy lectures delivered with all the pompousness and self-assured arrogance of a late-20th-Century white male academic.

The use of first-person “I” statements grew annoying, as did the repeated use of the word “elites” to describe powerful people making decisions (usually to force risks onto the many for the profit and benefit of the few.)

Some Significant Points
• Accidents can result from multiple failures of system components including Design, Equipment, Procedures, Operators, Supplies & materials, and Environment, abbreviated with the acronym DEPOSE. (p8)

• Systems can be tightly-coupled or loosely-coupled, and Linear or Complex, then plotted on a matrix for analysis.

• The Normal Accidents Theory says that large accidents caused by the interactions of multiple small failures are inevitable. The consequences vary depending on the Linear vs Complex and Loose vs Tight coupling.

• Common Mode Failures occur when one component affects more than one process, e.g. a pump used for both circulating coolant in one system and providing necessary heating to another. It's an efficient design, but if the pump fails then both systems are affected (see the sketch after these notes).

• Some systems are inherently error-inducing and others error-avoiding. An example of an error-avoiding system is air travel: pilots, the public, and airlines are all punished by crashes and have an incentive to prevent them. For maritime shipping that is not the case. It's rare for an individual ship to sink, the consequences are also not felt by consumers directly, insurance pays the owners, etc.

• Safety devices are another system that can fail and interact in unexpected ways, often because they are added later and are not part of the original design.

• Safety devices often allow more risk and the accident rate remains unchanged. Ships used to slow down in the fog. Radar allows higher speeds which reduces the time to correct course and avoid collision. Also, in good weather it allows higher speeds which makes the consequences of collisions greater.

• In crisis situations we make mental models to reduce ambiguity so we can take action. As information comes in, we try to fit it into the mental model. If it doesn’t fit, we are more likely to question or reject the information than to alter the model.

This is how two ships on courses to safely pass each other have collided due to a last minute maneuver where one ship crosses directly into the path of another. (Which happens alarmingly more often than you might expect.)

Example. Ship captain thought he was overtaking another ship traveling in the same direction at night. Thought he would pass on his starboard side (the port side of other ship). In reality the two ships were going in opposite directions toward each other.

Staying on course would mean no collision. As the gap narrowed, the captain could see the ships were too close, so he adjusted to port, thinking it would create more buffer as he overtook. The result was the ships being closer than ever, so he adjusted again, hard to port.

Mental model was flawed so instead of his actions increasing the space, he crossed directly in front and caused a collision.

• A lot of accidents are caused by bias toward continued production. Keep the process running. When something starts going wrong we search for a minimum impact explanation first and don’t consider the catastrophic. (p277)

• “The main point of the book is to see these human constructions as systems…” “…the theme has been that it is the way the parts fit together, interact, that is important.” (p351)

• There is a discussion of expert risk assessment being different from public risk assessment. The differences are influenced by dread, consequences, and catastrophic potential.

• Insurers used to insist on safety measures that lowered their exposure to payouts. This collaterally benefitted workers and communities. As insurers shifted to gaining profits from finance, they reduced inspections and insured more operations to gain more dollars from premiums to invest. This led to more accidents due to lower standards, but it was acceptable to insurers because it was affordable. (p361)

• Risks and injury statistics can be outsourced by hiring subcontractors. The plant appears safer because the company only reports injuries to their employees. (p362)
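
As a tiny code sketch of the common-mode-failure point noted above (one shared component serving two otherwise independent processes, so a single failure degrades both), with all classes, names, and the example processes invented purely for illustration rather than taken from the book:

class Pump:
    def __init__(self):
        self.working = True

    def fail(self):
        self.working = False

class Process:
    def __init__(self, name, pump):
        self.name = name
        self.pump = pump          # shared dependency = common mode

    def healthy(self):
        return self.pump.working

shared_pump = Pump()
coolant_loop = Process("coolant circulation", shared_pump)
feed_heater = Process("feedwater heating", shared_pump)

shared_pump.fail()  # one component failure...
print(coolant_loop.name, "healthy:", coolant_loop.healthy())  # ...takes out
print(feed_heater.name, "healthy:", feed_heater.healthy())    # both processes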

Conclusions
The material is strong, the theory is solid, and the expertise is unquestionable (he participated in the investigation of the Three Mile Island nuclear plant accident), but the style made reading the book an exercise in perseverance in the later chapters.

The tone was matter-of-fact and not condescending (mostly), but thick with unconscious biases that were common at the time but have grown increasingly unacceptable twenty years into the new century.

For example (emphasis added):

“We do not know the extent to which… [production schedule demand forces errors] …or, on the other hand, the extent to which there is a ‘macho’ culture that provides psychic rewards for risk taking. I am sure that the first exceeds the second; a risk-taking macho culture has probably developed to make sense of…” (p246)

“…rather like the ritual with a cannabis joint.” (p359)

“…as an organizational theorist, I am familiar with all the problems that can occur and the mistakes that can be made…” (p388)

The lack of humility and the confidence in wild assumptions are grating, and this is not the first time I've been annoyed in that way but still found value in the material.

[original review 7/2020; edit 1/2021 to correct spelling only]