When the Space Shuttle Challenger exploded on January 28, 1986, millions of Americans became bound together in a single, historic moment. Many still vividly remember exactly where they were and what they were doing when they heard about the tragedy. Diane Vaughan recreates the steps leading up to that fateful decision, contradicting conventional interpretations to prove that what occurred at NASA was not skullduggery or misconduct but a disastrous mistake.
Why did NASA managers, who not only had all the information prior to the launch but also were warned against it, decide to proceed? In retelling how the decision unfolded through the eyes of the managers and the engineers, Vaughan uncovers an incremental descent into poor judgment, supported by a culture of high-risk technology. She reveals how and why NASA insiders, when repeatedly faced with evidence that something was wrong, normalized the deviance so that it became acceptable to them. In a new preface, Vaughan reveals the ramifications for this book and for her when a similar decision-making process brought down NASA's Space Shuttle Columbia in 2003.
First, I have to commend Vaughan for a stunningly researched and wonderfully structured book. It's extremely academic and linear - two qualities that happen to work for me, especially when tackling a subject as complex as The Challenger.
I found her sociological argument to be compelling and well made, though not always sufficient. Several times it seemed to me that she minimized individual mistakes to make her grander point about organizational culture. She also undervalued the warning memos written by engineers about the O-rings, which were very clearly written and appropriately worrisome. And, she actually hid in the appendix the crucial fact that blow-by increased after they were forced to use a new putty!
I think her points about the culture at NASA mostly hit the mark. But I do not think you can blame the culture entirely. There were individuals who were making the decisions to "normalize deviance," to use her terminology. There were individuals who ignored the pleading of experienced engineers who had intimate knowledge of the SRBs and previous experiences with blow-by. At what point were these managers rationally expected to break out of their cultural stereotypes and make a responsible decision? Because there had to be such a point. I think it was reached during the teleconference, but Vaughan seems to be giving them the benefit of the doubt. (And, really, who am I to say otherwise? Having the benefit of hindsight and having been influenced by reading too much Feynman, I was probably biased going into this...)
Still, this is quite a book. I can't stop pondering the appropriate use of engineering intuition, that gut feeling without hard data behind it, in making important, game-changing decisions. And how does one successfully navigate its use in a hardcore science context? It's a fascinating question.
A remarkable and even mind-blowing book, but I have trouble recommending it wholeheartedly because it is too wordy and it features some important errors and lapses.
-First Vaughan disproves the conventional wisdom that NASA managers failed to listen to clear warnings from engineers. She's very convincing about this--enough said. OK, so what did go wrong?
-Not yet. Next we get a detailed history of the Shuttle Program with a focus on NASA's overall style of bureaucracy and problem-solving. Turns out their procedures are very complex, full of internal jargon, etc. Duh. The important aspects here are what Vaughan documents about the momentous transition from Apollo to Shuttle: a transformation from an R&D outfit sending heroic astronauts out on dangerous moonshot missions to a sort of routine transport business that takes passengers and cargo to a destination that just happens to be outer space. The NASA jargon for this is that they went from "developmental" to "operational" technology. The Apollo deaths were not catastrophes requiring multiple outside investigations, because everyone knew the program was risky. The Challenger deaths were a scandalous public disaster because there was a schoolteacher on board. Getting on the Space Shuttle was supposed to be like getting on a commercial airplane.
-Vaughan shares a lot of sociology theory to categorize what's going on, but I did not find that very helpful. It gets worse as we get to the conclusion, which talks a lot of gibberish about scientific paradigms. Belief in redundancy of O-rings when you have a primary O-ring and a back-up O-ring is not a paradigm--it's a tautology. A paradigm is something like "the Earth is the center of the Universe."
-Eventually, we get a blow-by-blow of the actual launch decision. There's the typical cascade of errors one would expect in any disaster (like the fact that the most knowledgeable NASA O-ring guy was never involved because people thought he was out of town and didn't bother calling his home). Vaughan provides the actual hand-written charts shown at the meetings on the eve of the launch. The Thiokol engineers did a very poor job making their case to NASA that it was too cold to launch the Shuttle. They admit that.
-[This demolished another urban legend I had been taught: that the problem was bad PowerPoint slides, with the key information buried in too many bullets. First of all, they weren't PowerPoint slides. Secondly, the information presented as a whole was bad. The message about no-go in the cold was totally clear to everyone, but the content supporting the recommendation was objectively weak, and therefore unconvincing. Vaughan does not address the PowerPoint story.]
-In Chapter 13, Vaughan finally reveals the big mind-blowing thing. This was not a problem of "uncertainty" and difficult decision-making in real time with inadequate information, blah, blah, blah. The data that would have convinced everyone were there the whole time, but no one involved ever asked for them. The graphic that was needed at the pre-launch meeting (not to mention months before that) was not put together until after the disaster, and it was done by the investigation staffers. This is a very simple graph illustrating how the only Shuttle missions with zero O-ring damage occurred at temperatures over 65 F, so below that temp, damage was much more likely. One explanation we get for the failure to conceive of this graph was that every single individual at NASA, Thiokol and the rest was "not smart enough" to think of it, according to one engineer.
-Note: factual error by Vaughan, who says the people who made the graph were non-engineers. Alton G. Keel, Jr. was an aerospace engineer and staff director of the Presidential Commission. This is a pretty huge thing to get wrong, so it makes me question her thoroughness and accuracy overall.
-For me, the key mystery is why that graph was never made, because all those people were not that stupid. What was lacking was a 2 x 2 table: +/- cold as the input variable, and +/- O-ring damage as the output (a quick illustrative sketch of such a table follows at the end of this review). This is super-basic science. It's related to the concepts of the "critical experiment" or the control group, or whatever. Instead of a 2 x 2 table, the Shuttle people had a 2 x 1 table: +/- cold as input but only + damage as output. I'm not an engineer, but any basic textbook of scientific evaluation methodology will start with 2 x 2 tables. So what made all those people act as if they didn't know this?
-In an endnote, Vaughan quotes a paper suggesting that there is a total deficiency in learning about covariate analysis in the entire engineering profession. She rejects that without explanation. I have to agree it seems preposterous. Her explanation for the collective failure to make a 2 x 2 table is that it was easy to think of only in hindsight. I find that hard to believe for the reasons explained above. Moreover, I was disappointed that this most crucial point gets just a footnote in this very long book. So what's a better explanation for the shambolic non-analysis of cold vs. O-ring damage?
-I would guess the central problem was a culture of corruption and incompetence. (See Detroit: An American Autopsy.) Vaughan describes how the entire Shuttle program was based on the "myth" that they had achieved routine space travel. If your organization is all about bull****, then either you call bull**** or you don't. Some scientists outside NASA had been calling bull on the whole manned space program, but it seems like nobody inside NASA or any of the contractors did. Maybe those who did couldn't stay. If you stay and go along with the BS, then you have to live in denial and start corrupting information and performing in incompetent ways. In that kind of environment, there's a feedback loop of increasing incompetence, and so people stop thinking hard because the cognitive dissonance is too painful. Vaughan's confused explanations about "scientific paradigms" and whatnot make it seem like the problem was too much attachment to science/data/quantitative evidence, etc. But if you know science, that makes little sense, because the problem was fundamentally bad scientific practice by everybody. Belief in a myth is pretty much the opposite of science. The giant safety conferences that Vaughan describes in excruciating detail seemed like elaborate "scientifical" theater to me. Vaughan makes a big deal about how adversarial these sessions were and how the NASA engineers asked tough but fair questions that pointed out how weak the Thiokol presentation was, and that this was how the system was supposed to work. Except if nobody anywhere in the system over a span of months is asking for the equivalent of a 2 x 2 table, then no, it's not working as a serious scientific assessment.
-The question then isn't so much How could the Space Shuttle explosions (plural) have been avoided? It's more Would we have been better off saying No to the whole mythological shuttle program in the first place? and, failing that, Could we have been honest about how this was a risky experimental space rocket and not put schoolteachers on it until it really was as safe as a Subaru?
-More broadly and still of great relevance today: When will our corporations, government agencies and other institutions learn from these sorts of catastrophes, stop bull****ting and start building trust by producing useful results? In 1964, in the time of the Moonshot program, more than 75% of Americans trusted the federal government; now it's about 25%.
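-For what it's worth, here's a minimal sketch of the kind of 2 x 2 tabulation I mean. The flight records below are made-up placeholders, not the actual shuttle data, and the 65 F cutoff is just the threshold the post-accident chart highlighted:

```python
from collections import Counter

# (launch_temperature_F, had_o_ring_damage) -- hypothetical placeholder values,
# NOT the actual shuttle flight record
flights = [
    (53, True), (57, True), (63, True),                  # cold launches, all damaged
    (66, False), (67, False), (70, True), (72, False),   # warm launches, mostly clean
    (75, True), (76, False), (79, False),
]

COLD_CUTOFF_F = 65  # threshold the post-accident chart made obvious

# Build the 2 x 2 table: cold/warm as input, damage/no damage as output
table = Counter(
    ("cold" if temp < COLD_CUTOFF_F else "warm",
     "damage" if damaged else "no damage")
    for temp, damaged in flights
)

for row in ("cold", "warm"):
    damaged = table[(row, "damage")]
    clean = table[(row, "no damage")]
    print(f"{row:4s}  damage={damaged}  no damage={clean}  "
          f"damage rate={damaged / (damaged + clean):.0%}")
```

With the damage-free flights included, the temperature split jumps out of even this toy table; with only the damaged flights, it doesn't. That is the whole 2 x 2 vs. 2 x 1 point.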
Thick book that fascinates because we all think we know what happened on the fateful day the Challenger exploded: engineers requested a hold and managers overrode them to keep the schedule. What Diane Vaughan, a sociologist, asks to great effect is why we believe committed professionals made a choice they knew could be a public catastrophe. The more you ask that question, the more you wonder what really happened.
This book answers that question with one phrase that resonates: normalization of deviance. In a culture that accepted high risks to begin with, there was an increasing acceptance of data that outsiders would see as deviant. Eventually the cycle of normalization has the culture operating so far outside the rational that a catastrophe occurs.
A truly fascinating insider look at the decisions that led to a national tragedy of which we're all aware.
A really thorough and interesting "ethnographic" analysis of the culture at NASA that led to the 1986 Challenger catastrophe. The culture described is relevant for other engineering settings but also for understanding the dynamics between managers and experts as well as "silo culture". I would recommend this book to anyone who believes "toxic managers" and "hierarchy" are the culprits behind adverse culture. Indeed, what I found most interesting is that we are not dealing with a classical "Tayloristic" culture at all - they had lots of feedback frameworks in place that we would call "modern", and still it happened. And, for readers from 2021, it is a takeaway that in the crucial launch decision meeting, one adverse factor seemed to be that participants were not able to see each other (the teleconference was audio only).
If the book were shorter, I would have given it 5 stars. As it is, with 400 pages in total (without the appendix!), some chapters are quite hard to read and the topic is sometimes analyzed a little too deeply for my taste. Anyway, I am now an expert on O-rings and solid rocket boosters :)
It is difficult to overstate what an extraordinary volume this is. Vaughan, over nearly 600 pages, meticulously and systematically reexamines the political, cultural, and technical context of the signal technological failure of the space age.
This is not light reading—at times the author's desire for rigour crosses a line into rather tiresome repetition—but it is nonetheless well written. Technical concepts are clearly explained, and the seemingly endless complications of NASA process and culture are laid out in detail at appropriate moments. For most of the book, Vaughan replays in "ethnographic thick description" (her term) the working patterns and experiences of the Solid Rocket Booster teams at Morton Thiokol and NASA's Marshall Space Flight Center. It is through these detailed descriptions that she explains the process, jargon, and cultural context apparently missed or misinterpreted by the Presidential Commission, and how these aspects contributed to the tragic 1986 launch decision.
The Space Shuttle project is one of the largest and most complex engineering projects ever undertaken, and this book convincingly dismantles a central tenet of Challenger folklore: that the disaster was simply explained and could have been easily avoided. Vaughan argues that this represents a gross misunderstanding of the reality: namely, that the disaster's proximate cause (the failure of an SRB joint due to cold launch temperatures) can only be understood in a much broader and more complicated cultural context. The cultural prerequisites for the joint failure were harder to detect, and harder to address.
In all, a must read for anyone interested in the history of organisational dysfunction, the role of culture in technical organisations, or the complexity of technical systems failures.
Right up front, know that 'The Challenger Launch Decision' is an academic book that takes a sociologist's look at the cultural factors leading to the flawed decision to launch the Space Shuttle Challenger on January 28, 1986. In other words, this is a heavy lift.
That said, I tore through it in just a few days. And I'm not even a sociologist or engineer.
This is a book that's heavy on engineering, sure. But its real insights are into how extraordinarily intelligent, exquisitely well-trained, conscientious, and responsible people can get something as important as the decision to launch the Challenger that day so catastrophically wrong. To that end, author Diane Vaughan explores the culture not just of NASA, but of its contractors. She explicates how the precursors to the disaster became normalized, and how the culture of spaceflight left no room for engineers' gut feeling that it was wrong to launch the spacecraft in the cold temperatures of that morning. (In effect, engineers couldn't walk into a room and say, "I have a bad feeling about this"; to make virtually any assertion, they needed reams of data to back themselves up.)
Additionally, the author attacks the popular narrative of "bottom-line oriented managers overruling truth-telling engineers" as the explanation for the disaster. She illustrates how experimental space flight still was, and how there was no consensus among engineers regarding the best way to deal with the shuttles' now-infamous O-ring problem.
If you're involved in high technology in any way, this book is a must-read. If you're involved in running large government or private-sector institutions, this is also a must-read. It's dauntingly huge, sure. But if you're like me, you'll blow right through it.
I am a non-academic and someone without a strong mathematical background. At the same time, I read quite a lot about neuroscience and some dense history. All that said, this book was incredibly hard work. While it offers incredibly insightful content about groupthink and the fatal inability to differentiate between accuracy and precision, this book could use some pruning and a strong editor. Only a read for the interested academic or someone with a hearty appetite and a good dictionary.
Truthfully, I would more highly recommend Allan McDonald's "Truth, Lies, and O-Rings" over this book. I was excited to read it, but found it to be disappointing. While I'd agree that the decision to launch Challenger was more complex than we think, I also think that the author misrepresents many parts--especially McDonald's role in the launch decision--so much so that I couldn't really recommend this one over just about any other book on the topic.
This book, while not light reading, is an important work. It is one of the best investigations into the Challenger disaster out there and will be appreciated by those interested in that story from history. This book, however, is not just a history book. It explores how bureaucracies work and why they often fail. It reminds the reader that a bureaucracy's failure is not necessarily the failure of one man in particular, or a result of gross incompetence at any step in the decision making process... that, often, these failures are rooted in the systems and cultures within which people work. That message can be eye opening, and is applicable to better understanding the workings of other government bureaucracies, and even, perhaps, our own private employers. This is one of those rare books that can change the way you look at the world around you.
There was a lot of information presented in this book, but it needed serious editing. The book was too dense and wordy and was in serious need of concise writing, which would probably reduce the page count by 100 pages. It comes down to the conflict between safety and the pressure to launch. It comes down to the space shuttle program tempting the laws of physics too many times. If the temperature is too cold, below the rated temperature of the O-ring seals in the solid rocket booster, you should not launch until the temperature is more favorable. The author could have made an even stronger case that a delay of a few hours to a middle-of-the-day launch would have made more sense. The author uses social analysis of the management structure, and it could have been much stronger with more concise writing. I am not able to read this book in its entirety - I find myself skimming the pages.
As I understand it, this is the seminal report on what caused the Challenger tragedy. It's interesting, although 600 pages long (although only 450 pages before the appendices!) and very technical and detailed, so know what you're getting into!
No one questions the technical cause of the disaster: the shuttle launched in extremely cold temperatures, and the O-ring joining the segments of the Solid Rocket Booster failed to seal because it was made of a rubber-like material that gets stiff at cold temperatures.
The Rogers Commission was formed by President Reagan to investigate the accident, and it more or less blamed middle management at Morton Thiokol (the contractor responsible for the Solid Rocket Boosters) for not listening to warnings from their engineers, in violation of NASA rules at the time.
The book is structured as a straightforward retelling of the Rogers Commission's report, then a deep look at the history of the Solid Rocket Booster program and its procedures, and finally a return to the report, interspersing it with more details now that the reader has more background. It's a very compelling technique!
Vaughan dives much deeper into the history of the Solid Rocket Boosters and comes up with the following conclusions:
- No rules were really broken by Thiokol. While some engineers were convinced the cold weather would be a problem for the Solid Rocket Boosters, the data was weak and noisy, and some engineers thought it would be fine. And even the ones that thought it would be a problem just thought it would cause more damage to the O-rings, not that anything catastrophic would happen.
- The biggest problem is what Vaughan calls the "normalization of deviance". The O-rings were problematic from the beginning in various ways, and they had problems with erosion and blow-by from the second shuttle flight. This kept happening intermittently, but because the shuttle had launched safely so many times, they became convinced the damage to the O-rings was normal and not a real threat to the safety of the shuttle. In Vaughan's words, they "redefined evidence that deviated from an acceptable standard so that it _became_ the standard".
- Another issue was that the engineers kept trying things to fix the O-ring problems, and were convinced for a while, when they happened to have a shuttle flight with no damage, that they had fixed it, right up until the next problem happened. They didn't have as much funding as they wanted (like any engineering project...), and it seems like they didn't have a good way of testing the O-rings in a realistic situation other than launching the shuttle.
- They knew the O-ring had problems, but redesigning it would have led to more uncertainties; such is the way of engineering.
- The worst damage they had seen to the O-rings occurred at the coldest temperature they had launched at (53 degrees F). However, the second-worst damage they had seen happened at a much warmer temperature of 75 degrees F.
The night before the Challenger launch there was a special teleconference between Thiokol and NASA (specifically the Marshall Space Flight Center) to talk about the low temperatures. The temperature was forecasted to be 28 degrees F, significantly colder than they had ever launched at before. (later investigation found that the right SRB, which had the O-ring that failed, was actually at 8 degrees F) Thiokol recommended not launching, and that in fact the shuttle shouldn't launch if the temperature was lower than 53 degrees F. (the previous coldest temperature) NASA pushed back a little, and Thiokol got off of the call to talk amongst themselves for a while, after which point they reversed their decision and recommended launching.
The problem here was that the evidence was shaky about the relationship of temperature to O-ring damage; as mentioned above, the second-worst O-ring damage happened at warm temperatures. If they had looked at all their data points they would have seen that the O-rings always had problems at temperatures below 65 degrees F. (and only ~20% of the time at temperatures above that)
NASA was also used to running meetings where you had to have solid technical arguments to say that your system was safe for a shuttle launch; the meetings were famous for being adversarial and allowing anyone to challenge presenters. This teleconference was run in a similar way, and some Thiokol engineers felt like they had to _prove_ that it was unsafe to fly, which they didn't have compelling enough evidence for.
There were also some technical difficulties with the teleconference itself - it was done over the phone with no video, and in fact there were more engineers that had hesitations about launching, but they thought the argument had been made and they had lost, and didn't communicate with each other much. The people at Marshall were actually surprised Thiokol reversed their decision and said it was safe to fly, but after they did that no discussion was had; the Marshall folks didn't know whether Thiokol had come up with more compelling data that it was safe or what. (they had not) Vaughan groups some of these under the category of "structural secrecy"; I'm honestly not sure I'm doing it justice here.
Near the end of the book, Vaughan mentions another book she wrote about how relationships fall apart, and points out that the normalization of signals of potential danger contributes there too! People in a relationship see a problem but then the problem goes away, so they assume the problem wasn't a big deal, when in fact it's a big warning sign.
Long but worthwhile research piece, and the origin of the phrase "normalization of deviance." It has been a while since I've read this, but I don't remember the prose as being stellar. Still, the sociology nerd in me is fascinated by the methodical way in which the author unpacks the process by which good people not only make bad decisions, but where good decisions are structurally prevented by the nature of the bureaucracy. Disagree with your boss!
As a safety representative in a large organization with codified methods for submitting safety concerns this has been essential reading to my continued education. This mishap was not about the O-ring. It was about the paradigm in which evaluations of the O-ring were made. A structured, seemingly thorough way of determining safety concerns which showed its problems in one large explosion.
It was a crisp early afternoon. I was the public affairs specialist for the Annapolis Naval Station. It was my first job as a journalist. I had recently graduated from the Defense Information School at Fort Benjamin Harrison in Indianapolis, Indiana, and Annapolis would be my first assignment after changing rates from Aviation Machinist's Mate to Journalist. I really loved working on the A-6E Intruder, an all-weather, medium-range attack bomber that could carry a payload similar to that of a B-52 bomber - the only other aircraft with those bragging rights.
So clearly, I was into aviation. On this day, the space shuttle Challenger was going to launch and I had the opportunity to watch it live. I rushed home to my Naval base housing at 4B Fig Court, where my wife at the time was home. She was busy in the kitchen making me lunch as I sat in my recliner, with my uniform still on, since I was headed back to work after the launch. It was nearing the time I had to return, as I only got an hour for lunch, so I waited with anticipation as the launch sequence was announced on television. It was really neat to be able to watch this launch on TV because I was a new journalist, open to all sorts of newsworthy stories. I wondered how I could write a story for the newsletter I was creating for my new command and spin a space shuttle story in there. How could I make it relevant? What would be my military tie or, even more importantly, how could I connect the people on the ship with my new duty station? Had any of the astronauts attended the U.S. Naval Academy? If so, that would be the tie.
3, 2, 1. Lift-off. We have lift-off. 73 seconds later, I summoned my wife. I said, "Hey, come in here, something doesn't look right!"
That was 33 years ago.
As I read the book, I just shook my head in disgust.
If you lived through this event -- I mean really lived through it (not just that you were alive when it happened) -- then you should read about the space shuttle Challenger and the events that led to its destruction and the lives that were lost.
I went on quite a journey with this book. Vaughan offers some incredibly valuable insights about how organisational cultures drive poor decision-making, even in people who are hard-working and decent. I learnt much about what the Presidential Commission on Challenger didn't consider, or didn't emphasise as important, regarding the decisions that were made to launch the Challenger and that subsequently killed seven astronauts.
It's interesting that Vaughan posits that the decisions made to launch the ill-fated Challenger were complex and nuanced, because I often found she herself does not offer very much nuance in her theories. She offers two positions: one of managers as "amoral calculators" (and boy does she beat the reader over the head with this term), and the other as engineer-managers who were incredibly bright, making decisions congruent with a "normalization of deviance" due to organisational culture and pressures. There is little room for anything in between. She pays little attention to any discussion of individual accountability or wrongdoing, and I found that exasperating.
My view, however, was coloured by my repeated viewing of the Challenger documentary on Netflix. I found Mulloy and Lucas to be deeply unsympathetic individuals in said documentary, and I still feel that way, although I have more understanding for Mulloy since reading this book. Lucas? Erm, no...
The book is long (too long), repetitive and technical. It desperately needed an editor. If you want to read a more accessible book about the story of Challenger, please read Adam Higginbotham's book, "Challenger: A True Story of Heroism and Disaster on the Edge of Space." 5/5.
I read this book for a book report for my Contemporary America class, and I enjoyed it. It was super informative, and includes probably absolutely everything there is to know about the Challenger disaster. I found a lot of information about the different branches and positions at NASA to help me with future careers that I am interested in, so I am very glad I read this book. I found out a lot about the mission that I didn't know before. Vaughan's writing style is interesting and she keeps the material from being dry for the most part. She tended to drag out certain aspects at some points, but it is a very technical part of history, so that was expected. I really appreciated the fact that she did not just focus on the technological aspects of the tragedy, and incorporated the psychological and sociological aspects of the launch decision making process. Overall, I am glad I read this book, especially considering that this is the event that got me interested in space exploration.
Interesting and convincing interpretation of events, but tedious, academic sociology writing drags it down. The most interesting parts are at the end. The Presidential Commission had two non-engineer members that produced a chart with data that Thiokol had on launch eve but did not make, showing that all launches below 65F had problems with the O-rings. They did this by adding in the failure-free launches. Big difference; if that had been presented in the launch eve telecon, the outcome might have been different. Bayesians care about the denominator.
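In rough terms, and using the approximate figures cited in these reviews rather than the official tallies, the comparison the launch-eve charts never showed looks something like this:

```latex
% Approximate figures taken from the reviews here, not official counts:
% every launch below 65 F showed O-ring problems; only roughly one in five above did.
\[
  P(\text{damage} \mid T < 65\,^{\circ}\mathrm{F}) \approx 1
  \qquad \text{vs.} \qquad
  P(\text{damage} \mid T \geq 65\,^{\circ}\mathrm{F}) \approx 0.2
\]
```

Dropping the failure-free flights removes the denominators entirely, which is exactly why the contrast never appeared on the launch-eve charts.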
The O-rings that failed had charred through, but might have held and the vehicle would have reached orbit had it not been for wind shear at altitude that opened up a gap. That is the reason it did not blow up on the launch pad, as only one Thiokol engineer feared. (The rest who opposed launch thought only there would be erosion.)
It took me a while to finish this book, but this is one of the fundamental works in safety science that any safety practitioner should read. Years ago, I was at a conference led by one of those very expensive safety experts, and she talked a lot about the Challenger and how the various decisions and small deviations created small fractures in the fabric of the socio-technical organisation until the disaster happened. At the time, to be honest, I did not understand the message, since I was caught up in the righteousness of R. Feynman's minority report on the investigation (he got several things wrong, as it happens...); now, having read this after Perrow and a ton of other books on safety science, it all makes sense. By and large, this is how to do an incident investigation and how to capture lessons learnt.
An incredibly well researched piece on engineering culture that should be required reading for all engineers. Puts forth a revisionist explanation for the decision to launch STS-51L, based around an immersive account of NASA and Thiokol working group culture, which shows that the Challenger disaster was not the product of launch pressure by NASA middle managers (as traditionally believed), but rather of the overall technical and bureaucratic culture at NASA, which led to normalization of deviance in SRB performance. It is a sharp warning to anyone embedded within an organization about the dangers of becoming enculturated and blind to the gradual acceptance of risk. However, it is quite wordy throughout, which made it a little tedious to get through at times. Certainly worth the read despite that caveat.
Diane Vaughan's work about what happens in hierarchy-heavy organizations that deal with 'unruly' technology is well-researched and insightful, albeit a bit thick. The phrase 'normalization of deviance' draws multiple parallels in my line of work (which is a far cry from rocket science), which just shows the principle is sound and something to keep a cautious eye out for. Not to mention how a work group operating under looming delivery pressure, constructing and normalizing risk, and, most important of all, conforming to rules can be disastrous, even when it is backed by well-intentioned, brilliant minds.
As a 90s kid truly interested in all things space related, I was excited to read this book. Once it became discounted, I downloaded it immediately. Although informative, it was very wordy and the author could have made the majority of her points more concisely. Quite a bit was redundant, causing me to skim the pages because I felt I was reading the same paragraphs multiple times. Mr. Allan McDonald's "Truth, Lies, and O-Rings" is THE book to read as it pertains to the Challenger tragedy. I highly recommend his account - not this one. Not to say that everyone won't enjoy it, but I found it tedious. God rest gently the Challenger crew🇺🇸
Brilliant - strangely perhaps I related to this as the non-fiction equivalent of Middlemarch. It stuns with its capacity to capture all the nuance, complexity, ambiguity and richness of a point in time and place.
5 stars for content; the main issue is that it's written like a structured academic paper and repeats the same points several times over. There are new bits of information sprinkled throughout the whole book (so read the whole thing), but it feels like reading the same chapter 10 times.
An amazing book that provides a thorough overview for a better understanding of the Challenger disaster. Challenging traditional narratives, Vaughan highlights the formation of organizational culture and its slow-moving effects on operational safety.
A classic text in STS (science and technology studies), Vaughan's book is a terrific examination of the social and cultural factors that led to the Challenger disaster. I often teach excerpts of this book in more advanced classes to illustrate how groupthink functions in technical networks.
Great historical account of the decision-making process and failures in the culture surrounding Challenger. We can only hope that current human spaceflight leaders in government and commercial space learn from those lessons. Highly recommended for anyone in the human spaceflight industry.