
The Ostrich Paradox: Why We Underprepare for Disasters

" The Ostrich Paradox boldly addresses a key question of our Why are we humans so poor at dealing with disastrous risks, and what can we humans do about it? It is a must-read for everyone who cares about risk."
—Daniel Kahneman, winner of the Nobel Prize in Economics and author of Thinking, Fast and Slow

We fail to evacuate when advised. We rebuild in flood zones. We don't wear helmets. We fail to purchase insurance. We would rather avoid the risk of "crying wolf" than sound an alarm.

Our ability to foresee and protect against natural catastrophes has never been greater; yet, we consistently fail to heed the warnings and protect ourselves and our communities, with devastating consequences. What explains this contradiction?

In The Ostrich Paradox, Wharton professors Robert Meyer and Howard Kunreuther draw on years of teaching and research to explain why disaster preparedness efforts consistently fall short. Filled with heartbreaking stories of loss and resilience, the book explores:


• How people make decisions when confronted with high-consequence, low-probability events, and how these decisions can go awry
• The six biases that lead individuals, communities, and institutions to make grave errors that cost lives
• The Behavioral Risk Audit, a systematic approach for improving preparedness by recognizing these biases and designing strategies that anticipate them
• Why, if we are to be better prepared for disasters, we need to learn to be more like ostriches, not less


Fast-reading and critically important, The Ostrich Paradox is a must-read for anyone who wants to understand why we consistently underprepare for disasters, as well as for private and public leaders, planners, and policy-makers who want to build more prepared communities.

132 pages, Paperback

Published February 7, 2017


About the author

Robert Meyer

152 books, 4 followers

Ratings & Reviews



Community Reviews

5 stars: 44 (22%)
4 stars: 84 (42%)
3 stars: 56 (28%)
2 stars: 15 (7%)
1 star: 1 (<1%)
Displaying 1 - 28 of 28 reviews
Michael Burnam-Fink
1,725 reviews, 304 followers
July 6, 2017
Meyer and Kunreuther synthesize years of experience to develop a model for why individuals and organizations consistently fail in the face of crisis, and offer some brief policy recommendations. Their work is anchored in cognitive psychology, particularly Kahneman's Thinking, Fast and Slow. System 1 is fast and intuitive, but prone to serious cognitive biases. System 2 is deliberative, but requires good information, which is lacking in a crisis. Six cognitive biases in particular prevent preparedness for disasters.

To quote:
1) Myopia: a tendency to focus on overly short future time horizons when appraising immediate costs and the potential benefits of protective investments, especially the hyperbolic discounting of upfront costs now versus harms years in the future.
2) Amnesia: a tendency to forget too quickly the lessons of past disasters.
3) Optimism: a tendency to underestimate the likelihood that losses will occur from future hazards.
4) Inertia: a tendency to maintain the status quo or adopt a default option when there is uncertainty about the benefits of investing in alternative protective measures.
5) Simplification: a tendency to selectively attend to only a subset of relevant factors when making choices involving risk.
6) Herding: a tendency to base choices on the observed actions of others.

In this short volume (125 pages, including notes), they run through the biases and case studies with a light, breezy tone reminiscent of top-level journalism rather than the bludgeoning of an academic paper. They suggest a process they deem a Behavioral Risk Audit to meet each of the biases head-on. For example, low-probability risks can be compared to driving in sunny versus snowy conditions, or the risk of a disaster can be stated over 25 years rather than annually. The Ostrich Paradox is a little light, but a great introduction to a new angle on risk assessment and management.
Pete
1,105 reviews, 78 followers
April 25, 2020
The Ostrich Paradox (2017) by Robert Meyer and Howard Kunreuther looks at why the authors believe people are so poor at handling disastrous risks. Meyer is a professor of marketing at Wharton and Kunreuther is a professor of decision making, also at Wharton.

The book uses Kahneman and Tversky's framework of thinking fast and slow: instinctive reasoning alongside slower, better-thought-out reasoning. Meyer suggests that when things go badly we think fast but not well, and provides a few examples of this, including the pilots of Air France flight 447, which crashed in the Atlantic, and people getting into trouble fleeing hurricanes.

Meyer suggests six reasons why people don't prepare enough: myopia about short-term benefits, amnesia (forgetting fairly recent history), over-optimism, inertia, simplification, and herding.

The book then concentrates on US hurricanes and their impact. 

Unfortunately the book doesn't look at trends in global deaths from natural disasters. Nor does it mention how much safer air travel has become. Also, no mention is made of US laws that effectively shift much of the risk onto the government through repeated bailouts of homeowners with property that is known to be high risk.

The book is topical given the outbreak of Covid-19 across the world. However, while people undoubtedly haven't been well prepared, some countries, notably Australia, South Korea, and New Zealand, have handled the outbreak much better than others. This would suggest that good institutions are able to handle disasters much better. However, Australia's recent poor handling of fire risk would indicate that the same government can sometimes handle one crisis well and another poorly.

The Ostrich Paradox isn't a bad book. It's impressively short and worth the time to read. But partly because it is so short and tries to fit many events into the same framework, quite a few things are missed.
JHM
594 reviews, 66 followers
October 6, 2017
An insightful and easy-to-digest book about the patterns of cognitive bias that commonly prevent both individuals and organizations from accurately understanding risks and dealing with them effectively. Its main theme is emergency preparation, but it also draws examples from the world of finance, the other place where we are commonly faced with the need to understand risk.

I liked it because it didn't go too in-depth about the causes of the biases: no long lectures on neurobiology. There's just enough information, and it's illustrated by factual stories. On the other hand, I didn't give it five stars because I think the authors could have gone into a bit more detail about how individuals can manage and mitigate the biases.

This is a book to keep and review on a regular basis to help keep one's own evaluation process balanced, especially if you are concerned about emergency preparations but find it difficult to actually invest in them and/or take action.
Trey Shipp
32 reviews, 8 followers
April 3, 2017
This short, well-written book uses vivid stories from hurricanes, earthquakes, fires, and floods to show the mental biases that keep us from preparing for disasters, even disasters we know are coming. The authors apply Kahneman and Tversky's psychology to improve our behavior.

While most of the examples in this book come from natural disasters, these mental errors contribute to many low-probability but high-consequence catastrophes. I couldn't help but think about errors investors make in the stock market. Myopia: focusing on too short a time horizon; Amnesia: forgetting the lessons of past disasters; Herding!

The second half of the book offers strategies to overcome these biases, but this area needs more work. The authors present a framework for a behavioral risk audit and describe preparedness plans that take our natural biases into account. I wish they had more examples of successful preparation to share. If planners read this book, maybe those will come.
Sandra L. Ray
201 reviews, 1 follower
February 13, 2024
A little dry, and it did not share earth-shattering new insights… rather, it reinforces what I have learned after years of doing disaster-related work. A good read… or better yet, get the audiobook. It's a nice listen.
Chris Boutté
Author, 8 books, 282 followers
December 30, 2020
I wasn't sure what to expect from this book, but it was surprisingly good. As someone who loves reading about cognitive psychology and the thinking errors we're all prone to, this was a great book looking at these issues from a different angle. In this book, Meyer and Kunreuther start each chapter discussing different disasters throughout history, as well as some everyday risks, and why people ignore them. While many of us won't be in these types of situations, it's great to get a look at what biases and brain flaws we have when it comes to risk assessment and decision making. 

I think the best part about this book was that it was short and to the point. The chapters aren't very long, and the authors do a great job of not only explaining the subject of each chapter but also tying in similar cognitive science aspects with each topic. For the length of the book, they managed to pack in a lot of information in a very digestible way.
1,004 reviews, 1 follower
March 9, 2018
The Ostrich Paradox: Why We Underprepare for Disasters by Robert Meyer is a look at why humans have trouble looking long term at the worst possible outcomes. It only takes one thing and you are in trouble, but if we prepare at all, it is not enough. I liked the stories that illustrated the points. The authors talk about biases and developed the Behavioral Risk Audit so you can meet things head-on. It is enough information to get you thinking about how prepared you are. It is well written and short enough for most people to read to the end. I highly recommend this book.

I received a copy through a Goodreads Giveaway.
Pamela Merritt
48 reviews, 11 followers
October 11, 2017
This was a good outlining of the mental traps people fall into that result in them doing stupid things. Since I am a student of stupid things, I found it fascinating.

It is apparently very difficult for humans to take long-term, expensive steps to prevent something with what looks like (in the moment) a low likelihood of happening. The book lays it all out with some real-life examples. Highly recommended!
A.L. L. Wicks
Author, 2 books, 3 followers
April 1, 2021
A really good look at human nature from the perspective of taking risks (or not), long-term strategic thinking (or short-term non-strategic non-thinking)...
Okay, so maybe I’m being a bit tongue-in-cheek, but I learned more from this book than I would have thought. It’s a fairly short read—half an afternoon—and I greatly enjoyed it (enjoyed may not be *quite* the right word) while fixing up a wooden bunk bed.
The narrator was great—he had a very nice voice to listen to. 😊
114 reviews, 5 followers
May 26, 2021
The Ostrich Paradox offers insight into the biases that make us underprepare, underestimate, and underperform when confronted with risk. It's a quick read and, depending on your knowledge of behavioral science, either a solid refresher course or a solid introduction to the field. The book has sufficient explanatory power, but I found the part on how to tackle these biases somewhat underdeveloped.
Judi
795 reviews
February 7, 2019
Understanding improves preparation

Provides an excellent breakdown of biases that affect preparation activities, along with several strategies to overcome those biases. All municipal planners and leaders should read and understand this text to make better decisions about disaster preparation.

Received as a Kindle book through a Goodreads giveaway.
Wentao Lyu
9 reviews
May 3, 2025
An interesting way of looking at the human decision-making process and the connection between large-scale disaster control and human biases. Designing disaster-prevention protocols from a bottom-up perspective, starting at the individual level with the six psychological traits, rather than from the traditional top-down scientific reasoning behind each and every disaster, is insightful.
Tom
195 reviews, 2 followers
August 16, 2017
Simple thesis: "individuals, communities, and institutions often underinvest in protection against low-probability, high-consequence events." True story. This book makes the case well with interesting examples.
Helfren
941 reviews, 10 followers
April 30, 2020
People are prone to hyperbolic discounting. Offered $100 in a month versus $120 in a year, most will choose the $100 in a month. A banker, meanwhile, values the $120 more highly than the $100.

An interesting book about deeper human psychology.
Scott
263 reviews, 12 followers
June 9, 2020
The rating reflects that this book is very US-centric in its examples, using one example throughout.

The principles of bias are very valuable and the book has some good thoughts for how insurers can play a larger role in risk management.
32 reviews
March 11, 2022
Good layman's read on behavioral biases. The authors are clear and concise in their writing. If you like this book, I would recommend Thinking, Fast and Slow by Daniel Kahneman, Predictably Irrational by Dan Ariely, and Nudge by Richard Thaler.
14 reviews
October 20, 2017
PRETTY GOOD BOOK

I enjoyed this book and although I don’t agree with a small portion I learned a lot from it. I took notes to help me remember important points.
814 reviews
March 17, 2020
Reminded me a lot of another book I've read before. Ultimately a book about how we "make" decisions and the inherent biases and shortcuts we take.
3 reviews
January 5, 2021
It is a must-read for disaster risk managers, policy makers, and flood risk students and professionals. Very insightful reading.
Angela
551 reviews
July 8, 2021
I gained some valuable info on human biases that prevent us from preparing for emergencies, but the audience of this book is policy makers.
Breanna
93 reviews, 1 follower
October 20, 2022
Slow and dull but with some interesting information. Definitely pre-COVID
36 reviews, 1 follower
November 24, 2024
An interesting read with a few case studies to emphasize points. Many of the examples were dated, however.
Belal Dahab
43 reviews, 19 followers
July 13, 2025
This was a good read, and it makes a call to action to invest in preventative measures toward disaster mitigation.
Melissa
778 reviews, 17 followers
September 12, 2021
~Disclaimer: I received a free kindle copy of this book.~

The thesis of the book in the author's own words:

“The core thesis of the book is that, much in the same way that ostriches are limited in their defensive actions because they cannot fly, we need to recognize that when making decisions, our biases are part of our cognitive DNA, and are keeping us grounded, flightless. Still, we might be able to design and structure a suite of choice environments, incentives, and communication methods that allows human decision makers to overcome these biases when faced with future hazards. We suggest that we need to learn to be more, not less, like ostriches—hence the paradox—if we are to be better prepared for disasters.”

I think if you are looking for a quick book on this topic that a layman can understand, this book is a good one to pick up. If you already know a lot about this topic I doubt this book will serve much purpose.

The breakdown of the book makes it easy to reference. Part one is about the why of our failures to prepare: cognitive systems and the myopia, amnesia, optimism, inertia, simplification, and herding biases are addressed in easy-to-read chapters with recaps at the end. Part two makes suggestions on how to prepare in the future:
• a behavioral risk audit to identify biases and come up with recommendations to overcome them
• large-scale actions like legislation/policy to make the changes needed and create better decision making (e.g. tax incentives, buyouts, building codes, etc.)

“Our key argument is that there is a way out, but the path there carries a certain paradox: As individuals and planners, we need to be more like ostriches, not less. That is, in the same way that the ostrich has adapted its behavior to take into consideration its physical limitations, we humans, when thinking about risk, need to develop policies that take into consideration our inherent cognitive limitations.” (pg.115)
