
Denying to the Grave: Why We Ignore the Facts That Will Save Us

Why do some parents refuse to vaccinate their children? Why do some people keep guns at home, despite scientific evidence of risk to their family members? And why do people use antibiotics for illnesses they cannot possibly alleviate? When it comes to health, many people insist that science is wrong, that the evidence is incomplete, and that unidentified hazards lurk everywhere.

In Denying to the Grave, Gorman and Gorman, a father-daughter team, explore the psychology of health science denial. Using several examples of such denial as test cases, they propose six key principles that may lead individuals to reject "accepted" health-related wisdom: the charismatic leader; fear of complexity; confirmation bias and the internet; fear of corporate and government conspiracies; causality and filling the ignorance gap; and the nature of risk prediction. The authors argue that the health sciences are especially vulnerable to our innate resistance to integrate new concepts with pre-existing beliefs. This psychological difficulty of incorporating new information is on the cutting edge of neuroscience research, as scientists continue to identify brain responses to new information that reveal deep-seated, innate discomfort with changing our minds.

Denying to the Grave explores risk theory and how people make decisions about what is best for them and their loved ones, in an effort to better understand how people think when faced with significant health decisions. This book points the way to a new and important understanding of how science should be conveyed to the public in order to save lives with existing knowledge and technology.

328 pages, Hardcover

First published August 1, 2016


About the author

Sara E. Gorman

2 books · 5 followers

Ratings & Reviews


Community Reviews

5 stars: 80 (32%)
4 stars: 93 (38%)
3 stars: 60 (24%)
2 stars: 7 (2%)
1 star: 3 (1%)
Shaun
Author · 4 books · 225 followers
April 17, 2017
This is a book that explores why people ignore science. The conclusion can be summed up in four words: because we are human. The authors also attempt to offer solutions for getting people to listen to science, a task that seems like it should be relatively simple, but really isn't.

One of my favorite non-fiction topics is the biology of belief and behavior, and this book complements other things I've read.

Although many of the ideas offered here, as well as a good chunk of the research provided, were not new to me, the overall presentation still got me thinking more deeply. For example, the idea that common human biases like confirmation bias probably have an evolutionary basis that has ultimately served us as a species, but that can also impair our ability to critically evaluate facts when making important decisions, is certainly not new to me. However, the idea that presenting someone with facts contrary to their flawed reasoning not only fails to convince them but may further alienate them was something I hadn't seen explored in any depth before.

Often, we make emotional decisions that more often than not have nothing to do with facts or reason. It's part of what makes us human. On the flip side, our evolved brains allow us to discriminate and critically analyze the world around us. Unfortunately, our emotional brains are more likely to play a role in the formation of beliefs. Once formed, beliefs are extremely hard to let go of. Even more importantly, there is likely a biological basis for this inflexibility.

The authors also do a good job of exploring how this inability to evaluate topics critically is a universal trait that doesn't necessarily reflect intelligence or the ability to comprehend the available science, but has more to do with how beliefs are generally formed and reinforced. Interestingly, the authors touch on some topics that I have been interested in for a while. For example, I've always thought it was strange that some of the same people who demonize GMOs and vaccines are also the same people who promote the devastating effects of climate change. In the case of the latter, they reference the overwhelming scientific consensus in support of their views. However, when it comes to GMOs and/or vaccines, they completely ignore the scientific consensus. I'm also puzzled how pro-gun advocates think that owning a gun makes them safer when research shows time and time again that this is not true. Or why we fear relatively benign threats like terrorists when the reality is that there are thousands of more imminent threats to our well-being that are either ignored or not given the same level of attention. So what gives?

This book is an exploration of the "what gives?"

Some interesting quotes and tidbits for those interested:

It would be comforting to believe that polarized positions would respond to informational and educational programs. Unfortunately, psychological research demonstrates that people's beliefs change slowly and are extraordinarily persistent in the face of contrary evidence. Once formed, initial impressions tend to structure the way that subsequent evidence is interpreted. New evidence appears reliable and informative if it is consistent with one's initial beliefs; contrary evidence is dismissed as unreliable, erroneous, or unrepresentative.


In reference to antivaxxers:

Where do they get their "science" from? From the Internet, celebrities, other parents, and a few non-mainstream researchers and doctors who continue to challenge the scientific consensus, all of which forms a self-reinforcing echo chamber of misinformation.


A passage that I think explains the "Trump phenomenon":

To further compound the problem, charismatic leaders almost never speak to our rational sides (remember the MAKE AMERICA GREAT AGAIN slogan?). Part of what makes them such successful, magnetic leaders is their preference for emotional persuasion. Any CEO knows, and any basic leadership course will teach you, that immediately appealing to people's rational brains will not get you very far. Not only will people not like you, but your company will actually not function as well as it would if your employees were encouraged to think emotionally and to see the bigger picture.


In all fairness, Obama's slogan, "HOPE AND CHANGE" was also an appeal to our emotional side.

Also...

...charismatic leaders actually distract us from the specific issues at hand...and move us to a world of platitudes and universal concerns that rouse our emotions without taxing our analytical skills.


and

Charismatic leaders induce brain changes that first heighten the fear centers of the brain, like the amygdala, and then suppress the decision-making areas in the PFC.


Yogi Berra quote that seems to summarize confirmation bias:

I wouldn't have seen it if I didn't believe it.


"Affect," writes Paul Slovic, the eminent behavioral psychologist, "is a strong conditioner of preference." This is why politicians know that they need to get their audience emotionally aroused before trying to convince them.


There is an old adage: "What you don't know can't hurt you." In the science denial arena, however, this adage seems to have been recrafted to something like: "What you don't know is an invitation to make up fake science."


People are not comfortable with observing phenomena in their environments that cannot be explained. As a result, they come up with their own explanations....In other words, people prefer sequences, and they will go to great lengths to rid their environments of randomness.


On how difficult changing our minds can be:

The powerful dorsolateral prefrontal cortex can override these more primitive brain centers and assert reason and logic, but it is slow to act and requires a great deal of determination and effort to do so. Hence, it is fundamentally unnatural and uncomfortable to change our minds, and this is reflected in the way our brains work.


Science demands that we be open to changing our minds constantly, but human biology and psychology insist that we hold onto our beliefs with as much conviction as we possibly can. This conflict is fundamental to our reluctance to accept new scientific findings. Once the brain has set up the idea that GMOs cause cancer, it is basically impossible to undo that belief, no matter how many scientific studies provide evidence to the contrary.
Jessica Meats
Author · 16 books · 33 followers
May 13, 2016
Denying to the Grave is a book on a very interesting concept: why do people believe things even when there is a wealth of scientific evidence to the contrary? The book talks about a lot of general principles, but repeatedly turns to a few specific cases to demonstrate these principles in action. These cases include:
- The belief that vaccines cause autism
- That owning a gun makes you safer
- That GMO foods are dangerous
- That antibiotics are useful in treating viral infections
- That nuclear power is extremely dangerous
- That drinking unpasteurised milk is better for you than drinking pasteurised milk

The authors look at the ways in which people start to believe these sorts of things, and then they delve into some of the psychological principles that explain why those beliefs stick with us. They look at how much easier it is to remember emotionally charged information (so reading stories about children getting ill is more likely to stay with you than dull statistics). They look at things like confirmation bias (noticing and being drawn to evidence that supports our existing beliefs), group dynamics (automatically siding with members of your group against perceived outsiders), and the fact that it takes a lot more evidence to change our minds than to form an idea in the first place.

One thing I found particularly interesting was that they used a counter example to demonstrate that we can't just dismiss conspiracy theories off-hand. They talked about the tobacco industry and how the companies involved in it were able to, for years, threaten scientists, fudge data, and promote misleading reports in order to dismiss claims about the harms of smoking. By pulling out this example, the authors make it clear how difficult it can be to know which side of a scientific debate is true until you look carefully at the supporting evidence.

Speaking of evidence, throughout the book the principles being discussed are backed up by evidence, with references to studies and scientific reports. Despite this, there are a few points where the authors speculate on how certain belief-forming approaches might have developed in humans. Given the rigorous support of their facts elsewhere, these occasional divergences into speculation stood out.

I also found the choice of examples strange. The book is framed as being about health concerns, so things like gun ownership stood out as not quite following this theme. I admit that getting shot is a serious health concern, but compared to topics like GMO food or the autism-vaccine connection, this point didn't seem to fit with the others.

In terms of the subject matter, I found the book very interesting, but in terms of style, I found it repetitive in places. Some arguments and points would come up in one chapter and then be discussed again in a later chapter. Sometimes, the authors would explain a scientific point or principle, and then explain it again in slightly different wording in the next paragraph. If you're new to some of the principles of psychology and statistics that are discussed in this book, this might be useful to you, but for me it detracted from my enjoyment.

The book also delves into some quite detailed statistics in places. This is important for proving the point about evidence, but it also serves to demonstrate another point the authors make: statistics make dull reading. These were not the most interesting parts of the book.

On the whole, I enjoyed the book. If Goodreads allowed for half stars, I'd put it at 3 1/2. I think it's an important and interesting subject matter, but the style of the book made it a little dull in places.

I received an advanced copy of the e-book in exchange for an honest review.
Stefan Mitev
167 reviews · 704 followers
April 16, 2022
There are complex psychological, sociological, and neurobiological reasons for denying the science that saves lives. The pandemic showed us this: many people consciously chose not to get vaccinated, and their choice proved fatal. But what explains this mass phenomenon of rejecting science?

The book "Denying to the Grave: Why We Ignore the Facts That Will Save Us" lists six main causes that lead to wrong decisions. Some of them are painfully familiar, but they are worth repeating: 1) conspiracy theories, 2) charismatic leaders (charlatans), 3) confirmation bias, 4) misunderstanding of cause-and-effect relationships, 5) avoidance of complex answers and uncertainty, 6) misjudgment of risk.

Our society is in crisis. We now have numerous medicines, vaccines, and procedures that allow us to live longer and better. Yet many of us choose to believe in an "alternative point of view" and to "judge" for themselves what is best for them. The end is sometimes fatal. Science communicators must understand the difficulties of conveying complex information to the general public. For example, it is increasingly important to be active on the internet and social networks, where most disinformation spreads. We must take the time and effort to explain complex topics such as statistical probability, risk, and causality in understandable language. Information by itself is usually not enough; it has been shown that believers in conspiracy theories do not suffer from a simple "information deficit." A person with an already formed opinion is hard to change. That is why it is becoming ever more important to fight conspiracy theories before they form and spread widely. In other words, we must anticipate and rebut them before they take hold in society, to "vaccinate" those around us against expected disinformation, if you will.

Read this book if you are interested in the psychology of persuasion, neuroscience, or science communication, or if you want to help loved ones who have fallen victim to disinformation.
Art
551 reviews · 18 followers
December 16, 2016
The outcome of last month’s election taints the reading of current social science books, such as those that examine truth and reason as well as fact and science. The incoming fact-impaired and anti-science administration fails to understand these virtues. And that makes it all the more urgent to read these books and glean what lessons we can. In time, a new crop of books will explore science and reason in the new administration.

Denying to the Grave, written by a father and daughter, examines why we ignore facts. This highly quotable book generated three dozen index cards of notes.

"We are battling a mania whipped by reactionary politicians," writes Charles Blow to open the thirty-four-page introduction. The book was published in September.

Each chapter examines a key driver of science denial, with the strongest one about charismatic leaders, such as the person who will become president, whose image colored my reading of this book.

What causes otherwise intelligent people to make decisions and adopt positions with no basis in fact? It begins in school, where we need to shift from memorizing leaves and rocks toward teaching how science works: how to accept uncertainty, how to become skeptical, and how to ask the right questions.

Modern anti-science conspiracy theories include the anti-vaccine movement, the debunked concerns over GMOs in food, and the unproven need to keep a gun at home. For example, a third of Americans keep guns at home for self-protection despite incontrovertible facts that guns make the owner and family less safe. Gun owners exaggerate the risk of intruders, write the authors, while denying the risk of suicide or homicide at home that the guns introduce.

Journalists, meanwhile, need to learn that fairness differs from balance. Facts stand. Wrong-headed views may deserve a mention but not balance with the truth. The authors aimed this book at several audiences, including science journalists and scientists.

Followers of anti-science conspiracy theories typically include people with low self-esteem as well as those who feel powerless and distrust authority, write the Gormans. Proponents of conspiracies include chagrined scientists who form organizations with long and provocative names. They emphasize motive over outcome, emotional over rational.

Conspiracy theories need charismatic leaders who can elicit strong emotions. Gandhi, Nelson Mandela and Martin Luther King stand as persuasive and charismatic leaders for good. But anti-science charismatic leaders often represent breaks from established social systems.

Resisting persuasion requires self-awareness and critical thinking. School kids need to spend more time on debate, taking the counterargument. Teach them about rhetoric, which brings to mind the five-star book Enough Said: What's Gone Wrong with the Language of Politics?, by Mark John Thompson.

The scientific method, as an example of deductive reasoning, can take a long time to unfold. It begins with a hypothesis, then experiments, data analysis then a conclusion. Fifth-grade science needs to include the scientific method, write the authors. Teachers could ask students, for example, how we decide something is true or how we know if a medicine really makes us better.

Charismatic leaders can succeed because they appeal to emotion, and people respond more to emotion than to statistics. Scientists use numbers. But when it comes to sharing science, a serious tone competes with the emotional ones.

The scientific community does not respond in kind to anti-science charismatics. As a result, many people find themselves undecided in the middle. To develop scientific beliefs, we need to form groups around the evidence and look for credible scientists with charisma.

After making their case, the authors wrap it up with solutions, including these:

— Science must engage with social media. Science and medical societies would gain understanding through an active social media strategy that refutes the conspiracies. Science needs to take action before wrong ideas become uncorrectable. Science, in the modern media era, needs to respond immediately to incorrect information.

— Writers and editors need to know the difference between valid and invalid scientific debates. Journalists need to stop presenting issues as controversial. The science and the facts do not require balance against wrong ideas.

— Science needs its own charismatic leaders who know how to share facts that appeal to the intellect and emotion.

— We need better education about statistics, good evidence and critical-thinking skills. Assign students a topic with conflicting data. Teach kids to think factually, like scientists.

Four publications conduct the most rigorous peer review, write the Gormans: Science, Nature, The New England Journal of Medicine, and Proceedings of the National Academy of Sciences.

Oxford University Press published a well-grounded book of the highest order, with thirty-two pages of notes set in the same size type as the body of the book.

Bibliotherapy helps me to recover from the outcome of last month's election. Truth and facts, science and reason will take a hit in the new administration. This book helps explain how we got here.
Matt Hunt
671 reviews · 13 followers
July 1, 2022
It all boils down to this: humans are emotional, social, random animals, not rational, reasonable, predictable models.

I'm giving this two stars not because I disagree with anything they had to say; I just found it a bit dull. I wanted some new insight and thought from it, but I'm left in exactly the same place, except now I know another book that says lots of things I agree with but hasn't helped my thoughts develop any further.

They probably deserve more than two stars. But fuck it, I'm being miserable today.
1 review
November 15, 2016
I am in the process of reading this book but found some things that disturbed me right off the bat:
1. I'm reading the book in November 2016, but its copyright is 2017.

2. Oxford University Press seems to indicate a "status" of veracity not deserved, especially since the copyright date is more recent than when the book was actually published, indicating a need to make the "research" seem more current. Haven't most scientific discoveries happened w/ random people investigating observations/theories in their own private labs w/out being paid money for their discoveries?

3. The authors, especially Ms. Gorman, have multiple letters behind their names, as if that makes them "authorities," "experts," "qualified to comment and judge" on the issues she and her father discuss in the book.

4. Page 3: "Time after time, we make decisions about the health of ourselves and our families based on emotion rather than on analysis of the scientific data. The government uses taxpayer money to mount hugely expensive studies in order to prove that the science is correct, but people come up with more and more reasons to insist that the research is incomplete, biased, or simply wrong."

The authors do not seem to understand the basic idea of the "null hypothesis": that science requires one who has a hypothesis to test the null hypothesis, i.e., that what they believe is actually false. Thus the government using taxpayer money to mount hugely expensive studies in order to prove that the science is correct is why so many common people w/out advanced degrees are making decisions for themselves based on their own experiences and their own analysis of scientific data using much simpler, truer statistical analysis.

5. Page 5: "In both cases scientific evidence strongly suggests a position - vaccinate your child and don't keep a gun at home - that many of us choose to ignore or deny. We make a distinction between "ignore" and "deny," the former indicating that a person does not know the scientific evidence and the latter that he or she does but actively disagrees with it. We quickly generated a list of several other health and healthcare beliefs that fly directly in the face of scientific evidence and that are supported by at least a substantial minority of people:
a. Vaccination is harmful.
b. Guns and protection.
c. HIV is not the cause of AIDS.
d. Nuclear power is more dangerous than fossil fuels.
e. Antibiotics are effective treatments of viral infections.
f. Unpasteurized milk is safe.
g. Electroconvulsive Therapy (ECT) is unsafe/ineffective."

The issues discussed appear to be of special interest to the authors and/or their relatives and were not randomly selected.
The authors do not discuss potential "bias" they may have in "disproving" ideas generated by common people w/out advanced degrees whom they seem to be discrediting.
Do the authors and/or their friends and relatives hold stock in companies that may profit from the opinions, dressed as research, they espouse?
Are the authors and their friends/family currently working on projects where they receive money for their views one way or the other?

6. Page 37: "On the other hand, Dr. Wakefield's conspiracy is fantasy. The scientific evidence that vaccines are safe and effective is overwhelming and there has never been any need for pharmaceutical companies to engage in secret activities to promote them. Doctors know that vaccination is an evidence-based practice and lots of people who have no financial stake in the vaccine industry - like the entire world of public health officials - highly recommend vaccination. In short, people die from smoking cigarettes and lives are saved by vaccination."

The authors use undefined terms, with no footnotes to qualify them, such as:
a. "hugely expensive"
b. "overwhelming"
c. "evidence-based"
d. "lots"
e. "conspiracy theories"
This relies on emotionally loaded words w/out defining them, which is exactly what the authors accuse "charismatic leaders" of "conspiracy theories" of doing.

Just thoughts based on my experiences, for what they're worth.
Mark
190 reviews · 13 followers
June 11, 2016
Cognitive decision-making processes that worked reasonably well for humans in a less complex environment are now obstacles in processing highly complicated and, often, ambiguous issues.

Denying to the Grave explores the many neurological, social, and psychological reasons why every one of us can accept something that is false, and hold on to it even in the presence of compelling and overwhelming data against a belief.

In the span of six chapters, each fairly long, Sara and Jack Gorman tackle specific areas that contribute to our cognitive dysfunctions when it comes to having to think about, process, and make decisions about data and facts as they pertain to science and research. The Gormans use a number of examples throughout the book to illustrate how common reasoning processes, including those of scientists, can lead to wrong conclusions. These examples include vaccines, GMOs, gun ownership, and nuclear power. Even if you don't agree with some of their positions, don't let their specific points of view deter you from understanding how the broader cognitive and emotional functions combine to lead people toward decisions.

Throughout these chapters the Gormans discuss how and why our brains default toward shortcuts (thinking about things is highly expensive in energy and time), why our decisions are so tied to emotions (empathy helps us get along with larger society), why we prefer the specific over the abstract (stories and anecdotes are easier to grasp than statistics), and why these so often lead to wrong decisions when it comes to issues that require greater thought and processing. They describe how initial impressions get stuck by acting like an opioid drug in our brains: when our held beliefs are affirmed, they activate our brains' pleasure centers. Conversely, when we hear conflicting data, our fear responses are triggered, meaning changing our minds is an uphill battle and the reward for the change must be greater than that for holding on to current beliefs.

The concluding chapter ("Conclusion") lists several guiding principles and solutions that we ought to take for ourselves and influence adoption in the society at large, so that future generations will better understand the nuances of science and be more open to reviewing past beliefs and changing their minds when the data warrants.

The Gormans admit that some people will never change their minds on some issues, in spite of the data and facts, no matter how much the suggestions provided in this book are employed. This book is not for those kinds of people.

This book is for two broad groups: 1) those who are already very science-minded and want to learn how to communicate their positions more effectively; and 2) those who may not be as knowledgeable and educated in science, who may be on the fence about some or many issues, but are at least open to some degree to understanding issues and making better decisions through rational thinking.

I think this book is very valuable to all who want to understand better why they think the way they do, why there are people who don't think the way you do, and to know how and when to communicate our beliefs and positions. Although this book deals specifically with issues of science, I think many of the principles apply to and can be borrowed by other fields.

(Advance reading copy provided by the publisher through NetGalley.)
Hector López
69 reviews · 2 followers
December 6, 2019
Human nature, so complicated!

Sara Gorman shows as much, but she also insists on the value of rational behavior (whatever that implies), no matter how hard it is, in this case in epidemiological settings and other controversial topics related to health.

However, it also allowed me, collaterally, to better understand politicians' strategies and the limits of scientists' persuasiveness.

An excellent read, with precise writing; the academic background shows through, though there was nothing here I didn't already know, and it is frequently redundant.
Ann
52 reviews
May 22, 2016
In "Denying to the Grave," authors Sara and Jack Gorman have created a book that, frankly, I wish everyone would read. As an undergraduate and later a graduate student, I took a number of courses in psychology and public health, so much of the material presented (on cognitive biases, persuasion, group dynamics, and so on) was already familiar to me. Even so, I came away from this book with a deeper understanding of these topics, and perhaps more importantly, a sense for how I might apply this information to my own life.

Having spent most of my working life in research jobs, I have great respect for scientific reasoning and the scientific method. But I am often dismayed by how little others appreciate and understand science and how it works. Too often I see friends and acquaintances (even those I've always considered intelligent) fall prey to pseudoscience and fail to apply basic scientific reasoning. From what I've read in this book, I can now see why my efforts to educate them may have failed and what methods might be more successful in the future.

All that said, this is not just a book for academics and scientists. The content of this book would be helpful to anyone who comes into contact with scientific information in their daily lives, and let's face it, we all do. Although the writing is unfortunately dry in the way most academic writing is, concepts are presented clearly, such that even a lay audience would not have too much trouble following along. This book would also be a great resource to professors and instructors of psychology trying to cultivate critical thinking in their students; many examples discussed in the text could be incorporated into lectures and in-class activities. Though I fear this book may not be enticing enough to lay readers, I hope that some will have the patience and take the time to read it.

**I received an advanced review copy of this book from NetGalley and Oxford University Press in exchange for an honest, unbiased opinion.**
Dan Connors
369 reviews · 41 followers
September 21, 2023
"Two flat earthers die and go to heaven. At the pearly gates they have the chance to ask God any question they want and get truthful answers, so one flat earther asks God, 'Is the earth flat?' to which God answers, 'No.'

The flat earther looks at the other and says, 'This goes higher than we thought.'" - Unknown


Why, in an age where we have a wealth of reliable, scientific information about a host of things that have been studied again and again, are we so lost in misinformation as to be worse off than we were decades ago? Why are there certain things that we cling to, ignoring all conflicting data, rather than admit we were wrong? This is one of the most troubling dilemmas of the information age. So much information, and so much of it WRONG.


Psychologist Sara Gorman and her father Jack Gorman write about this problem in their excellent book, Denying to the Grave. The Gormans are very much on the side of science, and they spend hundreds of pages trying to get inside of the minds of science deniers, especially those who are skeptical of modern medicine.


Bottom line: humans are emotional creatures who once in a while use their cerebral cortex to think critically, but not that often. Thinking is hard, slow, and sometimes frustrating. We approach the world much the same way our ancestors did thousands of years ago: suspicious and fearful. I've always wondered how fantastical conspiracy theories can get so much traction and cause so much damage. In the minds of so many, shadowy elites are trying to control us through vaccines, GMOs, climate change, and gun control, and scientists are somehow in on it.


This strong emotional response, ruled by the amygdala that processes information with an eye for danger, can convince us that almost anything is an existential threat. And once that belief is embedded in there, confirmation bias makes it stronger and stronger by only taking in information that confirms what we already believe. Conflicting information is discounted out of hand as unreliable and probably a product of the shadowy elites.


Depressingly it turns out that many of those who fall for conspiracy theories and science denial are educated, intelligent people. Giving them new information that disproves what they believe not only doesn't work, but tends to make them double down and believe even more strongly in conspiracies.


Our brains don't appreciate complexity or ambiguity. Science can never provide 100% certainty, and that tiny kernel of doubt explodes into conspiracies, as we all saw during the Covid-19 epidemic. We are terrible at judging probabilities and risks- loading up on high-powered guns while being careless in the bathroom (site of most home injuries and deaths), not wearing seatbelts, or avoiding doctors. We rely on the availability heuristic- the tendency to pay the most attention to things we see or hear about, rather than relying on more reliable statistics, which we see as dry and confusing.


The Gormans expand on the scientific method, which is widely misunderstood and is the foundation of most of what we know. They look at how scientists measure causality, which can be a maddeningly frustrating search, especially when it comes to medical problems. So many people treat correlation, when things appear to happen together, as proof of causation. Causation is much more complicated than that and can depend on a number of factors or can be totally random, unfortunately. Much of the autism debate around vaccines centers on the fact that autism often shows up around the same time that children get their first round of immunizations. Scientific inquiries into this connection have found no causal link.


I enjoyed this book because it is so connected with the information crisis that we are currently experiencing. Conspiracy theories are getting in the way of scientific progress and responsible governance. Fearmongers are inciting violence and dividing families, all in the name of getting power and influence for themselves. The Gormans repeatedly single out examples like anti-vaxxers, gun owners, antibiotic overuse, GMO opponents, nuclear power opponents, and anti-pasteurization folks. Today there are dozens of conspiracy theories tearing at the fabric of society, and they are making us all fear each other and the experts who are mostly trying to help us.


The book concludes with some solutions that can perhaps open some eyes to the benefits of science and the dangers of being confidently wrong. They take aim at the media, where reporters are ill-prepared sometimes in presenting valid scientific debates. (Climate change is real and getting worse, not a subject for both sides to be presented). Schools should be better at teaching the scientific method and helping students understand probability, which is a more reliable tool than their own limited experiences. Motivational interviewing, where people are encouraged to question their own assumptions, has been shown to open a crack in anti-vaxx or anti-LGBT sentiments. And we all need to examine our tendency of thinking uncritically and relying on emotions to guide us.


The title of this book grabbed my attention and reminded me of the craziness that was the Covid epidemic. Conspiracy theories about the pandemic were widespread, including fake cures, vaccine fearmongering, and mask intolerance. People were literally willing to die for their beliefs, and to kill loved ones by exposing them to a virus that might or might not put them in the hospital. Online, there were the Herman Cain awards (named after the presidential candidate who died after being exposed to the virus at a Trump event), which documented social media posts filled with confident misinformation and taunting, followed by requests for prayers as the posters got sick and hospitalized. Why are people so attached to their beliefs and unwilling to question them when their very lives could be at stake? I still haven't figured that one out.


Covid-19, climate change, and guns don't care if you believe in them or not. They can kill believers and non-believers equally well. We need to listen to scientists and experts more (even though they can occasionally be wrong) and rely on our emotional judgements less. Science, statistics, and modern medicine can save us from ourselves, if we use them wisely.
Profile Image for Gwen.
166 reviews4 followers
January 17, 2018
Must read for anyone looking to better understand confirmation bias. Good first steps in closing the gap between medical science and poor behavior choices. Underlying dynamics could explain the widening gap between experts in general and the broader community.

I recommend this book to journalists, political science enthusiasts and anyone in the medical field hoping to pave the way toward healthier behavioral choices.
402 reviews4 followers
May 18, 2018
I gave it an extra star for the dire importance of the information and the message of this book. I love reading nonfiction in the social sciences (it's both my job and my preferred form of entertainment), but I got bogged down from time to time. Still, it's worth the read.
Profile Image for Peter McKenzie-Brown.
4 reviews
October 29, 2017
Denying to the Grave: Why We Ignore the Facts that Will Save Us
by Sara Gorman and Jack Gorman
2017; Oxford University Press; Oxford

This book is about the politics of bad science. Here are a few examples.

• Phony science, based on a sample of 12 individuals, created the cult of anti-vaxxers. These are the people who refuse to vaccinate their children against disease because they think it will induce autism in them.

• News stories about outbreaks of the Zika virus created panic in the US when the disease briefly appeared there; much of North America panicked when four US cases of the disease took people's lives. To put that in perspective, about 2.5 million people die in the US each year – on average, 6,775 per day. The threat was almost infinitesimally small.

• Many Americans, who have easy access to firearms, believe that having guns at home makes them safer even though social science has demonstrated conclusively that it puts their lives at much greater risk.

• In Calgary in particular, there are endless arguments about whether man-made climate change is real, although the bulk of scientific research has repeatedly confirmed that it is.

• Patients insist on antibiotics for viral infections even though antibiotics have no effect on viruses. Why is that? Of equal interest, why do doctors prescribe them, knowing they will make no difference?

• Why do conspiracy theorists still believe that many people conspired to kill President Kennedy, when the most exhaustive investigation ever into a single homicide concluded decisively that Lee Harvey Oswald acted alone?

• Genetically modified foods have greatly increased harvests and have never been shown to cause illness. Yet Europeans in particular call them "Frankenfoods," suggesting they are somehow like the monster that Mary Shelley's Frankenstein brought into the vocabulary.

In summary, many people insist that science and other kinds of credible research are wrong. Often relying on information provided by charismatic leaders, they argue that they are right because the evidence is incomplete. Unidentified hazards lurk everywhere, making the world an innately dangerous place.

The authors are Sara Gorman, whose specialty is public health, and her father Jack Gorman, an MD and psychiatrist.

As a discussion in Goodreads puts it, in this book the two authors “explore the psychology of health science denial. Using several examples of such denial as test cases, they propose six key principles that may lead individuals to reject “accepted” health-related wisdom: the charismatic leader; fear of complexity; confirmation bias and the internet; fear of corporate and government conspiracies; causality and filling the ignorance gap; and the nature of risk prediction. The authors argue that the health sciences are especially vulnerable to our innate resistance to integrate new concepts with pre-existing beliefs. This psychological difficulty of incorporating new information is on the cutting edge of neuroscience research, as scientists continue to identify brain responses to new information that reveal deep-seated, innate discomfort with changing our minds.”

Those are good comments, but they refer essentially to the book’s conclusion. What that reviewer fails to mention is that the fun part is getting there. This is an enormously intelligent book. It includes well-sourced information from scientific literature, but also commentary from unreliable online sources. As one example, it includes a quotation from actress Angelina Jolie, from an American Firearms website. “I bought original, real guns of the type we used in Tomb Raider for security,” she told a British newspaper. “Brad and I are not against having a gun in the house, and we do have one. And yes, I’d be able to use it if I had to…. If anybody comes into my home and tries to hurt my kids, I’ve no problem shooting them.”

Countless studies have shown that having guns in the house makes people less safe, not safer. As an experiment, the authors signed up as members of a pro-gun website, joined a discussion group, and began posting scientific information about the dangers of gun ownership. Without fail, the response they got was anger. Put another way, the other members of this group were saying “My mind is made up. Don’t confuse me with the facts.” In passing, they note that in the US 10,000 people die from gunshot wounds each year. Another 20,000 men and women use guns to commit suicide – by far the most popular method.

As Shakespeare might have written about American gun laws, “Now thou art come unto a feast of death, a terrible and unavoided danger.”
271 reviews3 followers
March 12, 2017
A wide-ranging examination of the many reasons people deny scientific evidence, with a particular focus on health evidence. There is a tendency among health care experts to attribute the attitudes of anti-vaxxers and those with other false health beliefs to a lack of "health literacy," which essentially just means we need to educate people better, and the public needs to learn not to believe everything they read on the internet (OK, I'm oversimplifying a bit). But this doesn't explain some of the people I've encountered in my life: the biologist with anti-vaccine beliefs, the relative who believed some of the crazier anti-Hillary conspiracy theories circulating during the recent US election. The authors, a public health researcher and her psychiatrist father, delve much deeper, examining what the latest brain science tells us about charismatic leaders, why some people are attracted to conspiracy theories, and the many difficulties our brains have with understanding scientific evidence. They make it clear there is no single cure to the epidemic of anti-science we are currently facing, but they propose a number of things that could help to improve the situation, from changing the way we teach science, to training scientists to be better communicators, to schooling health practitioners on more effective ways of communicating with their patients. Reading this in the aftermath of Trump's election, I found my faith in any of these prescriptions pretty low, but they're a good place to start, and it does help to better understand the nature of the problem.
53 reviews2 followers
November 2, 2016
Many years ago I taught AP Psychology. This book covers what I considered the most important part of that curriculum for my students: how to recognize the pitfalls of human thinking processes. The fundamentals of scientific and statistical reasoning are lost on many average citizens, and even the most educated and intelligent among us fall victim to flaws in human thinking. Confirmation bias, cognitive dissonance, the availability heuristic, causation vs. correlation, Type 1 errors, Type 2 errors, multicausality, null hypotheses, and the lack of certainty in any experimental result are all discussed in this book.

The authors use as examples the anti-vaccine movement, the anti-GMO movement, and the anti-ECT movement to show that science has produced legitimate evidence supporting vaccines, ECT, and GMOs, and yet there are millions who refuse to believe the evidence when presented. The "how" of its presentation, as well as the "who" of the presentation, has an impact on whether anyone's position can be changed.

Lack of critical thinking skills, lack of understanding of the use and abuse of statistics, and ignorance of how experimental design is carried out and replicated are rampant in our society. These skills are NOT taught in our schools, which is why I believed this to be one of the more important aspects of my course. It was also one that often made my students stop and stare in amazement at the everyday examples I gave in class. This book would provide for me and for them more examples of our flawed thinking processes and the rabbit holes they often lead us down.

Refusing to question our conclusions when new evidence is presented - or even when old evidence is presented in a less threatening way - is the basis of denial. And denial is a powerful defense in not only addiction, but in our everyday society.

I highly recommend this book to those who want to see how the "how" of our thinking (the process) is so often the problem in and of itself, and not the content of what we are thinking about, whether it be GMOs, or vaccines, or ECT, or the recent political chaos and blatant flawed thinking that is rampant in every newscast on any channel.
Profile Image for Kay .
730 reviews6 followers
March 11, 2017
It's just so refreshing to read a book written by two really smart people, in this case highly educated father-and-daughter healthcare professionals. I'm always looking for ways to increase acceptance of information, especially information not understood by most people. So many of us (probably me included) are quick to dismiss those who are more knowledgeable and go with the gut, or with what one believes should be true. They address chief health-related concerns. The disconcerting thing is that throwing more facts at people will not sway their beliefs. Fortunately the book does offer some recommendations, although it's always easier to work on oneself than on others. It's also a great overview of the scientific method and why testing is important, and especially important is that it be subject to replication. Even though the authors are trying to reach people, I did find times when the going was hard, which is why I give this 4 stars. I have a college degree myself (a B.A.) and hit a couple of paragraphs where I had to look up words to be certain of what was being stated. Understanding exactly what was meant was very important to me before endorsing the book. For anyone interested in why we believe the things we do, how we analyze risk, and how social media is influencing people, this is an excellent book to read.
Profile Image for Chris Boutté.
Author 8 books278 followers
April 17, 2023
I’ve had this book for months and hadn’t checked it out because it’s insanely long. I love reading books about science denial, but I figured this would just be super boring and tell me a lot of what I already know. With that being said, I’m super glad that I was wrong. Although this book is over twice as long as the books I normally read, I binged it pretty quickly because it’s so damn good.

Sara and Jack Gorman are a father-daughter team, and both have extensive scientific backgrounds and have worked in the medical/pharmaceutical industry. This book literally has everything you could want from a book about why people ignore science. It covers how to spot good science, the problems with science communication, charismatic leaders, mistakes the government and CDC have made, a ton of research I was unaware of, and so much more.

I could go on about this book forever. Like I said, although it’s long, it’s worth it. When people ask me for a book on this topic, I’m definitely pointing them to this one.
Profile Image for Andrew.
5 reviews
June 11, 2017
A great study of the fears and cognitive biases that lead people to adopt and hold onto irrational positions in the face of overwhelming evidence to the contrary. Much deeper, broad-ranging, and empathetic than I expected.

The lessons learned here are readily applied to other domains, such as (cough) US politics. It deserves to be widely read.
281 reviews2 followers
August 17, 2017
This book skillfully covers how humans operate and why. Many insights and lucid explanations with many mysteries remaining, of course. Well written.
Profile Image for Richard Derus.
4,194 reviews2,266 followers
May 18, 2025
PEARL RULED @ 40%

Rating: 4* of five

The Publisher Says: Why do some parents refuse to vaccinate their children? Why do some people keep guns at home, despite scientific evidence of risk to their family members? And why do people use antibiotics for illnesses they cannot possibly alleviate? When it comes to health, many people insist that science is wrong, that the evidence is incomplete, and that unidentified hazards lurk everywhere.

In Denying to the Grave, Gorman and Gorman, a father-daughter team, explore the psychology of health science denial. Using several examples of such denial as test cases, they propose six key principles that may lead individuals to reject "accepted" health-related wisdom: the charismatic leader; fear of complexity; confirmation bias and the internet; fear of corporate and government conspiracies; causality and filling the ignorance gap; and the nature of risk prediction. The authors argue that the health sciences are especially vulnerable to our innate resistance to integrate new concepts with pre-existing beliefs. This psychological difficulty of incorporating new information is on the cutting edge of neuroscience research, as scientists continue to identify brain responses to new information that reveal deep-seated, innate discomfort with changing our minds.

Denying to the Grave explores risk theory and how people make decisions about what is best for them and their loved ones, in an effort to better understand how people think when faced with significant health decisions. This book points the way to a new and important understanding of how science should be conveyed to the public in order to save lives with existing knowledge and technology.

I RECEIVED A DRC FROM THE PUBLISHER VIA NETGALLEY. THANK YOU.

My Review
: Over the years (almost ten since I got the DRC) I've tried and tried to finish this read. I'm already inside the church on the subject...the psychology of science denial...but I stall out on the "Confirmation Bias" chapter, where the repetitious nature of the prose just overwhelms me.

We need this information on why people simply reject objective truth...they deny it *is* either of those things...but I can't say I think this compendiously-footnoted tome is the way to get that done.
Oxford University Press asks for $21.99 for an ebook.
Profile Image for Brenda.
484 reviews1 follower
January 23, 2021
I first became interested in learning about cults and science deniers back when the anti-vaxxer and "Obama is a Muslim" conspiracies began. I picked up this book the day after the riot at the Capitol to try to understand how so many people could be lured into outright lies and encouraged to put themselves and others in jeopardy, as this group did. It is hard for me to grasp, as I tend to research everything when making important decisions. There is a lot of science in this book, but the authors do a really good job of breaking things down into layman's terms.

The biggest takeaway for me was that when a charismatic leader spouts untruths, it doesn't matter how many others show evidence to the contrary; his followers will generally believe only him. In other words, if Dr. Andrew Wakefield came out and said that his vaccine research was severely flawed and in fact untrue, then anti-vaxxers would likely believe it. If Donald Trump came out and told his followers that he was indeed lying and that the election was fair and legal, then they would likely believe him and realize their error. But the chances of them believing anyone else, even the Congress members who propagated Trump's lies, are very slim. They are in a cult, and he is their leader. We know that he will never tell them the truth, so now we must find other ways to try, slowly, to deprogram them through patience and facts over time.
Profile Image for Uninvited.
196 reviews9 followers
June 25, 2019
We all think that the main reason some people cling to unscientific notions, like believing vaccines cause autism or that GMOs cause cancer, is mainly misinformation and/or stupidity. Although those reasons may apply in some cases, this book opens up a whole new perspective on why this happens. It might be the case that criminal negligence due to denial of scientific reason is not (always) caused by misinformation or stupidity, but by different factors, which can be acknowledged and addressed properly. A very important book that should be read by as many people as possible. (And, yes, it's a bit dry; it's a book written by two medical scientists, not by freakin' Dickens... Makes you wonder if some people know what they are about to read when they open a new book.)
Profile Image for Lindsay Bolender.
570 reviews13 followers
July 6, 2020
Overall, this book was useful. It details a whole ton of science behind why people deny facts, and successfully debunks the myth that it’s always because people are uneducated or ignorant. A lot of the science was old news to me, having studied the psychology of education, but there were some things I didn’t know, and some things I hadn’t considered in the context of science denial. There were also some useful ways to think about how to approach a denial conversation in a way that will be successful. My only problem with this book was that the authors’ assertions were not always supported with research, and their political stance was very clear.
67 reviews1 follower
January 27, 2025
Very much enjoyed this book. I have struggled with how people can ignore science and believe charismatic leaders and the nonsense they spew. The authors help explain how the brain works and how our emotional, social, and psychological needs prevail over rationality.

This book was published in 2017, soon after we as a nation panicked as the Ebola virus reached our shores. I would love to know how the authors viewed the response to COVID and all the naysayers. Maybe they can do a new preface for the book!

110 reviews
September 21, 2018
Very interesting topic, as it pertains to my career and impacts how I would teach regarding vaccines, etc. Not a top pick for most people; however, this could pertain to anyone who is interested in understanding how and why we make our decisions, stick to them, and promote these ideas. I love this kind of stuff!
571 reviews
January 11, 2019
Very dry and repetitive. Basically tells us what we already knew, that those who disbelieve the scientific evidence generally do so based on emotionally charged “stories” that may or may not be actually true. Solutions are brief and difficult to implement. Not sure that this book will make a difference, but one can hope.
Profile Image for David.
603 reviews15 followers
June 29, 2019
Leaving science to scientists is dangerous. Unfortunately, the authors observe, "only after the false notions become embedded in people's minds does the scientific community mount any meager attempt to counteract it." Lastly, posing science as a threat to faith has had far-reaching negative consequences.
27 reviews
November 20, 2019
This book discusses some interesting research in why we think the way we do despite evidence to the contrary, but it too often reads like a textbook with very stilted writing. The authors lay out ideas for increasing science literacy, which would be a good start, but how many people want to think objectively?
Profile Image for Seth the Zest.
249 reviews4 followers
September 30, 2023
A dense read but surprisingly helpful in conversations with a colleague at work. The book goes into the many evolutionarily beneficial ways the human mind works that do not help us process complex, scientific ideas. I'm interested in cognitive science, and this book delivered in a big way. It also helped me understand why people distort events so readily, including me.