
Calculated Risks: How to Know When Numbers Deceive You

At the beginning of the twentieth century, H. G. Wells predicted that statistical thinking would be as necessary for citizenship in a technological world as the ability to read and write. But in the twenty-first century, we are often overwhelmed by a baffling array of percentages and probabilities as we try to navigate in a world dominated by statistics.
Cognitive scientist Gerd Gigerenzer says that because we haven't learned statistical thinking, we don't understand risk and uncertainty. In order to assess risk -- everything from the risk of an automobile accident to the certainty or uncertainty of some common medical screening tests -- we need a basic understanding of statistics.
Astonishingly, doctors and lawyers don't understand risk any better than anyone else. Gigerenzer reports a study in which doctors were told the results of breast cancer screenings and then were asked to explain the risks of contracting breast cancer to a woman who received a positive result from a screening. The actual risk was small because the test gives many false positives. But nearly every physician in the study overstated the risk. Yet many people will have to make important health decisions based on such information and the interpretation of that information by their doctors.
Gigerenzer explains that a major obstacle to our understanding of numbers is that we live with an illusion of certainty. Many of us believe that HIV tests, DNA fingerprinting, and the growing number of genetic tests are absolutely certain. But even DNA evidence can produce spurious matches. We cling to our illusion of certainty because the medical industry, insurance companies, investment advisers, and election campaigns have become purveyors of certainty, marketing it like a commodity.
To avoid confusion, says Gigerenzer, we should rely on more understandable representations of risk, such as absolute risks. For example, it is said that a mammography screening reduces the risk of breast cancer by 25 percent. But in absolute risks, that means that out of every 1,000 women who do not participate in screening, 4 will die; while out of 1,000 women who do, 3 will die. A 25 percent risk reduction sounds much more significant than a benefit that 1 out of 1,000 women will reap.
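The two framings come from the same arithmetic; here is a minimal Python sketch using the figures quoted above (4 and 3 deaths per 1,000 women; variable names are mine):

```python
# Mortality figures quoted in the description: 4 deaths per 1,000 unscreened
# women vs. 3 deaths per 1,000 screened women.
deaths_without_screening = 4 / 1000
deaths_with_screening = 3 / 1000

# Absolute risk reduction: 1 woman in 1,000 (0.1 percentage points).
absolute_reduction = deaths_without_screening - deaths_with_screening

# Relative risk reduction: the widely advertised "25 percent".
relative_reduction = absolute_reduction / deaths_without_screening
```

The 25 percent figure is arithmetically correct; the book's point is that the absolute framing (1 in 1,000) is what people actually need in order to judge the benefit.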
This eye-opening book explains how we can overcome our ignorance of numbers and better understand the risks we may be taking with our money, our health, and our lives.

320 pages, Paperback

First published January 1, 2002

118 people are currently reading
2276 people want to read

About the author

Gerd Gigerenzer

47 books · 311 followers
Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making, especially in medicine. A critic of the work of Daniel Kahneman and Amos Tversky, he argues that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus.

Gerd Gigerenzer is a German psychologist, director since 1997 of the Adaptive Behavior and Cognition research group and since 2009 of the Harding Center for Risk Literacy, both at the Max Planck Institute for Human Development in Berlin. He is married to Lorraine Daston.

Gigerenzer works on bounded rationality, heuristics, and simple decision trees, that is, on the question of how one can make rational decisions when time and information are limited and the future is uncertain (see also decision making under uncertainty). He became known to the broader public through his book Bauchentscheidungen (published in English as Gut Feelings), which has been translated into and published in 17 languages.

[English bio taken from English Wikipedia article]

[German author bio taken from the German Wikipedia article]

Ratings & Reviews


Community Reviews

5 stars
215 (32%)
4 stars
272 (40%)
3 stars
133 (20%)
2 stars
34 (5%)
1 star
11 (1%)
Displaying 1 - 30 of 74 reviews
Profile Image for Gumble's Yard - Golden Reviewer.
2,189 reviews · 1,794 followers
April 17, 2018
Interesting but disappointingly one-dimensional book whose sole idea is that:

The medical (and legal) community communicates risk badly, partly due to a misguided belief in infallibility/certainty but mainly due to an inability to understand or compute Bayesian-type uncertainties (what is the probability of having breast cancer given a positive mammogram test; what is the chance of being the source given a positive DNA “fingerprint”) in cases where the chances of both a false positive and a false negative may be small, but the relatively low prevalence of the condition renders the frequency of false positives high compared to true positives (he in turn claims that these problems can be dealt with by a frequency representation).

However, the book has very little to say on the contentious issue of assessing the prior in the first place (which in medical terms equates to understanding the prevalence of the disease; in legal terms, the population of possible sources of the DNA).
Profile Image for Laleh.
117 reviews · 9 followers
April 26, 2017
I cannot recommend this book enough.

For me, as a medical student, the day-to-day applications of this book are endless.
Basically it teaches you to fully grasp the concept of uncertainty in situations, and convey it to other people in a way that doesn't distort any of the important stuff.

I've passed 4 statistics courses so far, and this book is by far the most useful text on the correct use of statistics I have ever seen.
Profile Image for Ken Van Allen.
31 reviews · 4 followers
March 1, 2008
This book is a shortcut to statistical numeracy. Gigerenzer goes beyond being merely informative and helps the reader understand how to interpret risks, and what questions to ask to get the information needed to quantify them properly.
Profile Image for ☘Misericordia☘ ⚡ϟ⚡⛈⚡☁ ❇️❤❣.
2,526 reviews · 19.2k followers
December 27, 2019
Q:
What is going on in our minds? (c)
Q:
My agenda is to present mind tools that can help my fellow human beings to improve their understanding of the myriad uncertainties in our modern technological world. The best technology is of little value if people do not comprehend it. (c)

An easy-peasy read on how, if one uses statistics, one had better have a good grasp of it or ELSE.

I don't remember why I wanted this book but, well, it's an extra easy read for people thrice removed from anything quantitative.

Nice trivia bit. Just how many people were wrongly convicted due to this?
Q:
Recently, however, the Federal Bureau of Investigation performed a test of the reliability of fingerprint evidence that had never been done before. In 1998, Byron Mitchell appealed his conviction for having driven the getaway car in a robbery in Pennsylvania in 1991. The conviction was based on two latent fingerprints, one found on the steering wheel and the other on the gearshift of the car. The FBI decided to test the reliability of the reported match and sent the latent fingerprints along with Mr. Mitchell’s inked prints to the laboratories of 53 various state law enforcement agencies. From the 35 laboratories that responded, 8 failed to find a match for one of the prints, and 6 failed to find a match for the other, making the average failure rate one in five tests. This troublesome result casts considerable doubt on the reliability of fingerprinting. America’s National Institute of Justice has finally provided funding to study how good fingerprinting actually is.
Fingerprint evidence has been accepted as certain for more than a century, following Galton’s estimate. This calculation was made for ideal conditions, which are not found in the real world of incomplete and latent prints. When DNA fingerprinting was introduced into the courts, almost a hundred years after Galton’s seminal work, the public and many experts projected the illusion of certainty onto this new technology. As we see in Chapter 10, DNA fingerprinting has also been declared “failsafe.” The fiction that fingerprinting, DNA fingerprinting, HIV tests, or other excellent new technologies are absolutely foolproof is like a dream that comes back night after night, fulfilling a deep unconscious wish. (c)
Q:
When I was a child, I was told on good authority never to drink water after eating cherries, or I would get very sick and might even die. It never occurred to me to doubt this warning. One day I shared an ample serving of cherries with an English friend who had never heard of this danger. To my horror, I saw him reach for a glass of water after eating some of the cherries. I tried to stop him, but without success; he just laughed. He took a sip, and nothing happened. Not only did he not die; he did not even get sick. That experience cured me. (c)
Q:
A psychiatrist friend of mine prescribes Prozac to his depressive patients. Like many drugs, Prozac has side effects. My friend used to inform each patient that he or she had a 30 to 50 percent chance of developing a sexual problem, such as impotence or loss of sexual interest, from taking the medication. Hearing this, many of his patients became concerned and anxious. But they did not ask further questions, which had always surprised him. After learning about the ideas presented in this book, he changed his method of communicating risks. He now tells patients that out of every ten people to whom he prescribes Prozac, three to five experience a sexual problem. Mathematically, these numbers are the same as the percentages he used before. Psychologically, however, they made a difference. Patients who were informed about the risk of side effects in terms of frequencies rather than percentages were less anxious about taking Prozac—and they asked questions such as what to do if they were among the three to five people. Only then did the psychiatrist realize that he had never checked how his patients understood what “a 30 to 50 percent chance of developing a sexual problem” meant. It turned out that many of them had thought that something would go awry in 30 to 50 percent of their sexual encounters. For years, my friend had simply not noticed that what he intended to say was not what his patients heard. (c)
Q:
Susan’s ordeal illustrates the illusion of certainty; the Prozac and DNA stories are about risk communication; and the mammogram scenario is about drawing conclusions from numbers. This book presents tools to help people to deal with these kinds of situations, that is, to understand and communicate uncertainties. (c)
Q:
illusory certainty seems to be an adaptive response that for ages has protected humans, especially children, from trying to learn first-hand about possible dangers, such as which kinds of food are poisonous and which are not. Similarly, young children are prepared to believe in values, rules, and stories without question, which facilitates their integration into their social group and culture. Social conventions— whether learned from one’s family or one’s wider culture—are, like elementary perception, a source of the illusion of certainty.
Illusory certainty is part of our perceptual, emotional, and cultural inheritance. It can provide us with images of our environment that are useful, although not always correct, as well as with feelings of comfort and safety. (c)
Q:
Throughout history, humans have created belief systems that promise certainty, such as religion, astrology, and divination, systems in which people—particularly those experiencing terrible suffering—can find comfort. (c)
Q:
Certainty has become a consumer product. It is marketed the world over—by insurance companies, investment advisers, election campaigns, and the medical industry. (c)
Q:
In seventeenth-century Europe, buying life insurance meant making a bet on the duration of a prominent person’s life, such as whether the mayor of Paris would die within three years. If he died within the period on which you had wagered, you made a small fortune. Nowadays, insurance agencies have persuaded us that life insurance is about safety and certainty and that it is morally responsible to bet against our own lives for the sake of our survivors’ economic security. (c)
Q:
Patients tend to develop views of “fate” or “Inshallah” rather than learning to practice informed consent. (c)
Q:
Institute of Medicine estimated that some 44,000 to 98,000 patients are killed every year in U.S. hospitals by preventable medical errors and misadventures. It’s as if one lived in a culture where death is a desirable transition from one life to a better one.
Dr. B: Isn’t that a bit much? That’s saying more people die from hospital accidents than from motor vehicle accidents, or from AIDS.
...
These errors were preventable, such as when a physician prescribed an antibiotic to a patient with a history of documented allergic reactions without consulting the medical records. The general problem in medicine is that, unlike in aviation, there is no system for reporting errors without punishing the individual doctor. Pilots anonymously report “near misses” to a central database so that other pilots can learn from them and improve air transport safety. Aviation has focused on building safe systems since World War II, and U.S. airline fatalities have decreased ever since. In 1998, there were no deaths in the United States in commercial aviation. Health care has no such system. (c)
Q:
If I were to start explaining to patients the benefits and harms of a potential treatment, they would hardly comprehend it. (c)
Q:
The reported risk of an Ariane accident was, hence, based on a propensity, not a frequency, interpretation. (c)
Q:
this book presents mind tools for overcoming innumeracy that are easy to learn, apply, and remember. I focus on three kinds of tools: Franklin’s law for overcoming the illusion of certainty, devices for communicating risk intelligibly, and the use of natural frequencies for turning clouded thinking into insight. (c)
Profile Image for Kris Fabick.
57 reviews · 5 followers
May 11, 2017
Although this book was pretty dense and took me a while to read, I think Gigerenzer did an excellent job of explaining (often convoluted and complex) statistical reasoning in simple terms so that a common (non-mathematically excitable) person can see the error that so many professionals (doctors, lawyers, and scientists) make every day. This book offered me that unique experience when you have felt something to be true for so many years of your life but never quite knew how to put words to the murkiness in your head. He clearly expresses the REASON people are often confused by statistical risk calculations and defines appropriate and immediately applicable solutions to help our society become numerate (the mathematical equivalent of being literate) and capable of making our own decisions about what risks we are willing to take in regards to health screenings (mammograms, HIV testing, etc.) and interpreting potentially life-altering data in criminal trials that use fingerprint matching or DNA profiling.

Many people believe cancer screenings are ALWAYS beneficial, but Gigerenzer makes some compelling arguments about how some screenings detect "cancers" that would never amount to anything in the timespan of one's life (yet women will often have entire breasts removed to prevent a cancer that would never have caused them any symptoms or discomfort, much less death). Furthermore, many screenings involve the use of x-ray technology that can actually INDUCE cancerous cells to begin growing. This is just one example of how reading this book has enlightened me as to some of the ways our culture breeds a widespread false sense of security due to our ignorance and lack of education about risks and how to talk about them.

As a math educator, I felt particularly piqued and called to action by the examples in this book. Eventually, I would like to teach statistics courses in which I may formally address such innumeracy issues as Gigerenzer would hope. In the meantime, I still feel a responsibility to my students to try to instill in them an understanding that the world at large (including mathematics!) has more that is unknown, debated, and up for judgment than is certain and stable. Two kinds of examples will certainly hold students' attention and help them see the value of numeracy (and the danger of innumeracy!): real-world court cases that were overturned once the defendant's attorney cleared up the statistical misunderstandings (simply by using natural frequencies like 4 in 1,000 rather than probabilities like 0.4%) that had convinced a jury of peers (most likely not mathematicians!) in the first trial; and real-world vignettes of people who received a positive HIV result, lost their job, marriage, and home, had unprotected sex with other HIV-positive individuals (because, why not at that point?), and killed themselves, only for an autopsy to reveal that the positive HIV result was a false positive.

Profile Image for Squib.
125 reviews · 1 follower
August 12, 2023
Apparently, GPs and surgeons are not as numerate as you would hope. This made me feel better, as I don't know all my times tables, although obviously it didn't make me feel better in terms of being a patient. Gigerenzer presents many jaw-dropping stories in a soothing Vulcan manner, mainly to do with health screening (breast cancer in particular), DNA as evidence, HIV (as in people being told they have it when they don't, and vice versa), and the spin put on statistics by using relative risk as opposed to absolute risk (or vice versa, depending on the spin).
Profile Image for Joseph.
68 reviews · 3 followers
November 9, 2007
Worth a read if you want to start thinking about the data that is quoted at you from various sources. How to think about probability & statistics *easily*, the statistical "illiteracy" of professions that need to know better, how stating the same thing in different ways yields different reactions from people...

Excellent book.
Profile Image for Anne (ReadEatGameRepeat).
852 reviews · 79 followers
June 15, 2020
This was an interesting read. I had to read it for a risk communication and uncertainty module as part of my master's course. I thought things were explained really well and there were interesting examples, but to be honest, at one point all the disease things just started getting a little triggering for me and I ended up putting the book down for a good while. I am excited to read more by this author even if it's not for a course.
Profile Image for Dr. Tobias Christian Fischer.
706 reviews · 37 followers
February 25, 2021
Trust no statistic you haven't faked yourself. That could serve as a summary of this book. So simple, and so banal. Whether DNA tests or probabilities, they should be simple and understandable.
Profile Image for Brian Powell.
204 reviews · 36 followers
January 15, 2020
People suck at probabilities. Apparently highly educated professionals like doctors and lawyers are indisputably awful at math. Thinking in terms of natural frequencies (the raw numbers) is better for most people, including the doctors and lawyers who are indisputably awful at math. This awfulness is thrust upon the general public, and it has led to wrongful imprisonment, unnecessary surgeries, and suicides.

Some examples: Researchers and marketers of diagnostic tests are generally woefully (maybe willfully?) ignorant of false positive rates, false negative rates, and how these compare to rates of incidence and why this totally matters when deciding whether to get tested. People have committed suicide over false positive AIDS tests.

Beware of various shenanigans in the marketing of such tests, like references to the generally higher relative risk reduction when trying to sell a test to the public. People have had breasts removed on the basis of faulty screening that they were enticed into receiving given the impressively high but mostly irrelevant measure of relative, versus absolute, risk reduction.

Lawyers will confuse juries, judges, reporters, and themselves over what a person having a match to crime scene DNA means about that person's likelihood of guilt. Because they're bad at math and generally dishonest, lawyers will argue that the probability of innocence is equal to the chance of a random match, which is usually abysmally low and therefore seems to argue for guilt. For example, say a person has a 1-in-a-million match to some DNA. Sounds guilty, right? But what if there are 2 million people in the city matching the other characteristics of the suspect? Now it's only 50/50. People have actually gone to prison and been put to death over this kind of thing. Yes, this is important stuff, and the book exhaustively and minutely explores these errors in risk calculation and communication. The main themes are strong but few, and I got bogged down in the sheer number of examples.

In short, here's a maxim to protect us from this kind of dangerous, shoddy thinking and manipulation: Always keep Bayes' theorem in your back pocket so that you can interpret risks appropriately.
Profile Image for Terrie.
349 reviews · 8 followers
February 14, 2010
Great book. I was surprised to find that it was from 2002 b/c the information about the questionable benefits of regular mammograms in women < 50 was just recently in the news. The book is largely slanted toward similar examples in health and medicine; the chapter on DNA evidence and how poorly it's presented in court was eye-opening.

Conversely, great tips for how to obfuscate and take advantage of the general public's innumeracy when you are presenting data.
Profile Image for Victoria.
4 reviews
December 8, 2013
The author was very didactic. His language was precise and unambiguous, without being filled with unnecessary jargon. Frequently, he reiterated many points from early in the book in later chapters. This guy stands firmly by his claims and backs them up with research. I gave this book 4 stars because the content is original and eye-opening. Other math books reference this book.
Profile Image for Irio.
21 reviews · 35 followers
May 31, 2015
Setting aside my impression that it over-describes, at times repeating what had already been said, it gives beautiful insights into how to deal with numerical information in general.
Profile Image for Clive F.
180 reviews · 18 followers
September 3, 2021
This is a lucid and compelling book about how poorly we understand risk, and how we may understand it much better simply by looking at it in a slightly different way. Right now, we're all faced with statistics and percentages that can crucially affect the health of ourselves and our loved ones, so this is a really timely book to dive into.

The author's basic argument is that we don't understand risks (in particular, conditional risks) when they're explained in percentages, but we do understand them when they're explained in what he calls "natural frequencies". He explains this very carefully, so don't worry if you didn't get past percentages at school; that's his point, you don't need to.

Here's an example of a conditional risk, for asymptomatic women aged 40 to 50:
"The probability that one of these women has breast cancer is 0.8%. If a woman has breast cancer, the probability is 90% that she will have a positive mammogram. If a woman does not have breast cancer, the probability is 7% that she will still have a 'positive' mammogram.
Imagine a woman who has had a positive mammogram. What is the probability that she actually has breast cancer?"

When asked this question, most physicians - even specialists in the field - get the answer wildly wrong. 24 physicians with an average of 14 years' experience gave estimates ranging from 1% to 90%, with a third of them giving a probability of 90%.

Coming up with the answer of 90% is called "base-rate neglect" - ignoring the fact that breast cancer is relatively uncommon. For example, just because 90% of people from Norway are blonde, it wouldn't be the case that if you met a blonde person, there's a 90% chance they're from Norway. There are lots of blonde people who aren't Norwegians! Similarly, in this case, because 7% of people without breast cancer still end up with a positive mammogram, and there are many more people without breast cancer than with it, we can't use the 90% number.

Here's the same data, presented as natural frequencies, which makes the whole thing much clearer:
"Eight out of every 1,000 women have breast cancer. Of these 8 women with breast cancer, 7 will have a positive mammogram. Of the remaining 992 women who don't have breast cancer, some 70 will still have a 'positive' mammogram. Imagine a sample of women who have positive mammograms in screening. How many of these women actually have breast cancer?"


Presented in this way, the answer is obvious: 77 women out of 1000 got a positive mammogram, but only 7 of them actually had breast cancer. Not a percentage in sight. The other women were the victims of 'false positives'. These can occur for all sorts of reasons, but they cause real problems - as the author says
"False positives take a considerable toll on women's bodies and psyches. About half of women who participate in regular screening are affected by this cost of mammography screening"
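The two representations above can be checked side by side in a few lines of Python. This is a sketch using the figures quoted in this review (0.8% prevalence, 90% sensitivity, 7% false-positive rate); variable names are mine:

```python
# Figures quoted in the review above.
prevalence = 0.008            # 0.8% of asymptomatic women aged 40-50
sensitivity = 0.90            # P(positive | cancer)
false_positive_rate = 0.07    # P(positive | no cancer)

# Conditional-probability (Bayes) version.
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive_rate
p_cancer_given_positive = prevalence * sensitivity / p_positive

# Natural-frequency version, per 1,000 women.
with_cancer = 1000 * prevalence                               # 8 women
true_positives = with_cancer * sensitivity                    # about 7
false_positives = (1000 - with_cancer) * false_positive_rate  # about 70
freq_answer = true_positives / (true_positives + false_positives)

print(f"P(cancer | positive) = {p_cancer_given_positive:.1%}")  # prints 9.4%
```

Both routes give the same answer, roughly 1 in 10; the frequency route just makes the 7-out-of-77 structure visible without any formula.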

Gigerenzer gives example after example: PSA tests for prostate cancer in men, FOB tests for colon cancer in anyone. Professionals don't explain risks well, and almost everyone has trouble understanding these kinds of conditional probabilities when they're presented as percentages. Indeed, in many cases, it's very likely that the risks are deliberately being presented in a misleading way, to serve some particular agenda.

Here's an example of different ways of presenting the same numbers which can lead the reader in one direction or another: in a study of treating coronary artery problems with bypass surgery versus nonsurgical medical therapy, 350 out of 1,325 patients died with surgery (26.4%), while 404 patients out of 1,324 (30.5%) died with the nonsurgical therapy.

How is this best presented?
- the absolute risk reduction is 4.1% (30.5% minus 26.4%);
- the relative risk reduction with surgery is 13.4% (because 4.1% is 13.4% of 30.5%);
- The percentages of surviving patients are 73.6% and 69.5%, for surgical versus nonsurgical treatment;
- the number of patients needed to be treated to save one life is 25 (for every 100 patients treated surgically, 4 more will survive).

Which presentation option is chosen really matters! 140 members of UK health authorities were given this data in different ways, and while 76% of them were willing to fund the surgical treatment when given the relative risk reduction number, only 53% were willing to fund exactly the same treatment when they were instead given the percentages of surviving patients.
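All four presentations derive from the same two raw counts; here is a quick Python check (variable names are mine):

```python
# Raw counts from the bypass-surgery study cited in the review.
deaths_surgery, n_surgery = 350, 1325
deaths_medical, n_medical = 404, 1324

risk_surgery = deaths_surgery / n_surgery   # about 26.4%
risk_medical = deaths_medical / n_medical   # about 30.5%

absolute_rr = risk_medical - risk_surgery   # about 4.1 percentage points
relative_rr = absolute_rr / risk_medical    # about 13.4%
survival_surgery = 1 - risk_surgery         # about 73.6%
survival_medical = 1 - risk_medical         # about 69.5%
nnt = 1 / absolute_rr                       # about 24.4, rounded to 25 above
```

Same facts, four framings; which framing a funding body hears apparently changes its decision.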

Although the numbers do not lie, the same underlying facts can persuade or dissuade, depending on how they're used. But - and this is the author's crucial point - not if we really understand the numbers, presented as natural frequencies!

There's a lot I didn't cover in this review - the discussion of problems with DNA and fingerprint evidence is eye-opening, for example - so I would definitely encourage you to read this yourself. How many discussions of COVID-19 data have you seen, where the same numbers were being used and abused by different sides to support their own agenda? This book will help you sort through the mess, and come to your own conclusions!
15 reviews
July 11, 2021
This book gives various real-life examples showing how difficult it is for most of us to reason with probabilities, such as the following. Suppose studies say that about 90% of women with breast cancer will test positive in a mammogram. What can you conclude from this data if your mammogram result comes back positive? It seems, based on studies by the author himself, that many, including medical practitioners, would conclude that your chances of having breast cancer are high, about 90%. Such an estimate is wrong by a large margin, though. Why do so many practitioners and patients fail at this estimation?

The author insists that the problem lies in the way we calculate likelihoods: we tend to use Bayes' formula based on probability values when we should be using frequencies instead. He is right. Calculating conditional probabilities using frequencies is easier, more intuitive, and less error-prone than using Bayes' formula. But it seems to me that we fail before the calculation attempt. We fail at formulating the right question. See, in the example above, the 90% value is drawn from a population of sick people. You don't know yet whether you belong to that population, which is probably the whole point of getting tested. In this example, you belong to the population of women who tested positive, not (yet) to the population of women with breast cancer. Therefore, what we need to know is the chance of having breast cancer given a positive test, which is different from the chance of testing positive if one had breast cancer.

Such subtleties in a problem's formulation are notoriously difficult to catch, and I strongly advise the reader to look for them when reading this book. A good illustration is provided in Chapter 9. The author tells the story of a husband accused of killing his wife. The prosecution argued that a history of spousal abuse reflects a motive to kill. The defence countered that such an argument is nonsensical because, of 100,000 women who are battered by their husbands or boyfriends, only 45 are killed by their batterers. This gives a low probability indeed, of 0.00045. In this case, the defence was talking about the chance that a woman who has been battered by her partner is killed by her partner. But, but, but, the spouse of the accused is already dead! So we should instead be interested in the chance that a murdered woman who had been battered by her partner was killed by her partner. Omitting the detail that the woman is already dead was actually very clever of the defence. In this new formulation, the chances increase all the way up to roughly 90%, because among the 45 murdered women who were previously battered by their partner, 40 were killed by that partner. So battering is indeed a good predictor of whether the husband is the murderer, as the prosecution claimed.
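The two questions being conflated here can be told apart in a few lines of Python; a sketch using the figures given in this review:

```python
# Figures as given in the review (per year).
# The defence's number: of 100,000 battered women, 45 are killed by their partner.
p_killed_given_battered = 45 / 100_000   # 0.00045 -- sounds negligible

# The relevant number: of the 45 murdered women who had been battered,
# 40 were killed by the battering partner.
p_partner_given_murdered = 40 / 45       # prints as roughly 89%

print(f"{p_killed_given_battered:.5f} vs {p_partner_given_murdered:.0%}")
```

Conditioning on the right event (the wife is already murdered) turns a one-in-two-thousand figure into a near-certainty.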

The conclusion is that we should be crystal clear and precise about what we are measuring, evaluating, assessing, discussing, etc. Such a skill is hard to master, but worth pursuing.

What about risk? In fact, I picked up this book thinking that it would help me understand risk in its various aspects. It didn't really. The book may help you understand how risk is expressed, that's all. That said, the principles the book teaches are applicable to the analysis of risk. For example, the author dedicates some time to explaining why relative risk is misleading and why we should be looking at absolute risk instead. Assume you are considering flying. Then I tell you that the risk of dying in a plane accident is 50000000% higher when you get into a plane than when you don't (people on the ground may still be hit by a crashing plane). You may find my statement either silly or intimidating. That's because I used risk as the ratio between your odds of dying in a plane crash as a passenger and the odds of dying as a non-passenger (relative risk), which might not be super useful. Now, if I were to express risk in absolute values, then I would tell you that your chances of getting hit by a plane on the ground are 0.0000000000001, while your chances of dying as a passenger are 0.000005, which means that your risk has increased by 0.0000049999999. Only then can you conclude that the risk of being involved in a plane accident as a passenger is actually very low; it just happens that the odds of dying as a non-passenger are a lot lower.
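The flying example can be made concrete; a sketch using the same illustrative, made-up figures from this review (not real accident statistics):

```python
# Illustrative figures from the review (not real accident statistics).
p_ground = 1e-13      # dying from a plane while staying on the ground
p_passenger = 5e-6    # dying in a crash as a passenger

relative_increase = p_passenger / p_ground   # an enormous multiple...
absolute_increase = p_passenger - p_ground   # ...of a still-tiny probability

print(f"{relative_increase:.0f}x relative, {absolute_increase:.13f} absolute")
```

A 50-million-fold relative increase and an absolute increase of about five in a million describe the same situation; only the second tells you whether to worry.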

If, incensed by the vagueness of my previous statement, you jump from your chair and throw your computer out the window, then you may not need this book. Otherwise, pick it up and have fun!
minhhai
141 reviews · 17 followers
March 11, 2018
The book discusses different presentations of statistical numbers, especially for rare conditions such as HIV infection and breast cancer, for which false positives are prevalent. The author persuasively argues that ineffective presentations using relative or single-event probabilities cloud the minds of readers (including medical and legal professionals) and induce misleading perspectives. He then advocates the natural frequency approach to conveying risk assessments.
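The natural-frequency reframing the reviewer describes can be sketched numerically. The prevalence, sensitivity, and false-positive rate below are illustrative assumptions chosen to resemble the breast-cancer screening example, not exact figures from the book:

```python
# Natural-frequency sketch of a screening test's positive predictive value.
# All rates are illustrative assumptions, not real clinical data.

population = 1000
prevalence = 0.008           # 8 in 1,000 women have the disease
sensitivity = 0.90           # fraction of the sick who test positive
false_positive_rate = 0.07   # fraction of the healthy who test positive

sick = population * prevalence                # 8 women
true_pos = sick * sensitivity                 # about 7 test positive
healthy = population - sick                   # 992 women
false_pos = healthy * false_positive_rate     # about 69 test positive

# Of all women who test positive, how many actually have the disease?
ppv = true_pos / (true_pos + false_pos)
print(f"{true_pos:.0f} of {true_pos + false_pos:.0f} positives are real "
      f"-> PPV = {ppv:.0%}")
```

Counting people instead of manipulating conditional probabilities makes the surprising answer (a positive result means roughly a one-in-ten chance of disease here) easy to see.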

This is an important topic for everyone. I myself have encountered such a false positive. Despite regarding my maths skills as above average, I didn't know how to deal with the false positive and negative rates. The book's natural frequency approach feels natural, clear and very effective.

The majority of the book is not about probability theory but about its social impact, especially in medical examinations and legal cases. Any symptom or piece of evidence comes with a certain degree of uncertainty, and evaluating that uncertainty is a crucial step towards a conclusion. The book lays out a number of stunning examples in which misrepresentation of risks led to sad endings: huge costs for healthy people, wrongful convictions of the innocent.

This is an insightful and easy read, with no maths required. The text is somewhat lengthy and repetitive, but accessible to a wide audience, especially non-technical readers.
Bernard English
264 reviews · 3 followers
October 13, 2019
A really useful guide to thinking a bit more clearly about the simple probabilities encountered in everyday life. Includes a useful glossary at the end. Illustrates how to use tree diagrams to simplify probability problems presented with percentages. He goes over the different measures of risk so that you are not so easily fooled by medical and pharmaceutical claims. The book's message may seem somewhat simplistic to those with a life science or mathematical background, but if you aren't sure why relative risk measures can be misleading, read it! If you blindly trust your medical professional for advice on whether or not to get surgery, or to interpret some diagnostic test, read the book. And finally, my most embarrassing misunderstanding: I was taken in by the prosecutor's fallacy. He illustrates this with the controversial O.J. Simpson trial of the '90s. DO NOT serve on a jury until you are clear about the fallacy and how legal evidence can be manipulated. I got over it, though, since many doctors and lawyers also have problems interpreting probabilities and statistics. Of historical note, the notorious Dreyfus case involved an "abuse of probability theory" according to D.H. Kaye, as explained in his essay "Revisiting Dreyfus: A More Complete Account of a Trial by Mathematics."
Maurizio Codogno
Author of 66 books · 143 followers
November 17, 2017
The basic idea was excellent: to show even those who have never studied statistics that the probabilities assigned to certain events are not at all what people think. The examples are also chosen to be interesting: seeing how, at the O.J. Simpson trial, the defence lawyer managed to turn the tables and convince the jury that the probability that his client was a murderer was much lower than it really was; or calculating how testing positive on an AIDS test doesn't mean much if you don't belong to a risk group. Yet Quando i numeri ingannano (the Italian edition) was, at least for me, a disappointment. I don't know whether it's because I already know the concepts, at least in theory, or because the book is too repetitive and I get bored quickly. But I guarantee I could have condensed the same material into two hundred pages.
Toni Borisov
26 reviews
July 26, 2020
This book is an introduction to statistical thinking. Gigerenzer shows how easily scientific research can be misinterpreted, or blatantly used to deceive the majority of people, because of their innumeracy. He teaches ways in which a professional in a given field can understand probabilities better and so communicate risks properly.

Not only that, but in this book you will read true and horrifying stories of people who suffer, are accused of crimes they didn't commit, or even die because of the unwitting or potentially deliberate fallacies committed by lawyers and health professionals.

These are real problems in today's society, and few people talk about them. Statistics is important; read "Reckoning with Risk" to find out why!
Abdullah Cemil Akcam
40 reviews
August 13, 2022
One of the best books I have ever read on understanding uncertainty. We live in an uncertain world in which events happen with certain probabilities, but we humans are not fond of uncertainty. We think that most things are certain, including medical tests, expert opinions and evidence used in court rulings. I liked this book most for the guiding viewpoint it gives into the everyday life I am a part of. I can now look at life differently thanks to the examples provided in the book. Lastly, the natural frequency representation of risks, as an alternative to probabilities, is a great invention that gives people the power to grasp complex situations involving conditional probabilities, imo. I recommend everyone read this excellent book.
Hess
315 reviews · 8 followers
February 24, 2025
This book has a fantastic central message - namely that statistical innumeracy is a framing problem - buried in an avalanche of disturbing facts and anecdotes.

I didn't come out of this book feeling wiser or more math-confident than I felt going into it. To that end, I am now reading Gerd's follow-up, Risk Savvy.

I did however, come out of this book feeling severely chastened about the social risks caused by innumeracy, the problem of passively accepting ourselves as incurably "innumerate", and the need to do something about it.

A disturbing and important read, and one that I'd recommend to any law or medical student going into those fields today.
124 reviews
July 1, 2019
An extremely approachable presentation of confrontational, thought-provoking ideas that question how we perceive our own ability to think.
I strongly recommend this read to all, to inspire such worthy topics of thought as questioning decision-making in the context of personal health and applying the available inferences from statistical analysis of collected data sets.
33 reviews
July 7, 2019
A very important theme: the need to break the illusion of certainty. The author gives plenty of examples and illustrates ways to fight innumeracy. However, in the end everything seems to boil down to presenting conditional probabilities as natural frequencies. That is a great idea, but there are plenty of other ways to fight innumeracy.
124 reviews
September 26, 2023
A very difficult book to read, though perhaps that is user error: I do not understand the background material of statistics. This is true.

It's unfortunate the author could not explain parts of this to a layperson with some aid.

It's not a good book to pick up casually and was quite a frustrating read.
Adrian Salajan
11 reviews · 24 followers
February 27, 2021
Just a few, but very powerful, ideas on how to interpret and look at percentages and statistics.
It is full of medical examples; I sometimes wished there were fewer of them and skipped some, but by the end of the book I found that I remembered more ideas than from other books.
33 reviews
June 2, 2024
An interesting insight (people are bad at understanding probabilities expressed as percentages and much better at understanding natural frequencies, e.g. "1 person in 10,000") is stretched thin over 330 pages. It would have fit into 100.
2 reviews
January 9, 2025
Helps you understand how to see risk through data and statistics rather than emotions. A must-read for everyone, in my opinion: even though it talks about a lot of medical applications, it can easily be applied to any area where risk exists (business, industry, etc.).
Phil
260 reviews · 3 followers
November 12, 2018
2.5. An important idea, but it could have been done in a short paper. The book was kind of repetitive.
60 reviews · 7 followers
February 11, 2019
Understanding data and stats in today's world is critical. And could even save your life!