Risiko: Wie man die richtigen Entscheidungen trifft

Remember the worldwide fear of swine flu, when experts predicted an unprecedented pandemic and vaccine was produced for millions of people, only to be quietly disposed of later. For Gerd Gigerenzer, this is just one piece of evidence of our irrational way of dealing with risk, and that goes for experts just as much as for laypeople. Using examples from medicine, the legal system, and the financial world, he explains how the psychology of risk works, what it has to do with the evolutionarily ancient structures of our brains, and what dangers come with it. Along the way he analyzes the unfortunate role of misleading information spread by the media and by professionals. Yet everyone can, and should, learn to assess risks and uncertainties correctly. Gigerenzer has been field-testing this kind of risk training for many years, with astonishing results. His conclusion: even children can learn to deal with risk realistically and to immunize themselves against scaremongering and trivialization alike.

400 pages, Kindle Edition

First published March 18, 2013

332 people are currently reading
5496 people want to read

About the author

Gerd Gigerenzer

47 books, 311 followers
Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making, especially in medicine. A critic of the work of Daniel Kahneman and Amos Tversky, he argues that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus.

Gerd Gigerenzer is a German psychologist, director since 1997 of the Center for Adaptive Behavior and Cognition and since 2009 of the Harding Center for Risk Literacy, both at the Max Planck Institute for Human Development in Berlin. He is married to Lorraine Daston.

Gigerenzer works on bounded rationality, heuristics, and simple decision trees, that is, on the question of how to make rational decisions when time and information are limited and the future is uncertain (see also decision-making under uncertainty). He became known to the broader public through his book Gut Feelings (Bauchentscheidungen), which has been translated and published in 17 languages.

[English bio taken from English Wikipedia article]

[German author bio taken from the German Wikipedia article]

Community Reviews

5 stars: 605 (32%)
4 stars: 793 (42%)
3 stars: 365 (19%)
2 stars: 90 (4%)
1 star: 14 (<1%)
Tevfik
Author of 18 books, 603 followers
June 17, 2017
I was in favor of the Kahneman & Tversky school, a branch of bounded rationality called "cognitive biases and heuristics," until I read this book. Now I have noticed that there is another point of view, called "fast and frugal heuristics," led by Gerd Gigerenzer. This book partially explains that approach, which shows that some rules of thumb relying on heuristics may perform better than a complex strategy proposed for the same problem. I have also become aware of the demarcation between risk and uncertainty by reading this book. Moreover, I think the book fulfils its purpose of making the reader a risk-savvy person.

I strongly recommend this book, both to people interested in a life with risk literacy and to experts studying risk, cognition, or safety management issues (including medical doctors).
Andy
849 reviews, 5 followers
November 13, 2018
A high-level discussion that tends to use substantially more words than necessary to get the point across. Written like most business books, where you get a somewhat vague point, followed by an anecdote, and maybe a slightly better explanation of the point. My bigger issue is that Gigerenzer dismisses the ideas put forth by Kahneman, Tversky, and others without honestly engaging with them. Kahneman's and Tversky's idea, as presented in Thinking, Fast and Slow, does not say that intuition is always error-ridden and that conscious thought is always logical and computational. They also do not laud the value of complex models, nor do they pretend their model applies universally. Mischaracterizing these arguments in order to dismiss them undermines Gigerenzer's credibility, especially when his actual statements seem to agree with various aspects of Kahneman and Tversky.

Gigerenzer's lauding of the value of intuition and rules of thumb is also problematic because he glosses over the idea that intuition must be trained into a person. Intuition is useful in situations where a person has ample opportunity for feedback, a repeatable environment, and lots of experience. Gigerenzer sort of hits this point, but then also lauds business intuition, which fails all of these criteria. In fact, studies of the very class of leaders whom Gigerenzer lauds for their intuition fail to demonstrate benefits significantly above the mean.

All in all, this book has something interesting to say, but the message gets undermined by overreach and gross mischaracterization of supposed opponents.
36 reviews, 6 followers
February 10, 2017
I live in Japan and work for a global company. Every six months or so I receive a notice from both the company and the ward office where I live, inviting me to the free breast cancer checkups that are held on a regular basis.
I never go, feeling that the earlier I find out about something like breast cancer, the less happily I could live the rest of my life. If the cancer is there, it is there; medical intervention would not do much for a serious illness like that, so why ruin your life living in fear and anxiety and painful chemo treatment if eventually you would die anyway?
After reading this book, I feel empowered: somehow, by using gut feeling, I was right! And it equipped me with scientific facts that support my belief, such as lead-time bias, the overdiagnosis problem, and other intentional and unintentional statistical misdirection. I am surprised that the majority of people, even in developed countries, are fooled so well by the existing market-oriented medical system. It should not be the way it is right now. I agree with the author that it is our civil right to be provided with objective facts and to be educated in risk literacy, so that we can make informed decisions about our lives. For my part, I am more determined to ask the doctors who treat my child about the grounds for their treatments. Doctors in Japan treat patients with tenderness but often with paternalism: they tell us what (the sickness) and how (how to cure it, what medicine to take), and, only if I ask, hardly ever why or what the alternatives are.
Brilliant and most useful book! Everyone should read it for their own benefit.
Akim McMath
2 reviews, 1 follower
January 12, 2015
H. G. Wells once predicted that “statistical thinking will one day be as necessary for effective citizenship as the ability to read and write.” Many decades later, we are as clueless about risk as ever—and at a heavy price. Ignorance about risk lies behind innumerable contemporary problems, from the rising cost of healthcare to the recent global financial crisis. In Risk Savvy, Gerd Gigerenzer seeks to change that. Not only does this book demonstrate how and why we fail to understand risk. It also provides the tools we need to move toward a risk-savvy society.

Here is a pop quiz. A 50-year-old woman receives a mammogram and tests positive for breast cancer, although she has no symptoms. The prevalence of breast cancer among women like her is 1 percent. The test is not always accurate: where breast cancer exists, the test will detect it 90 percent of the time; where there is no cancer, the test will come up positive 9 percent of the time. What is the probability that the woman has breast cancer?

If you said 1 in 10, you have done very well. Most people, including most doctors, vastly overestimate the probability. And it isn't just conditional probabilities such as these that throw people off. In the 1990s, many women stopped taking birth control pills because the pills increased their (relative) risk of thrombosis by 100 percent. If they had known that their absolute risk increased by only 1 in 7000, many unwanted pregnancies and abortions might have been avoided. Similarly, many men get screened for prostate cancer when they hear that those who get screened have a survival rate of 82 percent, while those who don't have a survival rate of 44 percent. If they knew that their chance of dying of prostate cancer (their mortality rate) was unaffected by screening, many men might be spared unnecessary procedures and their nasty side effects.
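A quick Python sketch of the arithmetic behind this distinction, using the thrombosis figures quoted above (the numbers are the review's illustration, not clinical data):

# Relative vs. absolute risk with the pill/thrombosis figures above.
baseline = 1 / 7000     # absolute risk of thrombosis without the pill
with_pill = 2 / 7000    # doubled, i.e. a "100 percent" relative increase

relative_increase = (with_pill - baseline) / baseline   # 1.0 -> +100%
absolute_increase = with_pill - baseline                # 1 in 7000 extra cases

print(f"relative increase: {relative_increase:.0%}")              # 100%
print(f"absolute increase: 1 in {round(1 / absolute_increase)}")  # 1 in 7000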

Perhaps the most serious mistake that people make is the failure to distinguish known risk from uncertainty. We are dealing with known risk when all of the possible outcomes are known and the probability of each outcome can be computed. Otherwise, we are dealing with uncertainty—a very different kind of thing. This is the kind of mistake that can lead to global financial crises. When bankers mistake uncertainty for known risk, they create the illusion of certainty—and they have the mathematical models to prove it. None of these models, however fancy, were able to avert catastrophe in 2008.

So far, the story seems bleak. Are we hopeless when it comes to risk? Do we need to call in the experts to make all our decisions for us, lest we destroy ourselves? No, says Gigerenzer. This book is not just about our failings in dealing with risk, but also about how we can overcome those failings. And many of the solutions are surprisingly simple.

One thing we can do is to present statistical information in natural frequencies rather than conditional probabilities. Here is the mammogram quiz again. For every 1000 women, 10 will have breast cancer. Out of those who have breast cancer, 9 will test positive. Out of those without breast cancer, 89 will test positive. What is the probability that a woman who tests positive has breast cancer? Now, the answer is clear. Out of 98 women who test positive, only 9 have breast cancer. Natural frequencies are just as informative as conditional probabilities, with the added benefit that they do most of the calculation for us. Even fourth-graders, when presented with natural frequencies, outperform doctors presented with conditional probabilities.
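A minimal Python sketch of the same calculation, using the quiz's numbers (1% prevalence, 90% detection rate, 9% false-positive rate):

# Natural frequencies for the mammogram quiz above.
population = 1000
with_cancer = round(population * 0.01)           # 10 women have cancer
without_cancer = population - with_cancer        # 990 do not

true_positives = round(with_cancer * 0.90)       # 9 positives with cancer
false_positives = round(without_cancer * 0.09)   # 89 positives without cancer

share = true_positives / (true_positives + false_positives)
print(f"{true_positives} of {true_positives + false_positives} positives "
      f"actually have cancer ({share:.0%})")     # 9 of 98, roughly 1 in 10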

Similarly, information should be presented in terms of absolute risk rather than relative risk, and in terms of mortality rates rather than survival rates. Relative risk and survival rates are typically useless, but most of us find them very convincing. The media regularly use relative risks to generate eye-catching headlines, while unscrupulous hospitals and drug companies use survival rates to inflate the efficacy of their products and services. Transparent framing of statistical information is one of the simplest ways to ensure a risk-savvy citizenry.

When it comes to uncertainty, we need an entirely different set of tools than those that are used for dealing with risk. We need to scrap the mathematical models and focus on developing good heuristics, or rules of thumb. Heuristics are often derided as crude approximations to careful consideration of all available information. This may be true under known risk. But under uncertainty, heuristics are often both faster and more accurate. Simple rules have been shown to outperform complex models in situations ranging from consumer behaviour to stock market investment. And successful executives are more likely to rely on rules of thumb than on complex mathematical models, even if they pretend otherwise. When it comes to uncertainty, less is often more.

Risk Savvy is a hugely important book. It shows where we routinely go wrong when it comes to dealing with risk, often with disastrous consequences. More importantly, it shows us how we can do better—a simpler task than it may first appear. And it does all this with masterful clarity and concision. Everyone—from investors, to patients, to educators, to policymakers—will benefit from this book.
RoWoSthlm
97 reviews, 22 followers
October 13, 2018
There are a number of good books about risk, and I would place Gigerenzer's "Risk Savvy" in this category. I must admit, I wasn't at all interested in it when it appeared on the bookshelves. I was thinking I wouldn't have time for yet another book on risk, but when I heard that he argues against Kahneman and Tversky, it caught my interest. It's a tall order to question the school of Kahneman and such a bestseller as "Thinking, Fast and Slow". However, it is always worthwhile to take a contrarian attitude and look at things from quite the opposite angle.

One of my favourite areas of exploration is the concept of risk. What should seemingly be an easy concept to grasp gets fuzzier the more you know about it, especially if you're looking into the quantitative aspects of risk. Quantitative risk management approaches require a very rigid understanding of the underlying concepts of risk and related terms; otherwise, models won't make much sense, and, usually, the risks we're chasing will materialize in totally unforeseen ways.

Gigerenzer's book serves well in stretching one's understanding of risk and probability. According to the common view, we humans are probability-blind and predictably irrational. The author provides some useful tools for dealing with risk and uncertainty, arguing that it is perfectly possible to overcome our seemingly hardwired cognitive biases. Three important angles on probability are discussed in the book. As I understand it, Gigerenzer proposes methods for training our fast-thinking system, equipping us with rules of thumb for many different life situations, so that we become effective heuristic beings in whom gut feeling plays a central role in decision making. This stands in contrast to fact- and data-based approaches, and the book gives some good arguments for why risk savviness is better.

Apart from the author's reasoning about gut heuristics, which is really interesting, the book has some other good material on risk, uncertainty, and probabilities. There are some very good examples from healthcare of how statistics are employed there to rob patients. Statistically enabled fear is a great weapon for the greedy schemes that seem to exist in many healthcare systems.

This book has massaged those sceptical parts of my brain so well that I will probably re-read it after a while.
Jeff
119 reviews, 31 followers
March 14, 2014
The version of the book I read had the following notation: "Advanced uncorrected proofs - not for sale." Of course I can only comment on that version, and not on the final copy.

I received this book for free through Goodreads First Reads.

This book is hard to rate objectively. Any assessment will be based at least partly on how much desire the reader has to wade through some fairly complex calculations. (In certain situations, Mr. Gigerenzer promotes the "simpler is better" theory, which can be a blessing to most of us laypeople. But I believe the book would read more easily if, in cases where the calculations are complex and require significant time to explain, he had given the basic premise and put the further explanations and calculations in an appendix.)

With that said, there is much to be commended. Of course, the proof is in the pudding: the reader will have to apply some of Mr. Gigerenzer's suggestions to see how they compare with the standard ways of doing things that we have been told are best. Mr. Gigerenzer speaks out against the status quo and proposes educating people about risk assessment starting in childhood. Some issues that are discussed:

Are statistics reported to show the real picture, or are media attention or personal gain factors?

Are experts always to be trusted in their assessments? What guides experts in their decisions? Can the average person be trusted to learn & understand risk assessment?

How does the information the public receives about medical, travel, and other matters affect their health and safety, positively or negatively?

How should we balance calculating risk assessments mathematically versus using our intuition when considering known or unknown risks? How does a person's experience and expertise guide them toward using calculations versus intuition? Do experts properly assess the level of certainty?

What underlies a gut feeling? Is it God's voice? Or something else?

Should we ask an expert what he would recommend in our situation? Or is there a better question?

What is the Monty Hall problem? Can it benefit us in day to day life?

Ways of making decisions among many options (e.g., "satisficing")

How does early screening for cancer, etc, impact our health, finances, & psyche? What are the benefits and downsides of early detection of diseases like cancer? Are the effectiveness vs. risks of treatments being properly explained? How do false positives and false negatives impact how doctors treat us and how we decide on how/when to be treated?

What's more accurate: Survival rates or mortality rates?

Can icon boxes help us and experts better understand risks?

Is absolute risk assessment or relative risk assessment a better tool to show the big picture?

How does fear of risk help or harm how the next generation learns?

Are there more deadly diseases, disasters, etc. today than in the past? Or does the way they're reported skew our perceptions?

My gut tells me that Mr. Gigerenzer is more right than wrong in his accusations and proposals. It may take some time for me and other readers to fully gauge how right (or wrong) he is. But one thing is for sure: his message needs to be heard and discussed, and no doubt it will be. Many experts will no doubt be up in arms over some of his suggestions, but maybe that's not a bad thing. The most important thing is that honorable motives, and clear and accurate statistics, are behind the information we are given.
728 reviews, 315 followers
September 6, 2016
"GET SCREENED NOW! Early detection saves lives. The 5-year survival rate for breast cancer when caught early is 98%. When it's not? 23%."

Actually, there's no evidence that screening for breast cancer, or any other type of cancer, saves lives. So is Komen lying about the statistics? No. But did you notice that they're talking about "5-year survival rate," not mortality? 5-year survival rate is meaningless because it's distorted by lead-time bias. Don't know what lead-time bias is? Read this book. It helps you sharpen your bullshit detector about health, finance, policy, romance, and other things.

We should pass a law that no one graduates from high school before reaching a minimum level of grasp on statistics and probability. It's depressing enough that the general public can't think clearly through concepts like risk and uncertainty, but it's a lot worse when experts and policy makers fail as well. Get this: Gigerenzer goes to a medical seminar and presents the audience (160 gynecologists) with this simple question: assume that the probability of a woman having breast cancer is 1%, and the probability of a false-positive test is 10%. If a woman has tested positive, what is the likelihood that she has cancer? You would expect trained doctors, making critical decisions for us, to know the answer, but most couldn't solve this simple problem. Most people think the answer is 90% because the probability of a false-positive test is 10%. The correct answer is that there is only about a 10% chance that she has cancer. Can't see why? Read the book.

Here's something less morbid. If the chance of rain on Saturday is 50% and on Sunday is 50% too, what is the chance of rain over the weekend?
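One way to compute that teaser, in Python, under the simplifying assumption that rain on the two days is independent:

# Chance of rain at some point over the weekend, assuming Saturday and
# Sunday are independent events (the question leaves this open).
p_sat = p_sun = 0.5
p_weekend = 1 - (1 - p_sat) * (1 - p_sun)   # 1 minus P(no rain either day)
print(f"{p_weekend:.0%}")                   # 75%, not 50% and not 100%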
د.أمجد الجنباز
Author of 3 books, 807 followers
December 5, 2020
The most important book that talks about statistics and explains their importance in a simple way,
clarifying risk and its relationship to probability and statistics and to everything around it.
It discusses the importance of probabilities in our daily and financial lives, and even when the subject turns to health matters.
What does a probability of rain mean?
What does the error rate of a surgery or a medical test mean?
Which is more dangerous: getting energy from coal or from nuclear reactors?

An extremely powerful book.
Thomas Edmund
1,085 reviews, 85 followers
August 23, 2020
Risk Savvy is an unusual but helpful tome on risk assessment in the modern world. Gigerenzer skewers a broad range of topics, from terrorism to cancer screening, all linked by an overall focus on making accurate, useful, and evidence-based decisions.

At the moment I have a slightly cynical perspective on my non-fiction, judging books heavily on whether they fit with the shitshow that is 2020, and Risk Savvy passes the test! In fact I think this book is quite useful reading for making sense of the current pandemic; not that it will provide you with all the answers, but it offers a sensible way of approaching thinking about risk.

The style of Risk Savvy is blunt but enjoyable - and focused on improving human wellbeing for all, a laudable goal!
Sameer Alshenawi
245 reviews, 22 followers
September 17, 2017
Eye opener.
It addresses risk vs. uncertainty, and how often we mistakenly think that risk is uncertainty and uncertainty is risk. It advocates the use of rules of thumb, or simple rules, instead of sophisticated risk management systems.
We need to educate people in health, financial, and digital risk literacy.

Highly recommended
Kenny Kidd
175 reviews, 7 followers
October 29, 2021
I am a boring middle-aged dad and I thoroughly enjoyed this non-fiction book about the various ways that we are risk illiterate and how we can correct that!

This is written in a really engaging, unpretentious, non-academic way, but it covers a lot of valuable ground: how we navigate uncertainty vs. how we navigate risk, and the proper ways to approach both (which are often simpler than we make them out to be), how group psychology often gets in the way of making the correct decision, and how even a cursory understanding of statistics can help anyone immensely in navigating the challenges of life. It sounds real boring, but it's not! At least I don't think it is 🤷‍♂️

I do think the author is a bit too optimistic in thinking that educating people to navigate risk more effectively will alleviate most harm in society, without paying enough attention to the systemic issues that make this kind of education insufficient, but that's not really the focus of the book, so it's not a huge detractor. Certain sections towards the end, on managing public health crises while a profit-seeking media inordinately perpetuates fear for gain, would almost CERTAINLY need to be rewritten post-COVID, but the points he makes are still cool and sound!

It also makes another nice addition to my “capitalism bad” bookshelf, with its comments on the health care and media systems, so that’s neat!
3 reviews
February 1, 2017
I am constantly afraid of making wrong decisions. I hate it when there is no way to tell if my choices are leading where I want them to. Therefore I often try to get even the smallest pieces of information before making my choice, carefully balancing out every detail.
...
Turns out that is not the right way to do things. More information does not always give more certainty, as Gigerenzer argues in his book using good examples, clear statistics, and simple math. In this ever more complex world, we must know the difference between risk and uncertainty. Since it is impossible to predict the future, the next best option is to spread knowledge and make people risk savvy. This would have a surprisingly large impact on everyday issues such as money investments, leadership, gambling, and love.

Moreover, after reading this book, I will think twice about what a doctor tells me, as even the most educated individuals are not immune to misinterpreting risk.
David Hammond
Author of 1 book, 12 followers
May 22, 2014

Essential information for anyone receiving medical treatment

An introduction to understanding risk, particularly in medicine. For example: a woman has a positive mammogram for cancer. What is the probability she actually has cancer? 8 in 10, 1 in 10, or 1 in 100?

The answer is 1 in 10. How many gynecologists answered this question correctly? Only 21 percent chose 1 in 10.

The author details how pharmaceutical companies and health providers exaggerate the benefits of their products and play on our fears in the name of profit. He also highlights the differences between the airline industry and the health industry. Airlines' mistakes are investigated and solutions implemented, while medical mistakes are buried (figuratively and literally).

Reading this book could save you from much pain and suffering.
Dr. Tobias Christian Fischer
707 reviews, 37 followers
February 26, 2021
Gut feeling: ever heard of it? Just listen to your gut. Sometimes it seems as if that is the book's main message.
163 reviews, 9 followers
March 24, 2015
This was a book that I at first thought was amusing but not fantastic, and then came around to thinking was pretty good overall. In short, Gerd Gigerenzer talks about how we often make decisions based on a misunderstanding of the data, and argues that we need to do a better job of being what he calls "risk literate."
In the first part, he talks about how people internalize a risk when a lot of things happen at once (like the 9/11 attacks), where there is a low probability of catastrophe, but don't always see the potentially worse risks that have a higher probability but lower severity (like individual car accidents, which are much more frequent but don't kill as many people in each event). People sometimes act suboptimally when faced with these types of tradeoffs. "If reason conflicts with a strong emotion, don't try to argue. Enlist a conflicting and stronger emotion." (12) We often overestimate the probability of catastrophe, sometimes with bad results. "How many miles would you have to drive by car until the risk of dying is the same as in a nonstop flight?...the best estimate is twelve miles." (13)
He then goes on to talk about the differences between uncertainty and risk. "In a world of known risk, everything, including the probabilities, is known for certain. Here, statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required." (24) We are often given a false sense of certainty by complicated models when they might not do a very good job of predicting anything. He mentions the rule of thumb that pilots can use to judge whether they can make a runway: "Fix your gaze on the tower: If the tower rises in your windshield, you won't make it." (28) He also describes the turkey illusion, where we base our perception of the future on risk (previous observations of being fed) and not uncertainty (someone "preparing" for Thanksgiving).
The next section discusses defensive decision making. They talk about how pilots use checklists, but surgeons don’t, often to the peril of their patients. “Defensive decision making: A person or group ranks option A as the best for the situation, but chooses an inferior option B to protect itself in case something goes wrong.” (56) People often use a rule of thumb like hiring a recognized company for a job rather than who they actually think is the best because they think it will be more defensible if something goes wrong. Some types of defensive medicine they mentioned include superfluous tests, more prescriptions than necessary, unnecessary referrals to specialists, and invasive procedures to confirm a diagnosis. (59) A remedy that he provided was: “Don’t ask your doctors what they recommend to you, ask them what they would do if it were their mother, brother, or child.” (63)
Another interesting section was on minding your money. He noted that the predictions of financial experts are not typically all that great. His alternative that he proposes is much simpler. “Allocate your money equally to each of N funds.” (93) He advocates this over a mean-variance optimized portfolio. “I have convinced myself that simple is better. But here’s my problem. How do I explain this to my customers? They might say, I can do that myself.” (95) This is true, but wouldn’t it be better to just advocate keeping 1/N as one of the options when picking the best model? I would argue against adding unnecessary complexity, but it seems like the advisor is just bad if they pick models on something other than performance adjusted for other important things (such as risk). He says that when we have “high uncertainty, many alternatives, and small amount of data” we should make it as simple as possible. Alternately, when we have “low uncertainty, few alternatives, and a high amount of data” we should make a model more complex. (96) He advocates allocating savings equally across stocks, bonds, and real estate. (105) That certainly is simple, but it likely is not the best risk profile for many people.
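A minimal sketch of the 1/N heuristic described in this paragraph; the asset names are made up for illustration:

# 1/N: allocate an amount equally across all N funds you hold.
def one_over_n(amount, funds):
    share = amount / len(funds)
    return {fund: share for fund in funds}

print(one_over_n(9000, ["stocks", "bonds", "real estate"]))
# {'stocks': 3000.0, 'bonds': 3000.0, 'real estate': 3000.0}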
The next part talked about gut decisions. I thought he had a little more faith in gut decisions than I do, but he does make some good points. "A gut feeling, or intuition, is a judgment (i) that appears quickly in consciousness, (ii) whose underlying reasons we are not fully aware of, yet (iii) is strong enough to act upon. Having a gut feeling means that one feels what one should do, without being able to explain why. We know more than we can tell. An intuition is neither caprice nor a sixth sense, but a form of unconscious intelligence. By definition, the person cannot know the reasons, and may invent some after the fact if you insist. To take intuition seriously means to respect the fact that it is a form of intelligence that a person cannot express in language. Don't ask for reasons if someone with a good track record has a bad gut feeling." (107) One good point he made is that there is often, and for good reason, a desire to accept evaluations only if a person has a list of reasons why. By definition, you can't explain a gut feeling. I think the important thing is to look at the track record of the person sharing their gut feeling; I have known people with very reliably good intuition, and others who are worse than useless.
He goes on to talk about natural frequencies and their superior ease of comprehension compared to conditional probabilities. This makes sense – there is no need to overcomplicate things. He also talks about the illusion of certainty, and how we think we have a better grasp of what is going on with slot machines, for example, than we really do. (136) “If you are highly proficient at a sport, don’t think too long about the next move. If you are a beginner, take your time in deciding what to do.” (137) This goes along with the gut feelings chapter, since you have muscle memory and intuition once you have repeated something many, many times, and the danger of “overthinking” something exists. When you don’t have much experience, though, you have not had enough encounters to develop a reliable gut feeling, and need to resort to other methods.
The next part talks about getting to the heart of romance. One idea I thought was interesting was to toss a coin to help with decisions and then not look at the result. Which result were you hoping for? (149) I also liked the discussion of why middle children get less total attention: if you spread your time equally (1/N), there is (typically) never a time when the middle child doesn't have to split your time with at least one other child, so N is on average larger for the middle child than for the oldest or youngest. (154)
I thought the parts on medicine were really interesting. One of the more discomforting things in the book was the discussion of the probability that a person has an illness given a positive test result, and how many doctors were way off when it came to this knowledge/understanding. Regarding breast cancer, "out of ten women who test positive in screening, one has cancer. The other nine women receive false alarms. Yet the 160 gynecologists' answers, monitored by an interactive voting system offering the four choices above, were all over the map…Only 21 percent of doctors would correctly inform women." (163) He again promoted the idea of looking at natural frequencies (that is, how frequently something occurs, rather than probabilities or conditional probabilities; saying something like 6 in 1,000 will die from a disease). Regarding Down syndrome, "only one out of every six or seven women with a positive result actually has a baby with Down syndrome." (172) I really liked the discussion of how doctors have what he calls SIC Syndrome. "Physicians don't do the best for their patients because they: 1. Practice defensive medicine (Self-defense) 2. Do not understand health statistics (Innumeracy) 3. Pursue profit instead of virtue (Conflicts of interest)." (178) He talks about how the benefits of many tests like MRI and CT scans are discussed with little mention of their drawbacks. These are ordered because of the S (if they did a test, they did all they could) and the C (financial rewards for running tests), not because it was necessarily in the best interest of the patient. These tests can even be harmful for the patient, as long-term radiation exposure has negative impacts on your health. I thought the discussion of mortality rate versus 5-year survival rate was interesting as well. He talks about two different kinds of bias present in diagnosis. Lead-time bias is present when diseases that take a long time to kill you are diagnosed much earlier (say, at age 60), resulting in better 5-year survival rates. If you wait until people are much sicker (say, at age 68), the 5-year survival rate might be lower, but it doesn't mean that a higher fraction of people lived past the age of 70 (a toy example follows below). Another type of bias is overdiagnosis bias. "Overdiagnosis happens when doctors detect abnormalities that will not cause symptoms or early death. For instance, a patient might correctly be diagnosed with cancer but because the cancer develops so slowly, the patient would never have noticed it in his lifetime." (188) If the same number of people die from a cancer (the numerator), but you dramatically increase the number of people who have some form of it (the denominator), the mortality rate can drop dramatically. As with X-rays, the biopsies done to better diagnose prostate cancer are not without harm. While some tests and screenings are very useful, it sounds like the PSA screening for prostate cancer is not nearly as reliable. One of the takeaways is that it's important to communicate how reliable a test is, and that not all tests are great. "In addition, almost half of the U.S. doctors falsely believed that detecting more cancers proves that lives are saved. Under the influence of their confusion, they would recommend screening to patients." (200) He advises looking at mortality rates rather than 10-year survival rates, doing your best to compare apples to apples. He then advocates fighting cancer (and other diseases) with prevention, not screening.
This is obvious; it's better to be proactive than reactive. But prevention is something I should have been doing for a long time, whereas screening is something I can do while still not adhering to a strict diet or an otherwise healthy lifestyle. While we spend a ton of money on research into drugs to cure cancer, he suggests the money would be better spent on education and the promotion of healthy lifestyles, since somewhere between 40 and 50 percent of cancer is essentially due to either smoking or obesity/diet/lack of exercise. (221)
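A toy Python illustration of the lead-time bias described above; the ages are invented to make the effect visible:

# Lead-time bias: earlier diagnosis inflates 5-year survival even when
# nobody lives a day longer. All ages below are invented for illustration.
death_age = 70
late_diagnosis = 68      # diagnosed from symptoms
early_diagnosis = 60     # diagnosed by screening

def survives_5_years(diagnosis_age):
    return death_age - diagnosis_age >= 5

print(survives_5_years(late_diagnosis))    # False -> 0% five-year survival
print(survives_5_years(early_diagnosis))   # True  -> 100% five-year survival
# Survival "improves" dramatically, yet mortality is unchanged: the same
# people die at the same age. Mortality is the honest metric.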
He concludes by talking about how it’s important for people to be comfortable with numbers, understanding risks, and that natural frequencies can help with this.
I thought this was a decent book. He is a little more into rules of thumb than I am, but the general message of only adding complexity when it substantially adds benefits is a good one. I would have preferred a more general "simple is better" emphasis to his specific emphasis on gut feelings, but he still makes some good points. The discussion of how much the people you typically encounter know about risk and statistics is good as well. Given that I have a decent background in math and statistics, it's beneficial to know what other people are and are not familiar with. I also liked his emphasis on properly accounting for the negative impacts of tests. More generally, what are the costs of doing too much testing and delaying decisions? We often frame things as "is this good or bad" (narrow framing), when we should instead look at them in the context of all available options.
Overall this was pretty good, especially in how it made you question the analysis you are given, and be wary of creative liberties with statistics.
Emma Veitch
26 reviews
July 2, 2017
Somewhere between 3 and 4 stars. I did really, really enjoy this book, but it wasn't quite the all-encompassing, mind-exploding stream-of-consciousness brilliance that I hoped it would be. I think that the author (GG) is absolutely bang-on about pretty much everything, but a few little bits and pieces got lost in translation. Perhaps it was too "popular" for me, and I might have appreciated more detail, more depth on some of the threads he explores. I also felt the book was overall a bit "bitty". There is a focus on heuristics and his idea that "fast thinking" (rules of thumb) can be as good as "slow thinking" (formal decision tools). There is a bit about giving people health information: that everyone can learn to understand health risks/statistics, and what it means to convey information about health clearly and accurately. That was the part I really "wanted" and was most drawn into. There were examples ranging from banking to mad cow disease, swine flu, aeroplane safety, driving, whatever, but this all contributed, in places, to a lack of a "central thesis". My biggest criticism is that there really are two "central theses", and they didn't quite hang together; for example, I didn't really grasp why the "heuristic" idea feeds into, or interplays with, the material on communicating statistics in health. Do these form part of a single worldview about making good decisions? I wasn't really sure. Thinking slow could be just as compatible with the communication of absolute risks as rules of thumb are. I wanted GG to explain this to me in much more depth. I would have preferred a single entire book on the communication aspects, with more depth on the principles and evidence for them. Things I loved:
- The graphics and examples
- The compelling and clear explanations
- The empowering nature of his proposals - everyone can understand this, even school children who haven't yet been taught fractions
- The generality of application of some of the ideas
Overall, I felt that everyone who cares about how health information should be communicated should read this book, but I would like to follow it up with some of GG's papers, and perhaps (to read the other side of the story) some of the "nudge" literature (Kahneman etc.) and maybe Nassim Taleb (which I've been putting off for a while).
Interesting to read this in the week that Spiegelhalter said "A sloppy attitude towards statistics has led to exaggerated and unjustified claims becoming commonplace in science" (in a public lecture) - https://www.theguardian.com/science/2.... I get the sense GG has been saying this stuff for decades, nearly, but it's taken a long time for formal explications of risk to be set out for the public in a sensible way. Here's to more of that.
8 reviews
March 12, 2021
Gerd Gigerenzer makes a compelling case for why we collectively, as a society, need to increase our risk literacy in order to liberate ourselves from the overregulation of both government and industry (when they are the ones imposing it, pro-free-market companies suddenly loooove regulation; who would have thought?). The extreme paternalism and the way-too-glorified concept of "nudging" practiced in countless fields today are more damaging than we think: they impede our ability to think independently and decrease our chances of becoming rational, self-sustaining citizens.

For too long, the prevailing view among experts and policy officials alike has been that regular people do not have the "brains" to make their own decisions, and that they have to be guided to the correct answer. Instruct rather than inform: that has been the message! Does instructing over informing seem like a policy that fosters free and independent citizens? Yeah, right! An instructed citizen is as free as a draft horse: seemingly free to roam where it pleases, yet all the while its range of vision is constricted, its direction nudged by an invisible puppet-master.

As Gigerenzer correctly points out, dud regulatory measures are produced not so much out of malevolence (although sometimes that is the case) as from the fact that experts and practitioners themselves do not have a comprehensive understanding of risks, or do not know how to communicate risk properly. Better to just insulate us and everybody else from risk in that case, right? Not so fast. Risk is vital. It is a prerequisite to life and all activity. Trying to compress risk into well-defined, neat little boxes, and believing that one has found a way to avoid them all, is more often than not a fool's paradise, creating an illusion of certainty that only manages to make us oblivious to the real, hidden risks.

What is one to do then? Gigerenzer argues for rules-of-thumb, or heuristics, that everyday people can incorporate to live healthier, wealthier, and more fulfilling lives. These mental models help people reclaim their own destiny, instead of leaving it in the hands of companies and government officials. It is time to become risk-savvy!
113 reviews, 51 followers
June 19, 2021
A lot of the examples come from the medical world, which makes this good and important reading for medical doctors. Simple math and statistics (which appear very complex to most people) are explained in a very lucid way, especially regarding the collection and analysis of data and the presentation of results. These are exploited by many business organizations and misunderstood by many government organizations. The resulting decisions and their implementation cost taxpayers heavily and, in some cases, cost the lives of thousands of citizens too.
One of the sentences in this book that made me think a lot is: "Complex problems do not always require complex solutions." There are many more great one-liners inside.
Highly recommended for anyone who thinks s/he is afraid of statistics. This book will remove your fear and enable you to grasp statistics and apply them practically for your benefit. For more, I would also recommend the following books:
Statistics done wrong: The woefully complete guide by A Reinhart.
Weapons of math destruction: How big data increases inequality and threatens democracy by C O'Neil.
Innumeracy: Mathematical illiteracy and its consequences by J A Paulos.
How not to be wrong: The power of mathematical thinking by J Ellenberg.
I enjoyed each of these books!
3 reviews
July 20, 2018
A really interesting topic and approach. Probabilities and risks are handled naturally through stories.

For me, the best part of the book was its appreciation of intuition and its emphasis on rules of thumb. There was too much healthcare-related material for my taste (although that is natural given the author's background), but otherwise the book held my attention from start to finish.
Michelle
280 reviews, 19 followers
April 1, 2018
Good overview of common-sense decision-making and risk-taking principles. Chapter 6, about leadership and intuition, was valuable for understanding some of the crazy decisions I've seen taken in my career. I suggest skimming this and homing in on the chapters of interest to you.
Anne (ReadEatGameRepeat)
854 reviews, 79 followers
August 27, 2020
I think this was an interesting read, but since I read his other book, Reckoning with Risk, earlier this year, I didn't get as much from it as I thought I would. The author recycles a lot of his points and facts, so it didn't bring as many new things as I expected.
278 reviews
August 29, 2021
As good as I hoped.
I had read some of his academic papers, and his more mainstream-oriented writing did not disappoint.
A couple of oldies (mammography results, money mystique), but definitely at least a dozen nice heuristics, which was his main point: not all progress will come from throwing more data and computing power at our problems. We need risk-savvy humans as well: know when to trust your gut.
Recommended.
Sarah Thornton
773 reviews, 10 followers
August 15, 2023
A little limited in scope: it does explain statistics well, but only as they relate to cancer, health, terrorism, and finance.
Lisa Osada
Author of 1 book, 22 followers
October 17, 2024
Today it's about the book "Risiko" by Prof. Dr. Gerd Gigerenzer. Although it first appeared in 2014, it has lost none of its relevance; on the contrary, even in 2022 it remains a highly topical work on the art of assessing risk and making decisions. The renowned psychologist and risk researcher Gigerenzer guides his readers vividly through the world of statistical fallacies and encourages independent, critical thinking.

This review refers to the 3rd edition of the title published by Pantheon Verlag (first issued in 2020).

Content and structure
The book is divided into three thematic parts:

The psychology of risk: Gigerenzer analyzes the human tendency to misjudge risks and describes why we so often let supposed expert opinions or statistics lead us astray.
Becoming risk literate: In this part of the book the author provides helpful tools and ways of thinking for classifying and evaluating risks. One focus is on understanding the difference between relative and absolute risks.
Start young: This section concentrates on teaching risk literacy to young people and shows how we can learn early on to deal with uncertainty.

From weather reports that give probabilities of rain to complex medical misjudgments in the healthcare sector, Gigerenzer leads his readers through everyday and challenging situations in which we often make unconscious errors of risk assessment. The author's aim is to enable us to interpret claims and probabilities correctly so that we can make better decisions.
Strengths of the book
One of the book's greatest strengths is Gigerenzer's ability to explain complex scientific concepts in clear, understandable language. He uses vivid examples that everyone knows from daily life to show how we often misjudge risks without realizing it. Particularly well done is his section on the difference between relative and absolute risks, a topic that is frequently misunderstood in public debate.

Beyond that, the book has a strong enlightening effect. It encourages the reader to think critically, to question statistics, and not to rely blindly on expert opinion. Gigerenzer conveys not only theoretical knowledge but also concrete recommendations for acting with risk competence in everyday life.

Critical remarks
Despite the many strengths, some aspects invite criticism. The focus on the healthcare sector is at times very heavy, which may sit less well with readers interested in other areas of risk management. Some repetition, and the emphasis on the weaknesses of medical risk communication, may also come across as somewhat one-sided.

Who should read this book?
"Risiko" is an excellent book for anyone who wants to learn to assess risks better and make well-founded decisions, whether in everyday life or on the job. It is especially useful for people who regularly deal with statistical data and probabilities, for example in healthcare, business, or finance. Readers interested in psychology and decision research will also find valuable insights.

Verdict
★★★★☆
An instructive and very readable book that encourages readers to strengthen their own risk literacy. Gigerenzer delivers practical, scientifically grounded approaches for making better decisions in an ever more complex world. Despite some repetition, the strong focus on the healthcare sector, and a recurring insistence on the "general inability" of doctors to understand and communicate potential risks, the book offers valuable food for thought and helpful strategies for dealing with uncertainty. It opened my eyes in places, and I liked it very much. I rate it BBB+, which corresponds to 4 stars here.

35 reviews
July 5, 2025
Many very interesting examples, but somewhat long-winded in places.
Benjamin Williams
59 reviews
November 29, 2020
I'd give this book a 3.5 if I could. First third: a great read, with info I found highly applicable. Second third: put the definitions to use and made them understandable. Last third: I think he has a vendetta against the healthcare industry and sets out to discredit many screenings, preventive medicines, and procedures. Not bad stuff, it just seemed pretty biased.
Alex
94 reviews
March 17, 2024
I approached the book expecting that it would provide me with strategies for finding answers to difficult life questions.

You won't find these strategies in the book.
I also found it difficult to read, as the author's approach and the mindset behind it did not match mine. Many things are explained and pointed out in a know-it-all manner, from on high.

Therefore, it is definitely not a must-read, unless you want to hear that:
*logical thinking is advisable in most life situations
*everything about medicine in the USA is bad
*nobody has a clue about risk and risk assessment
*everyone (especially insurance companies and hospitals) is evil and just wants your money
35 reviews, 1 follower
July 15, 2017
This is a very good book about common misconceptions in understanding the risks of everyday life, as well as about decision theory.
People are persuaded to undergo medical screenings without any additional benefit because the underlying probabilities of the studies are misunderstood, even by doctors. The finance industry consistently gives wrong projections of future market prices; nevertheless, many people entrust it with their money. Gigerenzer shows that having no screening at all, or investing your money yourself by applying simple rules such as "1/N", is usually the better solution.
But this book is not only about the misinterpretation of probabilities; it also shows the limits of probability theory by introducing the distinction between risk and uncertainty (at least for me, the distinction was new). When dealing with risk, one can usually calculate probabilities because all relevant factors are known, while in situations of uncertainty it is better to apply not probability theory but rules of thumb, and to trust your intuition.
Furthermore, Gigerenzer investigates error cultures in two different environments (pilots vs. doctors), where the latter have no error culture, which leads to defensive decisions (self-protection). Gigerenzer brings this up in order to explain why doctors prefer to send patients to screenings that might not be necessary at all. It is also something I have observed myself in many (big) organizations, and it is, in my opinion, a major source of bad decisions and operational risk.

To summarize, the book has a clear focus on examples from medicine (probably due to Gerd Gigerenzer's own background). Hence, the financial market side is covered only a little, although I am convinced that one would find equally many cases where probabilities or statistics are misused when calculating risks. One can see, however, that Gigerenzer's background is not in finance: he criticizes banks' high leverage ratios and the banks' argument that raising capital is too expensive, yet he seems to think that raising equity is without cost (equity actually usually costs more than debt) when he recommends it to banks instead of taking on debt. Nevertheless, I think his diagnosis that leverage ratios are too high is correct; only the solution is too simplified. This is only a small side note and does not much affect the content of the book as a whole. If one is more interested in the misuse of probability theory in the financial industry, Nassim Taleb's books can be recommended as additional literature.

Overall, I can recommend this book to really anyone, because every one of us has to make decisions whose outcomes are uncertain, and this book delivers some advice on how to deal with such situations and how to find quick, good solutions.

224 reviews, 3 followers
November 20, 2022
After seeing Steven Pinker cite many papers authored by Gerd Gigerenzer in Rationality: What It Is, Why It Seems Scarce, Why It Matters, I picked up this book. I am glad that I finished it, because Gigerenzer writes from an angle quite different from the one shared by Kahneman, Tversky, and other behavioral economists. It is best to use Gigerenzer's own words:

It is sometimes said that for Kahneman, the glass of rationality is half-empty, and for Gigerenzer, the glass is half-full. One is a pessimist, the other an optimist. That characterization misses the point. We differ in what the glass of rationality is in the first place. Kahneman and followers take logic or probability theory as a general, "content-blind" norm of rationality. In their thinking, heuristics can never be more accurate, only faster. That, however, is true only in a world of known risk. In an uncertain world, simple heuristics often can do better. The real research question is to understand why and when. The answers we know today are based on the bias-variance dilemma (chapter 5; Gigerenzer and Brighton 2009) and the general study of ecological rationality (Todd, Gigerenzer, and the ABC Research Group 2012).


The book carries three main messages. First, heuristics and rules of thumb work, especially when uncertainty looms large, that is, when we lack quantifiable knowledge about some possible occurrence, the "unknown unknowns." Everyone who has worked on prediction problems understands that when volatility is high, a simple model such as linear regression can beat a complex state-of-the-art machine learning model. But people often underestimate the magnitude of uncertainty in many real-life situations. Hence, they also underestimate the effectiveness of simple rules in those situations. For example, a computer simulation shows that, for the famous mean-variance portfolio management strategy to beat the simple 1/N rule (split investment equally across all N opportunities), the former needs 500 years of data for model training. By the way, Harry Markowitz, who received the Nobel prize in economics for developing the mean-variance strategy, stuck to the 1/N rule to manage his personal assets. Thus, heuristics are not always "quick and dirty" shortcuts at the cost of quality, but valuable tools to help us navigate through the ocean of unknowns. Instead of constantly being on guard against our system 1, we should let it do its work sometimes.

Gigerenzer discusses several examples to show that heuristics work. Some of these examples support his claim quite well, while others are more dubious. The good examples include how baseball players use the gaze heuristic to catch fly balls, and how the pilots of US Airways Flight 1549, after losing both engines, used a similar heuristic to avoid collision with NYC skyscrapers and landed on the Hudson River. Gigerenzer's example of most business executives frequently relying on gut feelings to make decisions, however, does not show the effectiveness of heuristics. How do we know these executives made good decisions? As this blog post (https://j-dm.org/archives/2331) commented, "Rules of thumb are often like aphorisms-there always is a counter position to each." Thus, knowing when and where those rules work is crucial.
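For concreteness, here is a rough sketch of the gaze heuristic under simple assumptions (the feedback gain and the update rule are invented for illustration; the heuristic itself is just "keep the gaze angle constant"):

# Gaze heuristic sketch: adjust running speed so that the angle of gaze
# to the ball stays constant. The gain k is an invented illustration.
def adjust_speed(speed, angle_now, angle_before, k=0.5):
    # Gaze angle falling -> the ball is getting ahead of you: speed up.
    # Gaze angle rising  -> you are overrunning the ball: slow down.
    return speed + k * (angle_before - angle_now)

print(adjust_speed(5.0, angle_now=42.0, angle_before=45.0))   # 6.5, speed up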

Gigerenzer's second message is that we can become more risk savvy by understanding the psychology behind risk-taking and risk-aversion. For example, we tend to fear and avoid "dread risks," situations in which many people die together horribly, such as a plane crash, a terrorist attack, a school shooting, or the spread of a novel disease. This tendency was evolutionarily rational for people living in small bands of 20-50, because the sudden loss of many band members would severely reduce everyone's chances of survival. In modern society, however, our fear of dread risks leads to overreaction. Gigerenzer gives many examples of such overreaction: the US's costly war on terror, the EU's ban on British beef over mad cow disease, and the widely shared phobia of nuclear power, even though it causes far fewer deaths than burning coal.

Gigerenzer also points out that we often learn to fear what our peers fear, resulting in a patchwork of risks taken and avoided. For example, US families frown upon the German practice of burning candles on Christmas trees, yet many of them are willing to keep a gun at home. He laments that it took the European Union ten years to pass regulations making scented lamp oil child-proof, even though the scent enticed children to drink it and caused about 150 deaths over those ten years. Far more attention was given to mad cow disease, which caused about the same number of deaths in the same period.

Third, people are not doomed to be irrational about risks. They can make better decisions about risk if they receive more education in statistical thinking and if professionals communicate risk information more clearly. Gigerenzer shows that there is much to be done on both fronts. For example, many doctors do not communicate the risks of procedures to patients, and they fail to help patients interpret positive test results with Bayesian reasoning.
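
The Bayesian step that gets skipped is easy to make concrete with the natural frequencies Gigerenzer advocates. Here is a minimal sketch with illustrative numbers of my own choosing (1% base rate, 90% sensitivity, 9% false positive rate):

```python
# Natural-frequencies reading of a positive test, with illustrative numbers:
# out of 1000 people, 10 have the disease (1% base rate); the test catches
# 9 of them (90% sensitivity) and falsely flags about 89 of the 990 healthy
# people (9% false positive rate).
def positive_predictive_value(base_rate, sensitivity, false_positive_rate,
                              population=1_000):
    sick = population * base_rate
    true_positives = sick * sensitivity
    false_positives = (population - sick) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# "My test came back positive -- how likely is it that I am actually sick?"
print(positive_predictive_value(0.01, 0.90, 0.09))  # ~0.092, i.e. about 9%
```

Despite the positive result, only about 9 of the roughly 98 positives are true positives, the counterintuitive answer patients rarely hear.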

Specifically, to communicate risks better, Gigerenzer suggests: 1) state the reference class clearly; 2) use a "fact box," which presents risks as frequencies instead of probabilities and shows risk reduction in absolute instead of relative terms; and 3) be explicit about remaining uncertainty. On the last point, he cites research showing that statements like "there is a theoretical risk" are often misunderstood. Instead, governments should learn from the following statement from the UK Food Standards Agency:
"There could be a risk of BSE in sheep. We do not think there is and we are not advising you to stop eating sheep meat, but if you are worried here is what you could do…. We are working on tests to find out whether or not there is a risk, and we will get back to you as soon as we have the results."

Besides the main messages, Gigerenzer also comments extensively on medicine and banking. One thing that piqued my interest is his opposition to screening for many cancers. His argument is that such screening often leads to 1) worry and unnecessary treatment due to false positives, 2) unnecessary treatment of non-progressive cancers (spiral CT detects as many lung cancers in nonsmokers as in smokers), and 3) additional cancers caused by the radiation from X-rays and CT scans themselves. Screening does lead to higher survival rates over a fixed time period, but this increase is inflated by lead-time bias and overdiagnosis. After replacing survival rate with the correct metric, total mortality, the benefit of screening is much smaller and is dominated by the costs mentioned above. He notes that the US Preventive Services Task Force explicitly recommends against screening for cancers of the prostate, lungs, pancreas, ovaries, bladder, and thyroid. I think Gigerenzer's points make sense, but since I know so little about this topic, I would like to read more studies on screening.
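
Lead-time bias itself is easy to see in a toy simulation of my own construction (not from the book): every patient dies at the same age whether screened or not, and screening does nothing except move the diagnosis four years earlier. Five-year survival soars while mortality stays exactly the same.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: the age of death is fixed per patient; screening only shifts
# the moment of diagnosis and saves no one.
n = 100_000
age_at_death = rng.normal(70, 5, n)
age_dx_symptoms = age_at_death - rng.uniform(1, 4, n)  # found via symptoms
age_dx_screening = age_dx_symptoms - 4                 # found 4 years earlier

def five_year_survival(age_at_diagnosis):
    """Fraction of patients still alive five years after diagnosis."""
    return np.mean(age_at_death - age_at_diagnosis > 5)

print("5-year survival, symptom diagnosis:  ",
      five_year_survival(age_dx_symptoms))    # 0.0
print("5-year survival, screening diagnosis:",
      five_year_survival(age_dx_screening))   # ~1.0
# The same people die at the same ages in both groups, so total mortality
# is unchanged; the impressive survival gain reflects no benefit at all.
```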

Although the book conveys important messages, it is not very well written. Some statements are not rigorous, the chapters on management and romance are weak, and the book lacks a logical structure: not until page 246 did I find the author's summary of its main messages. But if you have only read Kahneman, you've got to read Gigerenzer.
Profile Image for Jeff Whitlock.
210 reviews · 5 followers
June 26, 2018
I love the main point of this book: that an answer to building a better modern society does not lie in paternalistic governments that treat people as "stupid," but in helping people make better decisions through understanding risk and uncertainty.

I love Gigerenzer's vision of educating people in health, financial, and digital risk literacy.

While I understood that people are generally "risk unsavvy" (e.g., people won't let their children swim in the ocean for fear of sharks but will let their 16-year-old child drive), Gigerenzer formalized my understanding of why this is the case:

- Zero-risk bias (certainty bias): people treat a world of risk as a world of certainty (e.g., DNA tests)
- Turkey bias: people treat uncertain worlds as worlds of calculable risk (e.g., value-at-risk models in finance)
- Dread risks: people are biologically wired to fear events in which many people die at once (e.g., a terrorist attack) more than deaths that occur in a slow, steady manner (e.g., car accidents)
- Risk reference class misunderstanding: people often do not understand which class a stated risk refers to (e.g., when the meteorologist says, "there is a 45% chance of rain tomorrow," people tend not to know what exactly this means: that it will rain during 45% of the hours of the day, that it will rain on 45% of the region in question, that 45% of meteorologists think it will rain tomorrow, or that it will rain on 45% of the days for which this prediction is made)
- Relative vs. absolute risk confusion: people tend to be swayed by relative risk changes even when the absolute change is inconsequential. For example, early breast cancer screening decreases the risk of dying of breast cancer by 20%. This sounds impressive, but the absolute risk reduction of early vs. no early screening is 1 in 1000, and the screened group had 100 false alarms and 5 women who unnecessarily underwent complete or partial breast removal (see the sketch after this list).
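
A fact-box-style rendering of those screening numbers makes the contrast plain (a sketch of mine; the baseline of 5 deaths per 1000 is not stated above but is implied by the quoted 20% relative and 1-in-1000 absolute figures):

```python
# Gigerenzer-style "fact box" for early breast cancer screening, per 1000
# women in each group, using the figures quoted in the list above.
deaths_without_screening = 5    # implied baseline: 20% of 5 equals 1
deaths_with_screening = 4       # one fewer death per 1000 screened
false_alarms = 100              # screened women with a false positive
unnecessary_surgeries = 5       # unneeded complete or partial breast removal

relative_reduction = ((deaths_without_screening - deaths_with_screening)
                      / deaths_without_screening)
absolute_reduction = deaths_without_screening - deaths_with_screening

print(f"Relative risk reduction: {relative_reduction:.0%}")      # 20%
print(f"Absolute risk reduction: {absolute_reduction} in 1000")  # 1 in 1000
print(f"Harms per 1000 screened: {false_alarms} false alarms, "
      f"{unnecessary_surgeries} unnecessary surgeries")
```

The same data sound dramatic as a relative change and modest as absolute frequencies, which is exactly why fact boxes use the latter.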

My main criticism is that the structure of the book feels a bit random and scattershot at times, particularly in the examples; it would have been better with more focus. I also would have appreciated more support for his position that simple heuristics perform better in an uncertain world. I intuitively agree with this point, but I would have liked more argument for it and more exposition of why it is the case.

Overall, I think this book is well worth the read.