
Nobody's Fool: Why We Get Taken In and What We Can Do about It

From two New York Times-bestselling psychologists, “an engaging master class in how to foil purveyors of false promises” (Philip E. Tetlock, author of Superforecasting)
 
From phishing scams to Ponzi schemes, fraudulent science to fake art, chess cheaters to crypto hucksters, and marketers to magicians, our world brims with deception. In Nobody’s Fool, psychologists Daniel Simons and Christopher Chabris show us how to avoid being taken in. They describe the key habits of thinking and reasoning that serve us well most of the time but make us vulnerable—like our tendency to accept what we see, stick to our commitments, and overvalue precision and consistency. Each chapter illustrates their new take on the science of deception, describing scams you’ve never heard of and shedding new light on some you have. Simons and Chabris provide memorable maxims and practical tools you can use to spot deception before it’s too late. 
 
Informative, illuminating, and entertaining, Nobody’s Fool will protect us from charlatans in all their forms—and delight us along the way.  

374 pages, Kindle Edition

First published January 1, 2023


About the author

Daniel Simons

7 books · 29 followers
Dan Simons is a professor in the Department of Psychology and the Beckman Institute at the University of Illinois. He earned his BA in psychology and cognitive science from Carleton College and his PhD in experimental psychology from Cornell University. He then spent five years on the faculty at Harvard University before moving to Illinois in 2002.

Simons's scholarly research focuses on the limits of human perception, memory, and awareness, and he is best known for his research showing that people are far less aware of their visual surroundings than they think. His studies and demonstrations have been exhibited in more than a dozen science museums worldwide.

In his spare time, he enjoys juggling, bridge, and chess.

Ratings & Reviews


Community Reviews

5 stars: 96 (23%)
4 stars: 187 (45%)
3 stars: 114 (27%)
2 stars: 13 (3%)
1 star: 1 (<1%)
8stitches 9lives
2,853 reviews · 1,724 followers
July 27, 2023
Nobody's Fool is a fascinating and informative look at the art of deception and the multitude of scams that increase year on year and show no sign of slowing. We humans have an innate "truth bias": we believe what other people say unless they give us a reason not to, and this is what catches out those who fall for deception. How did Bernie Madoff get away with a $65 billion Ponzi scheme for 15 years? How did Diederik Stapel publish 58 fake social psychology studies? Is it possible for an amateur to cheat their way to a grandmaster title in chess? And why does anyone still answer emails from a “Nigerian prince”?

Fraud, cheating, and scams of various kinds often seem obvious in retrospect, yet people fall for the same kinds of tricks over and over. This book discusses the psychology of how we get taken in, including some cognitive habits that render us vulnerable to deception, as well as several hooks fraudsters use to attract our interest and trust. It includes plenty of analysis and a plethora of examples drawn from cases in a variety of fields, as well as lessons for how to spot and avoid deception—and how to apply insights from psychological science to everyday life. This is a riveting, enlightening and lucid book full of anecdotal evidence and case studies from two of the brightest minds in cognitive science. Highly recommended.
Chris Boutté
Author · 8 books, 278 followers
July 18, 2023
I don’t know why you’re reading my review when you should be reading this book. It’s so damn good. This is by two cognitive psychologists, Daniel Simons and Christopher Chabris, who worked on the famous invisible gorilla study. I read a lot of books on this topic, and I thought this was just going to be another book about how people fall for misinformation. I was wrong, and I loved every second of it.

The book starts out with the psychology behind why we get taken, and a lot of it has to do with how we default to assuming people tell the truth. Throughout the book, there’s a ton of research that I hadn’t heard about before along with stories of people getting scammed and fooled. While the book does discuss misinformation and other common scams, the book actually takes a ton of shots at academic research. The authors have a ton of criticisms of bad studies, and I loved it so much.

Right now, we’re in a tricky time where if you question science, you’re likely to be called a science denier. But this book goes over so many different ways researchers have manipulated data and how it spreads through the academic community before anyone catches it, and this is extremely important for people to be aware of.

I can talk about this book all day, but for now, just go get a copy and thank me later.
Laura
803 reviews · 46 followers
December 24, 2023
Because I've already read a few books about our tendency to "default to truth," Ponzi schemes, and reports on scientific fraud, the book didn't bring much new information for me. It was fairly well explained, however, and the authors were smart to end the book with a discussion of when it is worth the time and money to engage in additional scrutiny. Still, the examples offered were sometimes given short shrift, and the authors stated some facts that...(ahem)...are not based on truth. The chapter about familiarity discusses at one point the use of endorsement blurbs to sell books. As most users of the GR platform and other online book communities will tell you, blurbs rarely entice a reader (some bookish content creators have run polls about this, and the majority of thousands of respondents replied "no" to blurbs enticing them to buy a book). What most readers look for in a book is a good synopsis, an interesting/attractive cover, and (perhaps this is where the confusion came from) a trusted influencer or friend recommending the book. This error only emphasizes the authors' earlier point: be careful about what assumptions you hold, and investigate them.
94 reviews
August 7, 2023
This was a fun book, but it didn't deliver as fully as promised.

The information was well organized and ordered, and the book offered genuine insight into how different methods of deception take advantage of various aspects of our normal assumptions and behaviors without boring the reader with excessive scientific detail.

The examples were fun to read, but the authors relied too heavily on deception discovered in scientific papers and corporate dealings. There was some (not enough) discussion of journalistic shenanigans, but surprisingly little about scams and swindles. For this reason, the book failed to reach a threshold of relevance for most everyday readers.

Finally, I'd note that the authors let their political bias bleed through by consistently selecting examples of word, anecdote, and number manipulation from one side of the political aisle without also highlighting similar behaviors on the other side.
Chipes
8 reviews
August 23, 2023
I loved this book and want to recommend it to everybody. I think it's important to keep these things in mind, and was especially chagrined to learn that Kahneman made some mistakes in Thinking, Fast and Slow.

However, I am extremely disappointed by the authors' repeated citations of Wikipedia. The book does have tons of citations (the Notes section takes up pages 249 to 312), most of which are to general-trade or academic books, newspapers, magazines, or academic papers, so the vast majority of the book is cited well. But Wikipedia isn't a source. It's an encyclopedia. Either it has a source, in which case you should cite that source, or it doesn't, in which case the information is untrustworthy garbage. Also, Wikipedia changes! See my complaint about Ch. 6, Note 26.

Before I list some Wikipedia citations, I will discuss something else I found worrying in the Notes: Chapter 1, p. 22, paragraph 3 includes the sentence "Second, like many 'psychic' performers, Edward likely gathers information about some of his audience members in advance or plants a few stooges in the seats." This paragraph cites Ch. 1, Note 3, which in its entirety reads:
Mentalists using stooges: J. Hitt, "Inside the Secret Sting Operations to Expose Celebrity Psychics," New York Times Magazine, February 26, 2019 [NYTimes URL].

I read the article. The article doesn't mention stooges or plants at all; it's about purported psychics doing social-media research on their ticket-purchasers ahead of the show. So the authors had a citation for one half of the sentence, but gave it to the other half. How many of their citations are this sloppy? I can't verify them all.

The first time I bothered to follow up on a Wikipedia URL was to view the famous survivorship-bias plane diagram, which, along with the Possibility Grid, I think they should have just printed in the book:

- Chapter 1, Note 9 starts by citing a book but then continues:
The details of the Wald story, along with the iconic plane image, are described well in the Wikipedia article "Abraham Wald" [Wikipedia URL]; details about Black Thursday can also be found at "Boeing B-17 Flying Fortress," Wikipedia [Wikipedia URL].


- Chapter 2, Note 3 starts by citing an expert's personal website, with which I have no problem, but then continues
Australian Desktop magazine came to the same conclusion in 2004; see "Killian Documents Authenticity Issues," Wikipedia [Wikipedia URL]

then continues with more legitimate citations, not including Desktop magazine. Why didn't they just cite Desktop magazine? (Oh: As of August 23, 2023, the above Wikipedia article does not have a reference for the only line containing the word "Desktop". Lol. Untrustworthy garbage, then.)

- Chapter 2, Note 32:
It's akin to the famous graphic[....] Wikipedia gives a good summary: "Lincoln-Kennedy Coincidences Urban Legend" [Wikipedia URL].

What if somebody has edited it since then and it is no longer a good summary? Wikipedia is free. At least take a snapshot of the article you're citing at the time you're viewing it and host it on your own website, where nobody can edit it. Then cite that.

- Chapter 5, Note 16 in its entirety:
Leicester City data from "Performance Record of Clubs in the Premier League," Wikipedia [Wikipedia URL] and "Leicester City F. C.," Wikipedia [Wikipedia URL].

How do these scientists, these professors, not know better?! This isn't a finger pointing at interesting bonus reading, this is an actual citation. The paragraph it's justifying (pp. 140-141) is about statistics!

- Chapter 5, Note 22 is a long one that includes citations of the Guardian, the New Yorker, Vanity Fair, and then, oops,
"Fantoni and Nunes Cheating Scandal," Wikipedia [Wikipedia URL], "Cheating in Bridge," Wikipedia [Wikipedia URL], "Fisher and Schwartz Cheating Scandal," Wikipedia [Wikipedia URL].


- Chapter 6, Note 26, in its entirety:
The use of "ph" rather than "f" might be an allusion to the repeated "ph" in an earlier form of hacking known as "phone phreaking": "Phishing," Wikipedia [Wikipedia URL].

This is where I became frustrated and shouted at the book "Just cite the source!" Then when I opened up the "Phishing" Wikipedia article to find the source, the information was not there! Go to that page and Ctrl+F "phreak". Go to that page and Ctrl+F "phr". It wasn't there on August 16, 2023, just 43 days after the book was released, and it isn't there as I write this on August 23.

- Ch. 7, Note 29 begins
Historical world records from Wikipedia: "Women's 100 Metres World Record Progression" [Wikipedia URL]; "Men's 100 Metres World Record Progression" [Wikipedia URL].

Wikipedia is citing something that won't load for me (Aug. 23, 2023), so does that count as good information?

- Ch. 8, Note 1 first cites an Atlantic article but then continues,
The Wikipedia entry on Calloway has a lot more detail about her background and claims: "Caroline Calloway" [Wikipedia URL].


- Ch. 8 Note 5: Literally
Sources for Clark Stanley's story: "Clark Stanley," Wikipedia [Wikipedia URL];

then the Smithsonian Institution, then NPR. The Wikipedia article, as of Aug. 23, 2023, cites the NPR story, plus a book called "Rattlesnakes", plus "Service and Regulatory Announcements" from the United States Bureau of Chemistry, plus an inflation calculator.

So. What the hell, you guys. Is this a new academic norm I'm not aware of, or is this egregious?

Three stars off for the bad citations, one star on for the .
Eduardo Montiel
230 reviews · 5 followers
August 29, 2023
An illuminating and entertaining exploration of the human mind's vulnerability to deception and cons. Through insightful analysis and witty storytelling, psychologists Daniel Simons and Christopher Chabris show us how to avoid being taken in and describe the key habits of thinking and reasoning that serve us well most of the time but make us vulnerable—like our tendency to accept what we see, stick to our commitments, and overvalue precision and consistency.

This book is sprawling and covers various types of “cons”: phishing scams, Ponzi schemes, fraudulent science, fake art, chess cheaters, crypto, marketers, magicians... Some sections are more important than others, but in general it’s a great book that offers practical tools to help us become more rational decision-makers.

My background is in behavioral economics, and I was familiar with most of the cognitive tricks covered in the book. I particularly recommend the full possibility grid as an important tool that everybody should consider using when undertaking serious critical analysis.

4/5. Aug 2023. I appreciated and was surprised by some of these debunkings: Gladwell’s tipping-point fallacy (faulty causation); Lumosity and video games’ lack of positive effect on cognition; Kahneman’s metaphor-driven priming (small sample sizes); the ineffectiveness of nudging (big results don’t usually come from small changes).

Main takeaway: The more people rely on intuition and the less skilled they are at analytical thinking, the more impressed they tend to be with statements so nonsensical that they could be neither true nor false.

Experts in any topic can recognize and interpret more patterns than novices do, so they have a keener sense of when to be suspicious. Their greater knowledge lets them spot bullshitters who only act as though they know their subject. That’s how chess masters deduced that the John von Neumann who entered the World Open wasn’t a good chess player.

Notes
Humans operate with a “truth bias”—we tend to assume that what we see and hear is true until and unless we get clear evidence otherwise. We hear now, believe right away, and only occasionally check later.

“Trust, but verify.” Paradoxically, the more convincing a speaker seems—the more correct and self-evident their arguments feel—the more we need to investigate further.

If the organizers failed the M&M test, the band paid more attention to the staging; in Roth’s words, “We’d line-check the entire production. Guaranteed you’re going to arrive at a technical error.” (Van Halen’s brown-M&M stress test)

We all develop expectations for base rates—what things are common and what things are rare—based on what we experience or hear about the most.

4 Hooks
1. When the information we encounter matches or resembles what we already know and trust, we use familiarity as a signal of its truth.
2. We rely on the consistency of information we encounter as evidence of its veracity.
3. We associate great precision in predictions or evidence with the accuracy and truthfulness of the ideas that gave rise to them.
4. We are attracted to stories of potency, in which small causes have large consequences for our lives and society as a whole.

Survivorship Bias: “What’s Missing”, Full Possibility Grid
Wald’s analysis of the B-17s helped lay the groundwork for the concept now known as survivorship bias. We tend to devote more attention to cases that are still around, neglecting those that are not.

In fact, many business schools structure their curricula around the analysis of case studies of successful companies, leaders, and decisions. But this practice is much like studying only the planes that returned.

Maybe companies with better products, higher sales, and more profits are simply more likely to try the newest marketing ideas. (That is why all the breathless anecdotes about how Google pampers its workers, how Amazon runs its meetings, how Finland’s teachers plan their lessons, or how the US Navy Seals operate will tell you almost nothing about what it takes to become an elite performer in the first place.)

The trouble, of course, is that it’s in our nature to be attracted to—and convinced by—a good story. Stories of marketing wizards and investment geniuses sell lots of books, but when we’re drawn in by a good story, we don’t think about what it leaves out.

To counter this problematic mental habit, we can ask, “What’s missing?” Doing so before making a key decision reminds us to ask what information we actually need in order to evaluate the truth of what we’re being told.

When we think of the full possibility grid, we see the success stories in the top-left box in the context of the other three boxes, which can leave us much less impressed by the handful of incidents or anecdotes that happen to wind up there.

The possibility grid is a universal weapon to draw attention to what is absent. Once you master its logic, you will start to notice so many uses for it that you will wonder how you got along without it for so long.

Whereas we tend to focus on the information we see and not the information that’s absent, the scammer has all the information.
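The grid's logic can be captured in a few lines of code. This is a minimal sketch with made-up counts (not data from the book): the vivid success stories all sit in one cell, but the strategy's value can only be judged against all four.

```python
# Hypothetical 2x2 possibility grid. Rows: used the touted strategy or not.
# Columns: succeeded or failed. Only the top-left cell makes headlines.
grid = {
    ("strategy", "success"): 30,     # the stories we hear about
    ("strategy", "failure"): 270,    # usually invisible
    ("no strategy", "success"): 25,  # also invisible
    ("no strategy", "failure"): 225,
}

def success_rate(grid, row):
    """Success rate for one row of the grid."""
    wins = grid[(row, "success")]
    return wins / (wins + grid[(row, "failure")])

with_strategy = success_rate(grid, "strategy")        # 30/300 = 0.10
without_strategy = success_rate(grid, "no strategy")  # 25/250 = 0.10
```

With these invented numbers, the success rate is identical with or without the strategy despite thirty vivid success stories, which is exactly what asking "What's missing?" is meant to reveal.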

The “law of attraction,” often referred to as “manifesting,” states that what you think about will happen. If you were thinking of your friend and they called you, it was because you were thinking of them; if you think of bills, you will get bills…. (you don't think of all the times it didn't come true)

A résumé of failure can also track things we got away with but probably shouldn’t have, times when we should have succeeded but were unlucky, and even things we considered doing but passed on. Looking at this more realistic type of résumé helps us recall actions and events that we otherwise might forget or ignore but that are essential if we want to evaluate what does and doesn’t matter for success.

An alternative strategy, born in the military and intelligence worlds and recently adopted in the sciences, involves bringing in colleagues to serve as a “red team” whose goal is to spot errors in your thinking.

Inviting critics into your tent—a process known in science as adversarial collaboration—may not come naturally, but it can pay big dividends.

Benford’s Law
A principle called Benford’s law describes a regular pattern that results from randomness whenever a value can grow indefinitely and the range of possible values spans at least a few orders of magnitude. It holds true in domains ranging from the volume of lakes to sales revenues to follower counts on social media.

As Nigrini and others have shown, many cases of accounting fraud are detected in part because cooked books contain numbers that deviate from Benford’s law.
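As a sketch of how such a check works (my illustration, not code from the book), the expected leading-digit frequencies under Benford's law are P(d) = log10(1 + 1/d); observed digits that stray far from them warrant a closer look:

```python
import math
from collections import Counter

# Expected leading-digit frequencies under Benford's law:
# P(d) = log10(1 + 1/d), so "1" leads ~30.1% of values and "9" only ~4.6%.
expected = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit_freqs(values):
    """Observed frequency of each leading digit 1-9 in a list of positive ints."""
    digits = [int(str(v)[0]) for v in values]
    counts = Counter(digits)
    return {d: counts[d] / len(digits) for d in range(1, 10)}

# Powers of 2 span many orders of magnitude and are known to follow Benford:
observed = leading_digit_freqs([2 ** k for k in range(1, 1000)])

# A large gap between observed and expected is a red flag for cooked numbers.
max_gap = max(abs(observed[d] - expected[d]) for d in range(1, 10))
```

Fraud examiners like Nigrini apply formal goodness-of-fit tests (e.g., chi-square) to exactly this kind of comparison; a data set that spans several orders of magnitude but has roughly uniform leading digits deserves scrutiny.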

Unreliable Memory
In reality, as 150 years of scientific studies have shown, memory is reconstructive as well as recollective. When we have the feeling of retrieving a memory, we sometimes are constructing a version of a past event that combines information from different sources. What seems to be a single coherent memory can be a mash-up of experiences that happened at different times and in different places.

The notion that reality was changing and fragmenting so that different people were truly experiencing different sequences of events (rather than misremembering the same sequence in various predictable ways) became known as the “Mandela Effect,” and it is increasingly cited by people as a justification for holding on to vivid memories that depart from what is generally recognized as reality.

“What am I assuming?”
To guard against being deceived in these and similar ways, ask yourself, “What am I assuming?” before making big purchases, agreements, or investments and before drawing conclusions. Explicitly identifying relevant commitments and reframing them as tentative assumptions is the only way to systematically evaluate whether our decisions rest on shaky foundations.

The “bullshit asymmetry principle” states that the amount of energy needed to refute a heap of nonsense is an order of magnitude greater than that required to produce it.

Statistical Analysis
Before computing the statistics that will say whether an experiment “worked,” we are supposed to look at how the numbers are distributed, how noisy or smooth the lines are, and whether there are observations that point to potential flaws in our methods.

A surprising number of highly touted, publicly traded companies lack clear business models or simply do not do what they claim to do, yet investors buy their stock without asking the right questions.

Aagaard tells students to open doors by asking themselves three questions: What is the worst-placed piece on the board? Where are the weaknesses? What is my opponent planning? This checklist almost always leads to an idea for a good move to play.

Asking “What better options do you have?” or “What are your two best options?” might work better than asking “Are there any other options?” which invites a “No” response.

Ponzi Schemes / Madoff
A Ponzi scheme is now defined as a business structure in which early participants profit at the direct expense of later ones. Commonly, victims think their money is being used to trade real assets. In reality, some is being stolen by the operators, some is paid to other investors as “profits,” and the rest is held in reserve to pay for future withdrawals so the scam can continue.

The consistency of a smooth upward trend was the Madoff scheme’s unique “value proposition.” He returned 7–14 percent each year, even though the overall market fluctuated from annual gains as high as 37 percent to losses as great as 25 percent during the years of the scheme. Consistency removes the discomfort of uncertainty and eliminates fear of risky negative outcomes. (solves risk aversion when facing losses)

The proliferation of schemes like Madoff's likely results from another aspect of our preference for consistency, beyond risk aversion and loss aversion: our poor understanding and unjustified dislike of consistency’s opposite, “noise.”

In any complex system, where many factors contribute and interact, performance in the short term should vary a lot; we shouldn’t expect to see long-run averages reflected perfectly in short-term returns. Asking “Where is the noise?” can trigger us to investigate suspiciously smooth performances.

Familiarity: The Recognition Heuristic
The recognition heuristic is a rule we instinctively use to evaluate which of two options we should prefer—regardless of the context. The rule simply says, “When in doubt, pick what you recognize.”

Even if all book endorsements were genuine and the blurbers truly believed everything they wrote, we have no idea how many people were asked to endorse the book and declined. As we noted in our discussion of the principle of focus, rapturous blurbs—and job references and letters of recommendation—tell us nothing about the neutral, negative, or unwritten reactions we weren’t shown.

And if there are few or no negative reviews and a ton of positive ones, you should take the positive ones less seriously.

According to an analysis in the Wall Street Journal of the performance of thousands of mutual funds over a period of decades, only a small percentage of funds receiving Morningstar’s highly touted five-star rating performed well enough to still have that rating five years later. In fact, about the same number of five-star funds became bottom-ranked one-star funds! Past performance never guarantees future returns, but in this case, it barely even predicted them.

When confronted with a precise claim, cost, or value, we can ask a question. If the number is meant to be impressively large, ask, “Is that a lot?” If it is meant to be surprisingly small, ask, “Is that a little?” Depending on the nature of the claim, we can ask follow-up questions.

Concrete vs Abstract Ideas
We have an easier time remembering concrete information because it is stored in memory using both a verbal code and a pictorial one. Abstract ideas, by contrast, don’t call to mind specific, universal images. Concrete words like “quail,” “sex,” and “cocaine” activate regions at the back of the brain that handle visual information. Abstract words like “science,” “research,” and “addiction” activate the frontal lobes, which process information independent of any particular sense.

Concrete words are also better able to trigger strong emotional responses, which strengthen our memory for them.

Accuracy & Precision: Don’t be fooled by predictions
Accuracy and precision, though often confused, are fundamentally different concepts. An accurate measuring tool gives something close to the right answer on average. A precise measuring tool gives a detailed, consistent answer, regardless of whether it’s right or wrong. The claim of an exact, optimal positivity ratio of 2.9013 has more “truthiness” than truth—it is a precise claim, but that precision can give us the false impression that it’s also accurate.

In psychology and other social sciences, many studies lack a suitably powerful telescope to provide reliable answers to their empirical questions.

In reality, personality is less like a distinct category and more like a constellation of many traits, each of which can vary over a wide range, resulting in an explosion of unique combinations.

Small Sample Sizes
Being guided by a tiny sample of recent experiences is the worst way to use data; we almost never have enough evidence for a reliable conclusion, but we always have enough to be fooled.

If you happened to pick two good stocks that outperformed an index fund, do you have enough evidence that you can consistently beat the market?
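The stock-picking point can be made concrete with a quick simulation (my sketch; the per-pick probability of beating the index, here 0.45, is an assumed figure, not one from the book):

```python
import random

random.seed(0)  # reproducible runs

def lucky_streak_prob(n_picks=2, p_beat=0.45, trials=100_000):
    """Estimate the probability that EVERY pick beats the index by pure luck,
    assuming each pick independently beats it with probability p_beat."""
    wins = sum(
        all(random.random() < p_beat for _ in range(n_picks))
        for _ in range(trials)
    )
    return wins / trials

# With zero skill, "both my picks beat the market" still happens
# roughly 0.45**2 of the time, i.e. about one run in five.
estimate = lucky_streak_prob()
```

Two data points feel like evidence, but a roughly one-in-five fluke rate means they are nowhere near enough to conclude you can consistently beat the market.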

Replication Studies: small causes usually don’t have strong effects, butterfly effect fallacy
But in the years since Kahneman insisted that these metaphor-driven priming results were beyond question, many have been subjected to independent attempts at replication, and none has emerged unscathed.

Kahneman later admitted that he had been wrong to place so much trust in “the results of underpowered studies with unreasonably small samples” and that he had blinded himself to their implausible potency. (i.e. walk slower when primed by “aging”)

Social psychologists and researchers in marketing, consumer behavior, and the branch of applied behavioral science known as “nudging” like to study the effects of these sorts of subtle wording interventions on important, real behaviors and judgments. If they worked, they would yield practical benefits at little cost. But their true effects almost always range from small to nonexistent.

Complex problems usually require multipronged solutions, if they are solvable at all, and rarely yield to the proverbial “one simple trick.” We should meet any claim that contradicts this principle with a demand for the strongest level of evidence.

When any product or process is said to have powers that are surprisingly broad, unique, or out of proportion to its cost, our deception detectors should start to tingle. If Nobel Prize winners can be taken in by poorly supported assertions of potency, so can the rest of us.

“Could this even be false?” Try replacing the abstract platitudes and complex words with simple, concrete ones that convert an obscure claim into an easily understandable one.

AI Debunked
What seems like intelligent conversation often turns out to be a bull session with a bot whose cleverness comes from ingesting huge volumes of text and responding by accessing the statistically most relevant stuff in its dataset. A bot has no commitment to telling the truth because its code does not incorporate the concept of truth.

Always keep in mind that—like current examples of artificial intelligence—human expertise is limited, not general: It provides a big advantage in a small domain.

Postmortem
The decision scientist Gary Klein described a similar process he called the “premortem.” Before starting a project, agreeing to a deal, or making a big investment, ask yourself, “If this goes horribly wrong, what will be the most likely reason(s)?”

Balancing Cost & Benefits of Preventing Fraud
You too should consider whether it would be better both for your pocketbook and for your peace of mind to avoid sweating the small stuff and accept that you’re bound to be fooled once in a while. Is it possible that the cash register didn’t give you the discounted price? Sure. Is it worth checking every line on your receipt every time you shop to verify that each price was correct to the penny? Perhaps not.

Many organizations fail to properly balance the costs and benefits of checking; they don’t do the calculation to determine whether the cost of fraud prevention is worth it. They might even spend more money establishing policies and enforcing compliance to prevent fraud than the fraud itself costs them.

When the consequences of a potential fraud would be significant, in addition to checking more, we should try to think like the scammer.

Sahar
294 reviews · 3 followers
July 15, 2025
good thing my anxiety makes me second guess everything!!!!!!
Lilli
108 reviews
April 9, 2024
Good read, a little repetitive. Good information though.
1,191 reviews · 7 followers
November 29, 2023
Good overall view of how our biases give us incorrect information. Nothing really groundbreaking.
Cheryl
606 reviews · 3 followers
February 24, 2024
I would rate this book 3 1/2 stars. It was an interesting look at how we are all fooled, no matter how smart and analytical we may be. There are lots of examples of how hucksters fool us, how we choose to believe the familiar even when it really doesn’t make sense, and how people who we believe to be reputable, and our friends (think Bernie Madoff) are able to manipulate and fool us.


The authors give a lot of good advice. For example:

“To make sense of our world, we rely on our experiences to predict what will happen next. When we are wrong, we revise our expectations accordingly. But when our experiences match our predictions, we tend not to question them, and we can be deceived by people who take care to make our predictions come true.” The chapter entitled “Prediction – Expect to be surprised” provides a number of examples and strategies for realizing when we are not thinking carefully enough about what we expected to happen.


“We often interpret consistency as a sign of quality and genuineness, but authentic data almost always include some variability, or “noise.” Looking for a realistic level of randomness and change can help us avoid getting fooled.”


“We rely on a sense of familiarity as a rough and ready indicator of truthfulness and legitimacy. When something rings a bell, but we don’t know why, we should consider the possibility that it is merely similar to the real thing, and someone might be deceiving us.”


The authors include lots of examples of real-life situations where people were fooled, including very highly educated, very smart people who were duped.

My one criticism of the book is that the writing was rather dry. There really wasn’t a lot of wit or humor in the writing. As a result, the book could be a bit dull at times, unlike Malcolm Gladwell, who writes about topics like this in a very witty style. But there is a lot of worthwhile information in this book.
Profile Image for Stephen Haines.
230 reviews7 followers
July 20, 2025
I found very little about this revelatory or even all that interesting. The only bits I really enjoyed were to do with history, which is often recounted here to illustrate one or another example of whatever deception they’re talking about.
Profile Image for David.
420 reviews31 followers
August 5, 2025
An entertaining and useful book. Simons and Chabris discuss ways in which our thinking can go wrong, illustrated by interesting scams from people who’ve intentionally exploited these weaknesses. They make very clear that these biases and heuristics are sensible and even necessary in everyday life, so they try to help the reader understand when they really need to pay attention, break out of these system 1 ruts, and engage system 2 thinking.

I very much enjoyed their slam of the sloppy Malcolm Gladwell (p. 31), and also their taking Daniel Kahneman to task for his overly credulous reporting of priming effects in his Thinking, Fast and Slow (pp. 210–215). This is a good illustration of the fact that we can all be fooled. If Kahneman can fall prey to believing unreplicated underpowered studies with surprisingly large results, we can all be fooled. But Simons and Chabris give us some tools to be fooled less often.

One small issue I had: they keep talking about decimal places, as in “the four-decimal-place precision of their positivity ratio” (p. 189). In my area of science we exclusively talk about significant figures, not decimal places, because the latter is a great example of denominator neglect. 2.145 kg is not more precise than 2145 g, even though the former has three decimal places and the latter none.

Quotable quotes:


Just as committing to an idea can reshape our view of the world, committing to a person can shape the way we think. The concept of trust is often used to explain why people fall for frauds and scams. In our analysis of the factors that make us vulnerable to deception, trust is not a cognitive category of its own; we see it instead as a type of commitment. When we trust a person or organization, we assume that they tell the truth and we fail to scrutinize their claims or apply the amount of critical thinking we deploy against sources we don't trust—sources we do not assume are telling the truth. [p. 85]



the curse of knowledge, reflects the difficulty in imagining and keeping track of what other people don't understand. People are usually reluctant to interrupt a speaker (especially a higher-status one) to ask for clarification for fear of revealing their own ignorance. Without that feedback, we rarely notice our curse of knowledge, and we can fool ourselves into thinking we have conveyed information that we haven’t. [p. 87]



When discussing the nature of instincts, William James wrote, “It takes, in short, what [philosopher George] Berkeley calls a mind debauched by learning to carry the process of making the natural seem strange.” Making the natural seem strange means defamiliarizing ourselves with something by temporarily setting aside what we know in order to evaluate new information more objectively, to see what it actually means. [p. 177]



Kahneman later admitted that he had been wrong to place so much trust in “the results of underpowered studies with unreasonably small samples” and that he had blinded himself to their implausible potency: “I knew all I needed to know to moderate my enthusiasm for the surprising and elegant findings that I cited, but I did not think it through.” [p. 214–215]



Robust conclusions usually require much more data than we think they do. [p. 219]



A recent meta-analysis shows little evidence that brief interventions designed to instill a growth mindset have any real effect on academic performance, the main focus of the mindset movement. [p. 220]



Page references are to the hardback edition with 327 total pages.
Profile Image for Kam Yung Soh.
956 reviews51 followers
January 20, 2024
A fascinating book that looks at why people often fall for tricks and scams that, usually in hindsight, appear so obvious. As the authors explain, it is due to our natural tendency to believe what we see or accept what we have been told as the truth. What this book does is show how scammers take advantage of this tendency, and also try to equip the reader with the necessary ways to look closer or dig deeper for more information to reveal the scam.

The book is divided into two parts. The first part covers the habits we use that make us fall for scams. These habits are:

- focusing too much on what is being presented. This leads us to exclude or ignore other information that would reveal the scam. A prime example is survivorship bias, where we only have information on those who make it (how to be a non-graduate billionaire), but not on those who don't (excluding the experience of numerous non-graduates that never go on to become billionaires).

- predictions that follow our expectations, leading us not to scrutinize the actual results. This also lets researchers falsify data so that the results are what was expected, whether to fulfil their research grants or to get better positions on the strength of the studies.

- being committed to our version of events, leading us to discount evidence that contradicts it. This could lead to false memories of events and belief in conspiracy theories because reality doesn't match the false memory. We also become more critical only of views that don't match our expectations.

- efficiency is the tendency to accept things at face value rather than looking for more information to back them up, like accepting that some people may be good at chess or exams without realizing they may be cheating. If we are already committed to a course of action, even if it is a scam, we may find it hard to back out, because sunk costs make it feel more efficient to continue.

The second part looks at the hooks scammers use to make us believe whatever is being presented.

- consistent data that makes us believe that something must be correct (because it is consistent). Only, in real life, there is always noise in data, so data that is too consistent should be considered suspicious.

- familiarity with the way things are advertised allows scammers to present something that looks familiar but is actually a scam: fake websites that look similar to actual bank websites, for example.

- precision makes us think the results must be true because the results are so precise. Perhaps too precise: studies that present impossibly precise results should be suspect, as the data cannot back up the precision.

- the potency of a social intervention to make us believe it must be valid since it has such a huge effect. Again, the data may show that this potency was exaggerated or only done on a small scale and thus, may not be valid for the general population.

The authors also present ways to see through the scams, from asking for more data, asking the right questions or looking at the scam presentations in a new way that would reveal the contradictions or missing information that would show that what is being shown is not the truth.
Profile Image for Dana.
Author 27 books53 followers
August 2, 2023
First, a disclaimer: Christopher Chabris, one of the co-authors, is a Facebook friend of mine. His book was also published by the same publisher as my book and he had the same editor! So I came to this book with high expectations, and I was not disappointed.

Swindles, scams, con games, Ponzi schemes: these are things you read about, and mostly they seem like things that happen to other people. "I would never fall for that," you might say. Don't be so sure! Simons and Chabris explain the flaws in our mental processing that con artists take advantage of. Habits like focusing on the wrong thing, or accepting claims that seem consistent with what (you think) you already know. We take certain short cuts in our thinking because it's more efficient and 99 percent of the time it does no harm. Some flaws, like trusting other people, are perhaps not flaws at all. In order to survive we have to trust some people.

I guess that one question I had before reading this book was: how common is this, really? It's impossible to be sure, because a con artist's livelihood depends on not being detected. But Simons and Chabris have done their work and assembled a truly massive collection of examples of flimflam. I honestly suspect they had a little bit of fun researching all the clever ways people have been duped. (I also like reading about the clever ways the dupers have been caught.) The sheer volume of their research is one of the most valuable things about "Nobody's Fool." When you read *one* news item about a fraud, say the Theranos fraud that took in so many wealthy investors, it's easy to brush it off as something freakish or unusual. But when you read case after case of people being deceived in the same ways, it becomes much harder to brush off.

Another thing that I liked about this book was that it was written with humor. It would be easy for a book like this to be depressing, but Simons and Chabris somehow manage to keep it light while still being informative. One of my favorite Easter eggs was this quote on page 163: "Instilling a false sense of familiarity is a tactic usable by anyone trying to make products or solicitations seem trustworthy. Even book authors can take advantage of familiarity, for example by adopting the title of a famous novel and award-winning film for their nonfiction book." Get it? (If not, look up "Nobody's Fool" on Amazon and see the books and movies that come up.)

Even though that comment was just a joke, it does make you think of the fact that we are surrounded by advertising all the time, and it's all just a low-pressure version of a con. We've consented to live with it, and we are all victims to some extent.

My only slight criticism of the book is that there wasn't a lot of difference between the chapters. I could sum up each of the eight chapters pretty well in one sentence: Ask More Questions. The questions they recommend vary from chapter to chapter, but what you really need to avoid scams and cons is a questioning mindset. If you are by nature a skeptic, maybe you won't even need this book very much. But for the rest of us, it's a great reminder to not believe everything we're told.
Profile Image for Don Moore Moore.
6 reviews2 followers
August 14, 2023
Dan Simons and Chris Chabris have masterfully crafted a book that explores the inner workings of human trust, intuition, credulity, and confidence. The result is stupendously insightful, delightfully entertaining, and incredibly useful.

From the very first page, I was ensnared by the authors' incredible ability to seamlessly blend scientific rigor with captivating storytelling. "Nobody's Fool" not only challenges our preconceived notions about our own vulnerability to deception, but it does so with an unparalleled elegance that makes even the most complex concepts accessible to readers. I was particularly impressed by their honest and insightful discussion of scientific evidence, credulity, and our vulnerability to fraud. The authors are courageous in discussing cases of scientific misconduct; they bravely call out the individuals and institutions that shamelessly profiteer off lies and shoddy research findings.

I loved “The Invisible Gorilla” and, if it is possible, I was even more captivated by “Nobody’s Fool.” The synergy between Dan Simons and Chris Chabris is nothing short of magical. Their writing style is a harmonious blend of wit, clarity, and intellectual curiosity that transforms even the most complex psychological phenomena into an engaging and enlightening adventure. I am in awe of their ability to weave together personal anecdotes, fascinating examples, and cutting-edge research to create a text that is both enlightening and entertaining.

Simons and Chabris guide us through a journey that uncovers the astounding limits of our attention, perception, and awareness. Each insight is complemented by a compelling story that makes the lessons come to life. The authors' insightful exploration of the tools of deception left me more aware of my own mind’s functioning and my own vulnerability to scams.

"Nobody's Fool" is an extraordinary opus that sets a new standard for non-fiction. Simons and Chabris have gifted us a book that shatters illusions, challenges conventions, and invites us to explore the very fabric of our perceptions. Prepare to be captivated, enlightened, and forever changed by the remarkable insights that await you within its pages.
Profile Image for David.
780 reviews16 followers
August 23, 2023
A great book from the duo who developed the highly referenced Invisible Gorilla video.

Researched over more than 10 years, there are a ton of case studies, stories and anecdotes of frauds and scams from many fields including finance, politics, cryptocurrency, science and business.

The book is divided into 2 parts: Habits and Hooks.

Habits are cognitive habits of how we think and reason that can be weaponized by people who want to fool us while Hooks are features of the information we encounter in our daily lives that we find attractive that can snare us.

The following 4 Habits and 4 Hooks are covered together with the suggested counter:
Habit 1: Focus - ask What’s missing?
Habit 2: Prediction - ask Did I predict this?
Habit 3: Commitment - ask What am I assuming?
Habit 4: Efficiency - ask more questions - What else can you tell me?, What information would get you to change your mind?

Hook 1: Consistency - ask Where is the noise?
Hook 2: Familiarity - ask Why does this ring a bell?
Hook 3: Precision - ask Is that a lot? or Is that a little?
Hook 4: Potency - ask What’s the active ingredient? or Compared to what?

The conclusion suggests a few more questions we can ask:
- Could this even be false?
- Am I making a simple mistake?
- If this goes horribly wrong, what will be the most likely reason(s)?
- Is the cost of ensuring that you’re not being deceived balanced appropriately against the pain you would suffer if you were?

A must-read, not just to understand how we fall for deception, but to dig deeper into why, by uncovering our cognitive biases and learning how to overcome them by slowing down and asking targeted questions.
Profile Image for V.
836 reviews5 followers
November 3, 2024
Useful information for the gullible.

(Hint: we are all gullible sometimes.)

I consider myself to be some combination of skeptical and risk-averse so I rarely get myself into actual trouble by believing something that isn't true. But sure, I get taken in by disinformation from time to time, especially when it's something I want to believe. Knowing the specific ways people are vulnerable--what kinds of information (and from whom) they are more likely to believe--could help people avoid being victimized.

I especially enjoyed reading about how scams and fraudulent datasets have been discovered and disentangled (e.g. using Benford's law) and that some of the same principles might be at work both in service of scammers and in service of those who would attempt to stop scammers. For example, a scammer might use the very tired Nigerian Prince con (itself just a modern variation of the centuries-old Spanish Prisoner) in order to filter out all but the last few people who haven't heard of it, thus saving the time they would have spent engaging with a more canny mark who never would have paid. Persons detecting such a scam might, conversely, spend a lot of time engaging with a scammer in order to waste the time of that individual and try to get the scammer to find a new line of work. (With bots, I imagine this vigilantism becomes much less effective.)
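(An aside of my own, not code from the book or the review above: the Benford's law check mentioned here can be sketched in a few lines. In many genuine magnitude datasets, the leading digit d appears with frequency log10(1 + 1/d), so roughly 30% of values start with 1; fabricated, too-uniform numbers stand out when you compare observed digit frequencies against that curve.)

```python
import math
from collections import Counter

# Benford's law: in many natural datasets, the leading digit d
# appears with frequency log10(1 + 1/d), so 1 leads ~30% of values.
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(x):
    """First significant digit of a nonzero number."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def benford_deviation(values):
    """Mean absolute gap between observed leading-digit frequencies
    and Benford's predictions; larger gaps are more suspicious."""
    digits = Counter(leading_digit(v) for v in values if v != 0)
    n = sum(digits.values())
    return sum(abs(digits.get(d, 0) / n - p) for d, p in BENFORD.items()) / 9

# Powers of 2 famously track Benford's curve closely; an evenly
# spaced, made-up sequence of numbers does not.
natural = [2 ** k for k in range(1, 200)]
fabricated = list(range(100, 1000, 7))
print(benford_deviation(natural) < benford_deviation(fabricated))  # True
```

Real forensic use, as in the fraudulent-dataset investigations the book describes, involves proper statistical tests and domain judgment; this only shows the core comparison.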
Profile Image for Dallin.
6 reviews
December 6, 2024
I really liked this book. The topic was interesting and got me thinking. I appreciated several of the practical recommendations like the four quadrants.

My only critique is that I was left with the feeling that the approach this book seems to espouse can be somewhat close-minded and dismissive of anything that falls outside established norms.

For example the author seems to disregard any alternative approaches to healing, fairly citing a lack of clinical trials or FDA approval. My issue is that just because we don’t completely understand something or it hasn’t been researched extensively doesn’t mean there can’t be value or potential benefits. (To be clear I’m not talking about ivermectin, I don’t want to wade into that debate.)

I think of Ignaz Semmelweis, who faced staunch resistance to hand washing, in part because germ theory wasn’t developed until decades after his work. How many lives could have been saved if people were more willing to look at the evidence and accept his proposals?

In the end, I guess this may be one of the main points the author makes - we must be willing to examine the evidence, think rationally, then make careful and logical decisions.
Profile Image for Kate.
1,125 reviews43 followers
September 1, 2024
Both insightful and informative, this book lays out a number of cons, the techniques that made them successful, and how these tactics are used on the average person regularly. We are all vulnerable to being taken in, some often and others less so, and understanding how, and pausing before making decisions based on intuition, is vital to decreasing these opportunities. I loved the section that discussed how things that feel intuitively right and fall in line with what we would think are the most likely to be inaccurate and misinformed, because we don't check as studiously and are quicker to accept them. Anyone can fall for this, including experts in their own fields, so it is extremely important to recognize this shortcoming in all of us as humans and make sure we evaluate information that affirms our own beliefs and opinions with greater scrutiny than information we disagree with. Overall, a fantastic and important read. I wish that it had gone a bit more in depth in certain areas, but an interesting read nonetheless.
Profile Image for Charles Reed.
Author 334 books41 followers
February 6, 2025
85%

There can even be a lack of morality in science, apparently. A lot of non-validated, non-verified studies out there that people are quoting, from books that I've read, even. And actually, I've read some of this stuff; I just don't think it's realistic to think it's true. But man, I can see why a book like this would be freaking skeptical. Because with there being that many negative... They're teaching you things, and then you find out that they're fake. Like, oh man, you suck. I mean, you gotta laugh about it, though. You can cry about it if you want, I suppose, but what's that gonna do? Man, that's the human condition. So many scammers. It just bums me out a little bit. So I have to laugh extra hard that it's even in scientific studies. You can't have morality and rigor even in a field that's specifically devoted to explaining that 1 plus 1 equals 2 and not 3. Yikes. I'm sure I can assume that most studies are good, but oh man, I even heard about some of those studies. Oh, that's a huge bummer.
Profile Image for Jeffrey Rasley.
Author 19 books42 followers
July 31, 2025
This is an engaging book by two academics, a psych prof and a neuroscientist, about "the art of the steal." It is a lesson in how and why people are taken in by scammers, hucksters, con men, and politicians. They describe the different techniques used by the bad guys, why those techniques work, and what we can do to protect ourselves from getting screwed. At the root of it all is our instinct to "trust our gut" (remind you of a successful con man turned even more successful politician?) rather than being skeptically analytical.
Also read Kurt Andersen's "Fantasyland: How America Went Haywire: A 500-Year History" for a delightful history of bizarre religious and political movements from the Colonial Era through Trump that have duped American rubes.
Profile Image for Rajesh.
31 reviews
April 14, 2025
Nobody’s Fool by Daniel Simons and Christopher Chabris is a sharp, engaging must-read that cuts through the noise of deception with clarity and wit. The authors masterfully blend psychology and real-world scams like Madoff and Theranos to reveal why we all fall for cons, from phishing emails to fake science. Their four cognitive traps (narrow focus, truth bias, premature commitment, and efficiency over accuracy) hit home, backed by compelling stories and solid research. The practical tools—pause, verify, question—are actionable and empowering, though they demand discipline to apply. This isn’t just a book; it’s a wake-up call to think smarter in a world of trickery.
Profile Image for Carlo Corti.
681 reviews5 followers
September 10, 2023
Simons is a great researcher, but there just wasn't a lot of new ground covered here. The authors spent a lot of time talking about Theranos, Madoff, and other big scams, but didn't offer anything really original in terms of how to avoid being scammed. I guess the advice just seemed obvious, but I liked their emphasis on the fact that we do have a truth bias--because that makes our lives a lot more efficient.
Profile Image for Tim Dugan.
718 reviews4 followers
July 16, 2023
Some of the early parts were a bit derivative of other books on related topics, such as the Van Halen M&M story that I've read about a dozen times.

But other than that it’s mostly good information

But I wished for more

Especially what would be timely would be more information about how not to be fooled in politics
Profile Image for Janice.
2,183 reviews2 followers
September 23, 2023
Interesting. Very focused on scientific research and how this can be used in a scam. And sure, Madoff and some art forgeries are mentioned, but I had hoped for some information on the day to day for the rest of us.

The advice was pretty much to watch for things you might be biased toward. It might have been fun to use some examples of frauds we may have heard about.
Profile Image for Lorrie.
720 reviews1 follower
April 28, 2025
Fakers gonna fake, fake, fake, fake, fake.

This book details how scammers scam, why people believe them, and gives us some ways to think about things so that maybe, just maybe, we won’t get sucked into that scam. It was very interesting, and used some well known examples - Elizabeth Holmes and Theranos came up a few times, as did Bernie Madoff. Highly recommended.
Profile Image for S.
192 reviews
May 28, 2025
It took the authors a little over two hundred and fifty pages to define and explain fraud to the reader, using many Bernie Madoff examples. In my opinion, this book was fifty percent too long. It began in an interesting way, but quickly lost me because some illustrations seemed repetitive and, as I mentioned before, Bernie Madoff was used too many times to make their point.
Profile Image for Tom Coleman.
Author 2 books14 followers
July 14, 2023
Wonderful nonfiction book which explains how scams use our brains’ “blind spots” to trick us into trusting. Very readable and meant for the general public, it’s both interesting and funny. I literally read the whole thing basically nonstop in two days! Loved it!
Profile Image for Sharron.
2,430 reviews
August 18, 2023
Thoughtful points but at times the statistical data made my eyes glaze over particularly in chapters 7 and 8. However, Chapter 6 Familiarity - Discount What You Think You Know was excellent. If you read nothing else in this book, read that chapter.
260 reviews4 followers
November 26, 2023
Covers ground that anyone keeping up with behavioural science would be familiar with.
The angles presented are different and refreshing, and a healthy reminder to look out for common traps relating to confirmation bias, not looking for counterfactuals, etc.

Worth a read
