First, the author. William Poundstone is a professional author with 15 books in his publication list. They’re mostly about science, the issues surrounding science, and the philosophy of science, with titles such as “Gaming the Vote: Why Elections Aren’t Fair” and “Rock Breaks Scissors: A Practical Guide to Outguessing and Outwitting Almost Everybody.” He also wrote “Are You Smart Enough to Work at Google?,” which makes him an interesting edge case for me (especially since I work at Google).
Poundstone’s day-to-day thinking and writing is about what people know and how well they can use that knowledge to understand the world. That’s what this book is really all about.
It’s really a book-length treatment of a number of surveys he did about what people know about the world, and how that influences their income over time. Although he’s not precise about the number of surveys, or when they were run, or how he set up the studies, if you’re willing to believe that he ran the surveys correctly, then the results are damnably frightening. In short, there are two big findings: (1) people don’t know a great deal about the world (broadly speaking), and (2) a breadth of knowledge is correlated with happiness, health, and wealth. That is, as the title suggests, you still need to know facts about the world in order to navigate it with any kind of depth of understanding or efficiency. The more you know, the better off you are in many ways. This is called the “Knowledge Premium,” the benefit of knowing a lot about many things.
Now, I agree with that, but it’s a tough proposition to prove. So I have a bit of potential confirmation bias working here, but at least I know about the existence of confirmation bias, so I can watch out for it. If you don’t know about confirmation bias (or the availability bias, or the Dunning-Kruger effect), you’re working without much awareness of how your brain works… and that can lead to bad outcomes.
He starts the book with a detailed telling of the Dunning-Kruger effect: if you have low ability, you tend to over-estimate your ability, because it’s difficult to estimate your own knowledge accurately (psychologists call this “illusory superiority”). That is, most people rate themselves as above average in almost any area of expertise, and the less competent you are, the more this is true.
The book is full of data about how little people actually understand. Example: almost everyone “knows” Einstein, but only 48% can say what he did for a living; 30% know who devised the theory of relativity. Likewise, only 9% of people can name Frida Kahlo’s husband, while 5% mistakenly identify her as the painter of “American Gothic.” They don’t know the size of the federal budget, or the difference between the federal deficit and the federal debt. It’s sobering.
The second section is about the Knowledge Premium—with many of his own surveys showing a correlation between broad knowledge, wealth, and overall benefits in life.
The last section is about “Strategies for a Culturally Illiterate World,” that is, ways to get by with your limited time and cognitive resources. The answer is to be mindful about what you learn, and to notice that news sources with broad coverage (e.g., NPR, the NYTimes, the WSJ) tend to teach you more about the world at large, while also giving more depth about the arguments and the specific details.
The biggest surprise of the book is that so many people are SO convinced about their beliefs (religious, political, or economic), yet have little actual knowledge about them. In a bizarre upending of order, those who have the strongest beliefs on a topic tend to know less about that topic. The more they know, the more they admit there are multiple explanations and strategies. (For instance, people who believe strongly that the US should take military action in Syria are also the least likely to be able to find it on a map, or to say anything factual about the country. By contrast, people who can list Syria’s neighbors have a much more nuanced understanding of the issues surrounding it.)
This is a book worth reading (although I wish he’d been more precise in describing the surveys he did).
A few bon mots…
p. 34. Describing Wegner and Ward’s work on looking up results on Google: it can make you feel smarter about the topic than your test performance actually shows. (It boosts your self-assessment.) (See Wegner and Ward, “How Google Is Changing Your Brain,” Scientific American, Dec. 2013, pp. 58–61.)
p. 196. “Reality is that which, when you stop believing in it, doesn’t go away.” Quote from Philip K. Dick.
p. 221. “We are what we do with our attention.” Quote from John Ciardi.
p. 228. Philip Tetlock: experts often are no better at accurately predicting future events (from his 2005 study, “Expert Political Judgment”).
p. 222. His chart on the correlation between knowing a broad range of “easy facts” and income. Interesting side note: knowing more “hard” facts did NOT strongly correlate!
p. 264. In general, readers of broad news aggregators (e.g., Google News) scored LOWER than people who read a few news sources that have broad coverage (e.g., the NYTimes). An aggregator does not substitute for a good general-purpose newspaper.
p. 273. Kahan’s melting ice cap question: “Climate scientists believe that if the North Pole ice cap melted as a result of human-caused global warming, global sea levels would rise—true or false?”
(The answer is false: nothing would happen to the water level. But you need to know that the north polar ice cap is floating ice, and that melting floating ice makes no change in the water level, just as the ice melting in your tumbler in the midday sun doesn’t make it overflow.)
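A quick aside of my own (this is my back-of-the-envelope sketch, not something the book spells out): Archimedes’ principle says a floating body displaces its own weight of water, so the meltwater ends up exactly filling the volume the ice was already displacing:

\[
V_{\text{displaced}} \;=\; \frac{m_{\text{ice}}}{\rho_{\text{water}}} \;=\; V_{\text{meltwater}}
\quad\Longrightarrow\quad
\Delta(\text{water level}) = 0
\]

(To first order, anyway; this ignores the small density difference between fresh meltwater and seawater.)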
p. 275. People tend to echo the beliefs of those around them (rather than determining them on their own).
p. 277. “The issue is not just rational ignorance (remaining ignorant because the cost of acquiring knowledge outweighs the benefits) but something deeper. To form opinions on the scientific and technical issues driving public policy today—climate change, net neutrality, stem cell research, genetically modified organisms—it is not enough to [just] learn some facts. One must deliberate over those facts and actively seek out evidence that challenges what one wants to believe or initially suspects to be true. This is not something that many average citizens have the time or inclination to do. We fake our opinion, going along with the crowd. Kahan warns that: ‘…this style of reasoning is collectively disastrous: the more proficiently it is exercised by the citizens of a culturally diverse democratic society, the less likely that they are to converge on scientific evidence essential to protecting them from harm.’”
p. 281. Deliberative polling is a method to teach a group of people about a complex topic. First, you take a poll on the topic. THEN, you teach a class on that topic (with all perspectives represented, as unbiased as you can make it). This adds knowledge to the group and gives them time to deliberate on it. THEN you re-poll and look for changes. Expensive. See Fishkin and Luskin, “Experimenting with a Democratic Ideal: Deliberative Polling and Public Opinion,” Acta Politica 40 (2005): 284–298.
p. 291. “those with context knowledge are better able to think for themselves.”
p. 295. “The one thing you can’t Google is what you ought to be looking up.”