3.5 stars
This is a very interesting, readable, and relevant pop psychology book, with some important insights and far more actionable information than such books usually offer. Some of its claims seem unsupported, some sections feel a bit tangential, and it focuses on the most contentious issues in current American politics, but overall it’s definitely worth a read for those interested in the subject.
The book explores the science and some human-interest stories around changing minds, along with three different methods that various groups have used with notable success. First, some notes on the theory:
- As those interested in the subject likely already know, the truth is tribal. The book explores this quite a bit: one study even found that when people in an MRI machine were exposed to challenges to deeply held views bound up with their group identity, their brains reacted the same way as to a physical threat! Our group membership really is that important to us.
- For that reason, people tend to be impervious to logical arguments countering their views (they also likely have other sources for their own arguments, which they trust more). People leaving hate groups or conspiracy theory communities don’t tend to do it because they’ve been talked out of their group’s beliefs. Instead, they feel less welcome in their group, or begin to feel welcomed in another, and a change of allegiance allows them to change their views.
- The book never really squares this with the fact that many people do in fact hold beliefs counter to their dominant social identification, sometimes even beliefs they wish they didn’t hold. And of course, once you move beyond the most hot-button, heated political and social issues, people convince each other of things with facts all the time.
- A useful model for how people evaluate arguments is the “central route” vs the “peripheral route.” If someone has reason to pay close attention, they use the central route, and critically evaluate the strength of arguments. If the issue isn’t that important to them, or they’re distracted, they use the peripheral route, where weak arguments, emotional appeals, etc., are more likely to succeed.
- A bit of countervailing evidence can actually strengthen our opinions, though there is a tipping point. In one study, participants who received 10-20% negative information about their candidate supported them even more strongly than those who received none. But those who received 40-80% negative changed their minds.
- People will form group identities around literally anything, and then prefer their in-group. Researchers get participants to do this in the lab all the time, with truly meaningless “identities” such as overestimators or underestimators of the number of dots on a page.
- An interesting argument about the purpose of opinions: because none of our personal opinions are likely to impact policy (let alone our actual lives), we primarily hold them as badges of social identity and belonging. The book tries to tie this to evolutionary psychology, suggesting that social identity is the only reason we have these opinions in the first place (which I don’t think really works—humans evolved to live in small bands where your opinion probably would matter), but it’s a helpful corrective to our usual assumptions.
- Another intriguing if questionable argument concerns why we all have so many cognitive biases (like confirmation bias, in which we only heed information that supports our existing beliefs). These biases make it hard for us to evaluate our own arguments, even though we do a great job of finding the holes in other people’s. The book’s explanation is that this division of labor is actually most efficient from the group’s perspective: we evolved to argue so that the group could arrive at the best decision.
Then, there are three methods that have had some success in changing people’s minds. Here’s the one discussed most in-depth, developed by the Los Angeles LGBT center to try to convince people on specific issues:
1) Ask if someone is interested in discussing the issue, and establish rapport.
2) Ask how strongly they feel about the issue on a scale of 1-10.
3) Share a story about someone affected by the issue (whether it’s you or a third party doesn’t seem to matter).
4) Ask for a number rating again.
5) Ask why the number feels right to them.
6) Repeat their reasoning back in their own words, ask if that sounds right, repeat until they are satisfied.
7) Ask if there was a time before they felt that way, and if so, how did they arrive at their current position?
8) Listen, summarize, repeat.
9) Briefly share your personal story about how you reached your position, without arguing.
10) Ask for a final rating and wrap up.
McRaney relates several success stories with this method, which centers on having a non-threatening, non-judgmental conversation about an issue and catalyzing the other person’s exploration of their own reasoning. Early studies of the method’s effectiveness suggest that it can appreciably shift the opinions of 10% of participants in just a 10-20 minute conversation (which, in electoral terms, is huge).
There are definite sampling bias issues here, as those involved have consented to have the conversation in the first place (though often because they are vehemently opposed to the issue in question! Interestingly, people who state their opinions vehemently sometimes then place themselves in the middle of that 1-10 range). The biggest issue with McRaney’s stories, for me, was that they all seem to involve people who already have compelling personal reasons to change their views (most often someone they care about who is personally affected) but somehow haven’t yet worked through them. It’s unclear whether the method only works on people in this situation, or whether McRaney just chose those stories because they seemed most compelling.
Here’s another method, called “street epistemology,” this one geared toward getting people to explore their reasoning about factual claims:
1) Establish rapport, ask for consent to explore the person’s reasoning.
2) Ask them for a factual claim.
3) Repeat back in your own words until they’re satisfied with your summary.
4) Clarify their definitions, and use their definitions, not yours.
5) Ask for a numerical rating of their confidence in the claim.
6) Ask why they hold that level of confidence.
7) Ask what method they’ve used to judge the quality of their reasons, and focus the conversation on exploring their method.
8) Listen, summarize, repeat.
9) Wrap up.
I found the stories about this method even less convincing than those about the first (and this one doesn’t seem to have been scientifically studied). It seems like a fun exercise for those who enjoy Socratic conversations, but even within the anecdotes cherry-picked for the book, no one actually changes their mind. McRaney tells a weird story in which he offers to demonstrate the method for a workshop participant, who proposes as a topic his (the participant’s) belief in God. The participant then shares an emotional story of why he decided to believe after struggling with doubt. McRaney promptly declares that proceeding with the exercise would take away the man’s faith and that this would be wrong, at which point they quit and everybody hugs it out.
Honestly, it came across to me like McRaney just wanted to quit while he was ahead and cede the floor gracefully rather than making himself look like the bad guy. I was not at all convinced that either he or the method was nearly so powerful as he claimed. His stated reason for telling this story is that it’s important to examine why you want to convince someone of something, but he covers that far more effectively in a brief story about trying to talk his father out of a conspiracy theory. Lobbing arguments back and forth frustrated everyone, but when McRaney stopped to say “I love you and I’m worried you’re being misled,” they went on to have a productive conversation.
For completeness’s sake, here’s the third method, which has been tested primarily by political groups trying to change people’s attitudes (about vaccination, for instance):
1) Build rapport, ask for consent to explore the person’s reasoning.
2) Ask where the person is on the issue on a scale of 1-10.
3) If they’re at 1, ask: why would other people be higher on the scale? If above 1, ask: why not lower?
4) Summarize the person’s reasons in your own words until they’re satisfied that you’ve gotten it.
These methods all draw on therapeutic principles: people need to convince themselves, and they need a non-judgmental space to do it in. Arguing, hectoring, and shaming won’t change someone’s views on hot-button issues—though it will change their view of you!
Definitely an interesting book overall and a worthwhile read (hence the sheer amount I’ve found helpful to write down), though some chapters feel more tangential than others. It’s rare among pop psych books in offering so much encouraging and practical information, which readers can put to use in daily life—though perhaps the biggest takeaway is that you can’t change someone’s mind without their consent, or without putting in some real work yourself.