
The Bias That Divides Us: The Science and Politics of Myside Thinking

Why we don't live in a post-truth society but rather a myside society, and what science tells us about the bias that poisons our politics.

In The Bias That Divides Us, psychologist Keith Stanovich argues provocatively that we don't live in a post-truth society, as has been claimed, but rather a myside society. Our problem is not that we are unable to value and respect truth and facts, but that we are unable to agree on commonly accepted truth and facts. We believe that our side knows the truth. Post-truth? That describes the other side. The inevitable result is political polarization. Stanovich shows what science can tell us about myside bias: how common it is, how to avoid it, and what purposes it serves.

Stanovich explains that although myside bias is ubiquitous, it is an outlier among cognitive biases. It is unpredictable. Intelligence does not inoculate against it, and myside bias in one domain is not a good indicator of bias shown in any other domain. Stanovich argues that because of its outlier status, myside bias creates a true blind spot among the cognitive elite--those who are high in intelligence, executive functioning, or other valued psychological dispositions. They may consider themselves unbiased and purely rational in their thinking, but in fact they are just as biased as everyone else. Stanovich investigates how this bias blind spot contributes to our current ideologically polarized politics, connecting it to another recent development: the decline of trust in university research as a disinterested arbiter.

256 pages, Hardcover

Published August 31, 2021


About the author

Keith E. Stanovich

24 books · 170 followers
Keith E. Stanovich is Emeritus Professor of Applied Psychology and Human Development at the University of Toronto and former Canada Research Chair of Applied Cognitive Science. He is the author of over 200 scientific articles and seven books. He received his BA degree in psychology from Ohio State University in 1973 and his PhD in psychology from the University of Michigan in 1977.


Community Reviews

5 stars: 37 (41%)
4 stars: 30 (33%)
3 stars: 16 (17%)
2 stars: 4 (4%)
1 star: 2 (2%)
Displaying 1 - 15 of 15 reviews
William Cooper
Author · 3 books · 312 followers
July 3, 2024
This is a really important book that deserves way more attention than it gets. Stanovich's myside bias is one of the primary drivers of the dysfunction in American politics today. Yet few are familiar with his thesis. 

Myside bias “occurs when people evaluate evidence, generate evidence, and test hypotheses in a manner biased toward their own prior opinions and attitudes.” 

This bias causes people to amplify what fits their preexisting beliefs and to diminish what doesn't. It's a powerful filter that distorts how people perceive reality. As a result, two people with opposing worldviews will see very different worlds—just like two musicians with different sheets of music will play two very different songs.

Of course, most people understand there's bias and partisanship in politics. But several essential things aren’t well understood. First,  myside bias is powerful and widespread—it's not just a few quirky misunderstandings at the margins. According to Harvard professor Steven Pinker—a big fan of Stanovich and of this book—myside bias is “probably the most powerful of all the cognitive biases.”

Nor does myside bias only impact one political persuasion and not the other. It fundamentally impacts how most people view politics. As Stanovich explains, liberals and conservatives alike “accept and reject science depending upon whether the conclusion aligns with the political policy that maps their ideological position.” 

Myside bias, moreover, is just as pervasive among smart and knowledgeable people as everyone else: “Research across a wide variety of myside bias paradigms has revealed a somewhat surprising finding regarding individual differences. The magnitude of the myside bias shows very little relation to intelligence." 

Yale professor Dan Kahan has actually shown that intelligent and informed people can be the most biased of all: “The capacities associated with science literacy can actually impede public recognition of the best available evidence and deepen pernicious forms of cultural polarization.”

Myside bias thus not only makes people, of all political stripes, think their group is good, but also—mistakenly—that the other group is bad. Or, as Pinker puts it, myside bias makes us think “that our own tribe is virtuous and wise and knowledgeable and the other tribe is evil and stupid and ignorant.”

One of the primary causes of our political dysfunction is myside bias. It degrades our discourse and deepens our divide. We'd all be better off if a lot more people read this book. 
Barry
1,223 reviews · 57 followers
November 4, 2025
This is one of the best books I’ve read this year and it deserves to be more widely read and appreciated. Its findings are important and have significant implications in our polarized political climate.

Stanovich reviews and explains the psychological research about the puzzling phenomenon known as myside bias. Myside bias is similar to, but distinct from confirmation bias. Stanovich clarifies that confirmation bias is the “bias toward looking for positive tests of the hypotheses that are focal in our minds.” Belief bias is “the bias that occurs when we have difficulty evaluating conclusions that conflict with what we know about the world.” Myside bias is “the bias that occurs when we evaluate evidence, generate evidence, and test hypotheses in a manner favorable toward our prior opinions and attitudes,” our convictions, worldviews, or “distal beliefs” (as opposed to testable beliefs which can be verified by observation).

An everyday example of myside bias is when football fans disagree over a penalty for unsportsmanlike conduct. Fans of one team interpret the behavior as clearly worthy of a penalty, while the fans of the other team view it as simple exuberance. What we strongly believe significantly affects how we interpret the data.

One of the interesting features of myside bias is that it is an outlier in at least three ways. First, unlike other types of cognitive biases it is not predictable from standard measures of cognitive and behavioral functioning. People with higher IQs are more likely to avoid other types of thinking biases, but are just as likely to suffer from myside bias as anyone else. Second, unlike other types of cognitive errors, myside bias may not always be irrational and may sometimes help us as individuals, but is likely damaging at the societal level. Third, myside bias in one domain is not a reliable predictor of bias in another. Instead, the degree of myside bias is correlated with the strength of our convictions about a specific issue.

Matt wrote an excellent summary and review of the book here:

https://www.goodreads.com/review/show...

Since Matt already did the work of summarizing Stanovich’s findings and theses I won’t attempt to recapitulate that here, but I recommend reading his review before finishing mine, and then I’ll just add a couple thoughts and some quotes.
No really. I’ll wait.

Stanovich also points out that we tend to think that we’ve arrived at our beliefs through a process of rational decision-making, but this is often not actually true. Many of our beliefs are inherited or adopted because they simply fit our temperaments. And some beliefs act like memes (as described by Dawkins) that replicate like viruses.

In another chapter he discusses how psychologists have created studies which purportedly indicate that conservatives are more prejudiced or have other sorts of cognitive deficiencies, but Stanovich shows how these studies are actually demonstrations of myside bias in action—they were designed to yield the results that their authors already believed must be true. Subsequent studies have debunked all of these findings.

The final chapter discusses how we might ameliorate some of the harmful effects of myside bias. It’s important to understand that arguing about “facts” will not persuade those with different political views. The issue is not believing different facts but holding different values. Or more accurately, a different hierarchy of values.


Here are some passages I highlighted:

‘Lilliana Mason (2018a) found that the degree of affective polarization between groups of political partisans was driven much more by group identity than by issue-based ideology. Statistically, partisan identification was a much stronger predictor than actual differences on specific issues…Mason concluded: "The power behind the labels 'liberal' and 'conservative' to predict strong preferences for the ideological in-group is based largely in the social identification with those groups, not in the organization of attitudes associated with the labels."’
[p 47]


“Carney and Enos (2019) further concluded that these modern racism scales captured not racial resentment specifically connected to conservatives, but instead racial sympathy specifically connected to liberals (see also al Gharbi 2018; Edsall 2018; Goldberg 2019; Uhlmann et al. 2009). For conservative subjects, the so-called modern racism scales served to measure not their racism but instead their belief in the relative fairness of current society in rewarding effort; and, as such, these scales have been mislabeled from their very inception. For liberal subjects, the scales did serve to measure something specifically directed at blacks, but that something, if anything, was liberal subjects' tendency to display a special affinity toward African Americans—or perhaps their awareness that African Americans were the target group with the highest payoff in terms of virtue signaling.
More recent studies have shown that psychological relationships involving prejudice and tolerance are contingent, depending on the congruency between the values of the subjects and the values of the target groups in the stimuli used in the study.”
[p 79]



‘In short, my correspondents seemed to be engaging in what political scientist Arthur Lupia (2016, 116) calls the "error of transforming value differences into ignorance"—that is, mistaking a dispute about legitimate differences in the weighting of the values relevant to an issue for one in which your opponents "just don't know the facts." Years ago, in an elegant essay, Dan Kahan (2003) argued that this was exactly what had happened in the debate over gun control, which had been characterized by the "tyranny of econometrics" —arguments centered on "what studies show" in terms of lives saved by the use of weapons in self-defense, deterrence of crime, and the risk factor of having a gun in the home. Kahan (2003) further argued that the central issue would never be resolved nor would compromises ever be reached by mutual agreement on what the facts were because at the center of the debate was culture: what kind of society Americans wanted to have. Gun control proponents were swayed by the value they put on nonaggression and mutual safety enforced by the government. Gun control opponents were swayed by the value they put on individual self-sufficiency and the individual right of self-defense. These values tracked urban versus rural demographics (among others), and the weighting of these values by a particular individual in either of the opposing groups was not going to be much affected by evidence, if at all. Kahan (2003) recommended instead a more open and unconstrained discussion of cultural differences. He advocated for a more expressive debate in which people would feel free to articulate their values and cultural differences. But, after noting that "the hope that the gun control debate can be made less contentious by confining it to empirical arguments is in fact an idle one," Kahan (2003, 10) warned that "the unwillingness of most academics, politicians, and ordinary citizens to engage in a frank airing of their cultural differences ultimately deepens the acrimonious quality of the gun debate." I think that a combination of strategic advantage and myside bias leads cognitive elites to resist the expressive debates about culture that Kahan (2003) recommended. With a view that clearly reveals their myside bias, they believe that if a dispute can be resolved by reasoning about facts, then they will always win because they are the experts on facts and reasoning.’
[p 127]


‘The way to get liberals to reason with less myside bias about the priority given to climate change initiatives is to remind them that they also care about the poor people who will be the first victims of any slowing in economic growth. Likewise, the way to get conservatives to reason with less myside bias about the priority given to climate change is to remind them that they also care about having a livable world for their children and grandchildren.’
[p 131]


“The issue-by-issue inconsistency of most people suggests a way we might alleviate myside bias fueled by partisan polarization. My recommendation would be to follow the lead of the nonelites. Go ahead and be "inconsistent" from your party's point of view. Stop trying to be consistent because many of the issues that are bundled together to define our political parties are simply not linked by any consistent principle. They have instead been bundled for political expediency by partisan elites on both sides.
“If major political issues really should not cohere very strongly with one another, then maybe those of us who wish to tame our own tribe-based myside bias (Clark and Winegard 2020) should be skeptical about taking a position on an issue we know nothing about just because that's the position our party takes.”
[p 142]


I asked ChatGPT for a summary and critique of the book and once again it makes me wonder why I even bothered typing all this out. But here it is:

https://chatgpt.com/s/t_68c77960c21c8...
Matt Berkowitz
92 reviews · 63 followers
January 4, 2024
This is easily one of the best books I’ve read this year. Stanovich is an incredibly clear writer and a leading cognitive psychologist studying “rationality”. This book is about “myside bias”—known in popular parlance as confirmation bias—and how myside bias is an outlier bias. The ability to avoid most other cognitive biases correlates moderately with intelligence. For example, the ability to avoid the overconfidence bias correlates at 0.38 with IQ (Stanovich, West & Toplak, 2016). But myside bias has either zero or sometimes a negative correlation with IQ.

So, even the smartest among us still fall victim to myside bias to about the same degree as everyone else. Why is this? Stanovich explains this from multiple angles. One is that myside bias is less likely to affect our “testable beliefs” (beliefs that can more easily be checked against an empirical reality) and more likely to affect our “distal beliefs” (our convictions based on our worldview and deeper values, e.g., our stances on immigration, abortion, or sex/gender). These distal beliefs are more often a function of our tribal identities—something “cognitive elites” are no less prone to having—than they are a function of seriously thought through positions.

Likewise, myside bias is less a function of psychological characteristics and is more “domain specific”; that is, personality and cognitive characteristics of a person are not predictive of myside bias, but specific belief content is. The stronger we believe in a proposition (especially of the distal belief variety), the more likely we are to exhibit myside bias in evaluating it. It’s the strength of belief that predicts myside bias, not the direction of belief. So, being moderately pro-life or pro-choice may not induce myside bias, but being a diehard believer in either stance is much more likely to.

There is also essentially no partisan difference in the tendency to fall victim to myside bias. Liberals and conservatives (in the US) exhibit similar degrees of myside bias for their core beliefs—sometimes on different subjects. Both sides deny science when it suits their tribe, but they tend to deny the science on different subjects, e.g., conservatives deny climate change, evolution, and Covid vaccine safety & efficacy more readily; liberals deny the heritability of IQ, the safety of nuclear power, and sex differences more readily.

Stanovich finishes with a lengthy chapter on what we can do to avoid myside bias—some of which include:

1. Realize that we often differ from our political opponents not because they’re ignorant and we’re not, but because of legitimate underlying core value differences

2. Realize that we, ourselves, hold conflicting values. For example, liberals are more concerned about climate change, but their solutions often involve curtailing economic growth; however, economic growth is vital to alleviating poverty, another core liberal value. These two values are thus in conflict. In contrast, conservatives are more in favour of free market capitalism, yet also are more likely to want to preserve traditional family structures; yet, there is perhaps no more disruptive force to traditional social structures than unregulated capitalism. These two values are thus in conflict.

3. Recognize that many of our distal beliefs are, in part, a function of our genetic predispositions and social milieu as opposed to having rigorously thought through our positions.

4. Realize that political tribalism is contributing much more to myside bias than the actual political issues are. Work on decoupling your specific beliefs from party affiliation.

5. Oppose identity politics (the “common-enemy” kind), which magnifies myside bias. Stanovich discusses in detail how identity politics—the preoccupation with interpreting issues through the lens of the historically “oppressed” vs “oppressors", and a general obsession with immutable traits (race, gender, sexual orientation, etc.)—distracts from our ability to dispassionately evaluate arguments and evidence (i.e., the scientific worldview). Many universities have largely abandoned their commitment to the scientific worldview in favour of an identitarian perspective that eschews speech contrary to the increasingly leftist monoculture affecting parts of the university (e.g., the pervasiveness of “diversity equity inclusion” committees/statements; the increase in deplatforming from the left; and the increasingly intolerant “social justice” activism from various student bodies).

I can’t say enough good things about this book. If you’re interested in rationality, how to think better, and how these issues have influenced our polarized society, it’d be hard to find a more important book. My highest recommendation.
Brandon
11 reviews
September 2, 2021
The great pro: This is an important topic.  If we're going to solve any of the problems we face as a society, or even the more personal problems we face as individuals, understanding myside bias is an absolutely necessary step.  Stanovich does an excellent job describing why myside bias is unique among cognitive biases, in its prevalence and in its impacts.  

The great con:  Stanovich goes off the rails when he gets to the section on identity politics.  He can't define it but he knows he hates it.  The irony of this professor, in a rant about people delusionally fantasizing that they are victims, concluding (based on a handful of anecdotes including *gasp* his personal experiences) that the real problem here is the *censorship of professors* was too much for me.  

I could go on, but I think it's simpler to provide quotes from his book and alternative books to read.

Stanovich: 
"By this reasoning, your arguments count for more if you have earned a higher medal in the victim Olympics."

"In our universities, identity ideas have been jumbled together with various strains of Marxism..."

"Back in the 1970s, I exclaimed 'science doesn't care about your feelings.' If I were to declare that in a classroom now, it would lead to a visit from the 'bias response team'."

Other books:  The Scout Mindset, Uncivil Agreement, and The Truth about Denial.  All of these other books explain the same human tendency to seek, evaluate, and remember new information in a way that re-confirms our prior beliefs.  They all point out that liberals are as irrational as conservatives.  They even all explain that hardened "identities" can harden convictions rather than leave them open to new evidence.  Yet none of these other books cherrypick data on Black men's encounters with police or attack strawmen versions of the gender pay gap.
Moh. Nasiri
334 reviews · 108 followers
October 7, 2021
"سوگیری که ما را چند دسته می سازد:

دانش و سیاست تفکر طرف من" آخرین اثر کیث استانوویچ استاد روانشناسی دانشگاه تورنتو است که توسط انتشارات ام آی تی به چاپ رسیده است. متاسفانه استانوویچ برای مخاطب ایرانی و حتی جهانی کمتر شناخته شده است. او کارهای بسیار جالبی در زمینه خردگرایی انسان ها انجام داده است و ضریب خردگرایی از ابداعات اوست. او در آثار دیگر خود مدعی است که در انسانها علاوه بر ضریب هوش، توانایی خاصی در تفکر خردگرایانه وجود دارد که همبستگی کمی با هوش متعارف نشان می دهد اما سرمنشا قدرت نقد و استدلال و نپذیرفتن خرافات است. در کتاب فعلی به یکی از جالب ترین سوگیری های بنیادین بشر بنام سوگیری طرف من می پردازد. در این سوگیری انسان ها بشدت شواهد بیرونی، اتفاقات و گزاره ها را به نوعی درک و تفسیر می کنند که موید اعتقاد قبلی آنها باشند. عده ایی به این سوگیری تایید هم گفته اند اما استانوویچ به تفاوت اساسی بین خطای تایید و سوگیری طرف من اشاره دارد. کتاب مملو است از اشاره به آزمایش های مهم در علوم شناختی در زمینه شناخت انگیزه مند و جهت دار که به یکی از آنها در فایل قبلی بنام "شناخت های جهت دار ما" اشاره کردم. او معتقد است که این سوگیری با زمینه های طبقاتی و اجتماعی افراد مرتبط است و باور دارد که در همه جناح ها چه محافظه کار و چه لیبرال با شدتی کمابیش مشابه دیده می شود. نکته جالب دیگری که به آن اشاره می کند این است که وجود سوگیری طرف من از شخصیت افراد بر نمی خیزد و عاملی فراگیر نیست. همان انسان هایی که در حوزه های خاصی سوگیری شدید طرف من دارند در حوزه های دیگر کاملاً طبیعی عمل می کنند. بعبارت دیگر این خطا نه محصول صفت افراد، بلکه خاص سوژه مورد مناقشه است. محافظه کاران گاهی هنگام بحث درباره سقط جنین، آزادی حمل اسلحه و حقوق همجنسگرایان این کژ کاری شناختی را نشان می دهند و لیبرال ها هنگام بحث درباره حقوق اقلیت ها، برابری جنسیتی و یا تعرضات نژادی ممکن است دچار آن شوند. او معتقد است که اصولاً مغز برای ارزیابی بیطرف شواهد تکامل نیافته است و پردازش جهت دار شکل غالب در کارکرد است که البته این جهت داری ارزش تکاملی و بقا داشته است. یکی از این ارزش ها همسویی و هم جهتی با افراد هم فکر و هم گروه و تقویت اتصال بین آنها است. یعنی بواقع سیستم شناختی ما در مقابل بسیاری شواهد نابینا می شود تا در ازای آن همبستگی و انسجام گروهی حفظ گردد.
لینک شرح صوتی کتاب:

https://www.instagram.com/tv/CUsqw4Fo...

#Azarakhsh_mokri
Erika
446 reviews · 22 followers
July 4, 2024
Many thoughts on this book.

First, despite not always following the social science jargon, I found the first four chapters or so a compelling read. Admittedly, I found them compelling because they confirmed my own myside bias, which was that basically everyone I encounter on social media (in my case, largely liberals/leftists, even though I too consider myself a liberal/leftist) has a huge and frustratingly unacknowledged myside bias. I've noticed this over the years as commentators soundly critique the arguments of the "other side" for their irrationality, but then seamlessly engage in the same kind of diversionary tactics, obfuscations, misrepresentations, straw men, cherry-picking, etc. when it serves their cause. It isn't merely that everyone I know seems to do this regularly - and I'm sure I'm guilty of it as well - it's that basically no one seems to think it is a problem. This means that someone like me, who (I will give myself credit here) *does* tend to be hyperaware of what the other side might argue, and who constantly acknowledges weak points or refuses to use the "my team rah rah" strategy that interlocutors seamlessly employ, is forever unable to present their ideas in the way ideas these days are, I guess, supposed to be presented: not as a series of good-faith questions but as a goddamn football game. And I have always hated sports.

I permanently deleted my account after I posted something to the effect that most Americans probably wouldn't mind a dictatorship provided that the economy did okay and they agreed with the dictator. I said this after a poll showing nostalgia for the Trump years, but one of my very lefty friends jumped in to say that she wouldn't mind giving up civil liberties if it meant a government that ensured full employment, education and housing. She proceeded to say that "freedom of speech = the freedom to starve" and suggested she was in favor of a one-party state. Aside from proving my initial cynicism all too well (and proving it wasn't a right vs left thing, as I quickly added in response to her comment), I spent several days reeling from this. Despite ideological and other conflicts in the past that had taught me that she had very little/no ability to see anyone else's point of view and, moreover, didn't even think there was any value to so doing, I thought she was smart, and what she was saying, which sounded like bringing back the Soviet Union, just seemed so ludicrous. But such is my brain that I was able to think myself into a hole of wondering if my entire support for liberal democracy such as it is (which is more or less along Churchill's lines of "the worst government except all the others") wasn't just a product of intellectual laziness/cowardice and an awareness that, had I been raised in another political system, I'd probably believe in its superiority as well (not that this individual WAS raised in such a system, but apparently she'd come to think it better). But it wasn't the nagging feeling that maybe I just lacked imagination when I supported freedom of speech and human rights that really got me. It was feeling like I "should" be the kind of correctly put-together ideological person who didn't face dropping into an epistemological loop to re-establish the innate rationality of my beliefs every time something like this happened. So I decided I just wasn't going to expose myself anymore to a place where discussions weren't discussions - they were supposed to be wars.

So on that level, this book made a lot of sense and felt like a relief. Yes, this was a thing, a psychological fact, an erroneous way of reasoning. And I've noticed over the years that it is the very smartest people who do this the most. I first picked up on this when I worked as a political opinion poller in college. I was struck by how few people actually understood our questions in the way that they were "supposed" to be understood - i.e. by extremely ideologically "literate" people - and thus how many people's political ideas just seemed so random. Undoubtedly they've become more coherent over the past two decades as information has diffused through the internet and our political landscape has become more polarized, but what I did notice back then was that it was the seriously politically involved who had the most pre-packaged beliefs and the least understanding of others. So there too it was nice to have my own experience supported. I also found the whole memetic theory of ideas fascinating, and this book made me want to read more about how ideological convictions are formed.

That said, I'm docking it a star because of the unnecessary length - and sometimes almost deliberate obtuseness - of the polemical parts of Chapters 5 & 6. The author's understandable annoyance with what he (rightfully IMO) terms the political monoculture of academia (minus economics & business) led him to get carried away with some distracting screeds that are sometimes difficult to support. For instance, at one point he claims that the Democratic Party's positions against charter schools and in favor of student loan forgiveness are about "political coalitions rather than principle" since they supposedly "violate" the Democratic principles of being for the poor and the minorities. However, one could just as well argue that the Democrats believe that education should, in theory, be a public good free to all. See, no principle violated there. Anyway, there was really no need to carry on about all this, although I've said for years that I think it's troubling how many fields of academia have so few conservative voices (and I have, obviously, been roundly reproved for saying so).

All in all, interesting book, and I will be reading more in this area.
95 reviews · 4 followers
January 7, 2024
Challenging Assumptions: Unveiling Counterintuitive Insights in Politics and Cognition

Overall Rating: 4.75/5.0

The Bias That Divides Us is an excellent book for anyone interested in politics or psychology, but especially for those who love the counterintuitive. In the book, Keith Stanovich discusses Myside Bias (MSB), an outlier among the well over one hundred cognitive biases humans are known to have. It is an outlier since the vast majority of cognitive biases show a negative correlation with general intelligence, while MSB does not. MSB further fails to show a correlation with thinking dispositions. This is even true of open-mindedness, which, by its very definition, one would think would be certain to be negatively correlated with MSB.

Stanovich's book is, thus, loaded with conclusions that will surprise many. At the same time, however, it also presents research confirming things that many who have discussed politics, philosophy, and religion with others will have suspected all along while not being aware of recent empirical research backing their conclusions.


Overview of Contents

Stanovich begins the book by describing what MSB is and how, although it has been assumed to be confirmation bias in the past, it is actually something different. This is because MSB deals with distal matters, i.e., ideological beliefs, especially in areas regarding morality that cannot be confirmed or falsified experimentally. Many have noticed that beliefs of this sort are highly resistant to change through rational discussion or otherwise. Stanovich presents research showing that people do not generally update their confidence in beliefs on distal matters optimally according to Bayesian modeling. Indeed, often, the opposite happens: they become more entrenched in beliefs even when contradicted by the evidence.

Although it has been said that every equation a book includes cuts its readership in half, Stanovich does present the math. It should, however, be understandable by anyone who has taken an introductory course in probability. Even if not, the key theorem is ultimately explained clearly in terms as simple as multiplying fractions.
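The theorem in question is Bayes' rule. In odds form (my notation, not necessarily the exact presentation in the book), the "multiplying fractions" amounts to

\[
\frac{P(H \mid E)}{P(\neg H \mid E)} \;=\; \frac{P(E \mid H)}{P(E \mid \neg H)} \times \frac{P(H)}{P(\neg H)},
\]

that is, posterior odds equal the likelihood ratio times the prior odds. Myside bias enters when the middle term, the judged diagnosticity of the new evidence E, is itself set differently depending on which side of the hypothesis H it favors.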

Stanovich explains that an important point is that MSB does not necessarily mean that one has a strong belief on an ideological issue, but, instead, that one fails to update it appropriately in light of new evidence or, relatedly, is not able to state arguments and evidence that run counter to one's position. As a particularly interesting example, Stanovich presents research showing that people's reasoning on gun control and on immigration by Muslims tends to demonstrate inconsistencies since, in Bayesian terms, some key questions about the two issues are very similar.

Having shown that MSB seems non-normative or unreasonable, Stanovich then looks deeper at the research, questioning this conclusion. He shows that MSB may not be as irrational as it seems. A key result here is that if one's prior beliefs are mostly correct, letting them influence whether the new evidence is credible is correct. If, however, most of a person's beliefs are wrong, this merely gets them further entrenched in error. Stanovich then questions the very nature of the Bayesian model itself. Is it correct to interpret the search for truth as the only thing one should optimize for? What if this will come at a social cost? For example, what if it would cause a break in your close relationships and community? What if finding the truth came at such a high cost that you did not think it was worth paying for the potential gains? Stanovich concludes there is no right way to weigh these considerations. Naturally, however, most people will maintain that they are not letting these considerations interfere with their quest for truth when they, indeed, perhaps unconsciously or semiconsciously, are.
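To make the projection argument concrete, here is a small simulation sketch in Python (my own illustration, not code from the book; the parameters and the crude "dismiss conflicting evidence" rule are assumptions): an agent who lets its prior color the credibility of new evidence does a bit better when most of its priors are accurate, and markedly worse when most are not.

```python
import random

def accuracy(prior_accuracy, evidence_reliability, project_prior,
             dismiss_prob=0.5, trials=100_000):
    """Fraction of propositions the agent ends up judging correctly.

    prior_accuracy       -- chance the agent's prior belief about a proposition is right
    evidence_reliability -- chance a new piece of evidence points the right way
    project_prior        -- if True, evidence that conflicts with the prior is
                            dismissed as non-credible with probability dismiss_prob
                            (a crude stand-in for projecting the prior onto the
                            credibility of the evidence)
    """
    correct = 0
    for _ in range(trials):
        truth = random.random() < 0.5                                    # the proposition's actual truth value
        prior = truth if random.random() < prior_accuracy else not truth
        evidence = truth if random.random() < evidence_reliability else not truth
        if project_prior and evidence != prior and random.random() < dismiss_prob:
            verdict = prior      # conflicting evidence rejected, prior retained
        else:
            verdict = evidence   # otherwise the agent goes with the evidence
        correct += (verdict == truth)
    return correct / trials

# Mostly accurate priors: projecting them onto the evidence helps (~0.80 vs ~0.70).
print(accuracy(0.9, 0.7, project_prior=True), accuracy(0.9, 0.7, project_prior=False))

# Mostly inaccurate priors: the same habit entrenches error (~0.50 vs ~0.70).
print(accuracy(0.3, 0.7, project_prior=True), accuracy(0.3, 0.7, project_prior=False))
```

The exact numbers depend entirely on the assumed parameters; the point is only the reversal Stanovich describes.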

Stanovich next goes on to describe what many will find counterintuitive. General intelligence, and even thinking styles such as the need to reason carefully, which would seem likely to reduce MSB, do not, in fact, do so. Indeed, Stanovich presents fascinating results showing that the only thing that seems to predict MSB is the strength of belief going in, sometimes in conjunction with whether one leans liberal or conservative. Surprisingly, for about 80% of the population, MSB on one issue is not correlated with MSB on others.

Stanovich's discussion of increased political polarization is particularly relevant to what many have noted about recent political trends. The results here are likely to surprise most: most of the population (about 80%) is not getting more extreme in its beliefs. The roughly 20% who are, however, are what Stanovich calls the cognitive elite, meaning those scoring high on general intelligence. Stanovich explains that what seems to be happening is not that people are aligning more with their side's positions (indeed, he notes there is little consistency across issues) but that people are expressing their disdain for the other side. Stanovich conjectures, and there is recent research on the effects of propaganda to back this up, that the cognitive elites have a greater need than most for consistency of ideas and for an ideology tying it all together. This, however, negates what could be expected to be their advantage in avoiding MSB.


Stanovich next goes on to elaborate on how MSB has led the very people we would expect to be least prone to MSB, namely psychologists studying cognitive biases, to be highly prone to it. Stanovich, for example, shows a long history of psychologists (members of a very liberal-leaning field) building obvious methodological flaws into their research, leading to conclusions that conservatives suffer from intellectual or personality flaws as compared to liberals. An example of particularly egregious methodological flaws is defining conservative beliefs as authoritarian, then running surveys showing that conservatives tend to hold conservative and hence, only because of the researcher's definition, authoritarian beliefs.

The book ends with a general and seemingly well-deserved tirade against wokeism in academia. What good are universities, the best places to prepare students to avoid biases, if they do the opposite: encouraging them to think in obviously biased ways?

Strengths and Weaknesses

The book does a good job of explaining its subject matter to non-professionals. The examples chosen are easy to follow and applicable to the current political situation. Stanovich has dug in to explain when MSB is not necessarily a problem, even though, at first, it might seem the answer should be "always." In a book such as this, where there is some but not insurmountable complexity to the subject matter, there is always a tradeoff between stating things multiple ways so that the reader will get it and becoming too repetitive. Stanovich has done a respectable job of finding this balance.


There are only a couple of real weaknesses. The first is that, in some places, not all concepts are explained in the text. Thus, if you do not know what a Spearman manifold is going in, you will have to consult a search engine or chatbot. The second weakness is that Stanovich does not get into many alternative conjectures about why MSB might exist; instead, he focuses on his favored explanation: memetics.

Readers who lean more toward the political left may think Stanovich has a conservative bias. This is likely true given the last chapter's criticism of wokeism and academia. Considering that much psychology literature is left-leaning, however, and Stanovich does a good job showing how this has harmed objectivity, I do not consider it a significant flaw that the last chapter presents more of the conservative perspective. Even given this, most of the book does a good job presenting examples of MSB implicating both conservatives and liberals.

The ultimate conclusion is that both sides are equally prone to MSB, may have genetic predispositions leading them to one ideology over the other, are equally intelligent, and occasionally show slightly different thinking styles, though these differences balance out, with each side holding some of the more desirable ones.

Conclusion

The Bias That Divides Us is one of my two favorite books I read last year (the other being Why We Sleep: Unlocking the Power of Sleep and Dreams). It presents counterintuitive results regarding who is most prone to bias in the political arena and provides evidence confirming many things most will have already suspected. It does a good job of rescuing conservatives from the biased, incorrect beliefs academics have spread about them, along with pointing to research disputes on many dogmatic beliefs of the left. For example, are women really discriminated against when it comes to hiring and promotion in STEM fields?

Most importantly, Stanovich provides advice on how to limit MSB personally. For example, when you encounter new evidence about a position based on an untestable ideological belief, assume that the direction the conclusion leads to has a 50/50 chance of being right, then evaluate the evidence on its own merits, without bringing in prior beliefs to influence how credible you find it. You can then update your belief about the probability that you were right accordingly.
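As a rough paraphrase of that advice in code (my own sketch; the function name and the example numbers are illustrative assumptions, not anything from the book), the procedure is just Bayesian updating from even prior odds:

```python
def update_from_flat_prior(likelihood_ratio):
    """Posterior probability that the conclusion is right, starting from 50/50.

    likelihood_ratio -- how many times more likely the new evidence is if the
                        conclusion is right than if it is wrong, judged on the
                        evidence's own merits rather than through your prior
                        ideological stance
    """
    prior_odds = 1.0                               # the recommended 50/50 starting point
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

print(update_from_flat_prior(3.0))   # evidence 3x likelier if the conclusion is right -> 0.75
print(update_from_flat_prior(1.1))   # weak, ambiguous evidence -> ~0.52
```

The discipline lies in assessing the likelihood ratio before letting the distal belief weigh in, not in the arithmetic itself.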

Overall, the book is a must-read for anyone interested in the psychology behind how we process our ideological beliefs.
Hemen Kalita
160 reviews · 19 followers
December 12, 2021
Excellent. The book explores in depth our partisan biases. It is heavily cited. Stanovich, the author, is an authoritative figure in the field of psychology and he himself has carried out a number of studies mentioned in the book.
4 reviews
February 7, 2022
I just finished reading The Bias That Divides Us: The Science and Politics of Myside Thinking. It is a rather academic book that is filled with careful, clear arguments. It is now my go-to source for dealing with not only political beliefs but other value-laden beliefs. One of his major theses is that we as a community have to separate our empirical beliefs from our values. For example, it is a matter of science whether women are paid less than men on average, including the reasons why. The question of what to do about it is an instrumental rationality question, i.e., what are your goals based on your values and what trade-offs are you willing to make to achieve those goals (my example, not his). He goes through the psychology of protected values, aka sacred values, which is so helpful. There are a lot of footnotes in this short book as well as 54 pages of references at the end. I highly recommend it, though it will be tough going if you are not used to this type of prose. I would add that Stanovich is not an ideological person, and reading him confirms my decision to become politically independent in order to decouple my beliefs from tribalism. There is no logical connection between many political beliefs, so it is better, in my view, to evaluate each one on its own merits relative to your values.
John Crippen
553 reviews · 2 followers
January 17, 2022
Professor Stanovich describes myside bias (I know, what a terrible name) in detail and proves how it is an outlier compared to the vast majority of other cognitive biases due to the fact that the smarter you are, the more likely you are to suffer from it. Then in the last two chapters of this short book he covers how the myside bias of the cognitive elite is ruining our political discourse and destroying the credibility of our universities. Most of the terrible problems covered in the later chapters are exacerbated by the rise of identity politics, which itself is enabled/caused/supported by myside bias. Lots to unpack. I hope Professor Stanovich, author of one of the top 10 books in my life so far (The Robot's Rebellion: Finding Meaning in the Age of Darwin), is not "cancelled" because of this interesting and hard-hitting book.
Chris Boutté
Author · 8 books · 278 followers
December 22, 2021
This is by far one of the best books I’ve read about polarization and “myside thinking”. Stanovich draws on some of the best research I’ve seen about how we’re affected by this form of tribal thinking. Aside from the psychological studies he cites to explain why we’re blinded by tribalism, he also brings up some great philosophical topics. I read a ton of books, and it’s rare that one really gives me something to think deeply about, but this one was able to do just that. At one point in the book, Stanovich argues that myside thinking may actually be the most rational thing a person can do based on how we’ve evolved as social creatures. This book gives you a ton to think about while also explaining some of the polarization that’s tearing us apart. I highly recommend everyone check this book out.
Vincent Tijms
47 reviews · 5 followers
July 26, 2022
This book offers a nuanced overview of research into myside bias and it brings insights into how we might minimize myside bias when it's not normatively epistemically rational.

And then the book sets off into a diatribe about identity politics infesting and harming academia. It's not completely off-topic and to be fair it's also one of the better critiques I have seen, but it leans heavily on Pluckrose-and-Lindsay style alarmism and uncharitable interpretations, which makes it a rather "mysided" ending to an otherwise balanced book.
Jack Kubinec
21 reviews · 2 followers
April 17, 2023
The best-articulated case I’ve read for why universities are uniformly liberal and why large segments of the public have ceased to trust them.
Sadjad Esfeden
29 reviews · 1 follower
December 31, 2024
A good review of different biases and myside bias, rational thinking, confirmation bias, etc. I think I need to re-read this once in a while!
Dimitris
32 reviews · 5 followers
August 20, 2025
Great book, but in the final chapter Stanovich falls victim to his own anti-Marxist myside bias, and this is very obvious from his tone and vocabulary. Some of his arguments are valid, though.
