You show someone clear evidence—documents, numbers, even video—yet instead of changing their mind, they double down. If you’ve ever had a political discussion, you’ve probably seen it happen. It can feel irrational. But what if it isn’t?
Beliefs are about belonging
We like to think we form opinions by weighing facts. In reality, especially in politics, beliefs are often tied to something deeper: belonging. Psychologists describe this as identity-protective cognition, the tendency to accept information that fits our group and reject what does not.
In practice, this means we are not just asking whether something is true. At the same time—often without noticing—we are also asking what it means for who we are and where we belong. Once a belief becomes part of that identity, changing it is no longer just an intellectual step; it can feel like a social one.
An ancient survival instinct
This pattern has deep roots. For most of human history, survival depended on staying within a group. Being excluded could mean losing protection, food, and support, so our brains became highly sensitive to anything that might threaten our place in that group.
That instinct still shapes how we respond to information today. When new evidence clashes with the views of our group, it does not feel like a small factual correction; it feels like a threat to our standing. Instead of calmly updating our beliefs, we tend to defend them—by questioning the source, dismissing the evidence, or shifting what we consider important.
Research on identity-protective cognition shows something even more striking: people who are better informed or more skilled at reasoning are often more polarized, not less. They use those abilities not only to understand the world, but also to defend the position of the group they identify with.
Loyalty over consistency
Seen from this perspective, something that often looks puzzling becomes easier to understand. In many populist movements, supporters defend positions that contradict earlier statements—or even their own previous views. From the outside, that appears inconsistent. From the inside, it can feel coherent.
The goal is not to remain consistent with past facts, but to remain aligned with the group. When contradictions arise, people adapt. They may deny the evidence, deflect attention, or quietly adjust what they think matters. In some cases, even personal standards shift in order to maintain loyalty.
What looks like changing beliefs is often something else: a consistent effort to protect identity.
Why more facts don’t solve it
It is tempting to think that the problem is simply a lack of information. If people knew more, they would change their minds. But the research suggests otherwise.
We are more likely to accept information that supports our identity and to resist information that threatens it. Corrections do not reliably help; in some cases they can even entrench existing views. In that sense, disagreement is not just about facts. It is about meaning, belonging, and status.
A different way of seeing disagreement
Once you see this, political arguments start to look different. They are not only clashes over truth, but also struggles over identity.
And if changing your mind feels like losing your place in the world, no amount of facts will be enough. Understanding that may not end the argument, but it does explain why these discussions rarely lead to agreement.
Further Reading
Dan M. Kahan, Misconceptions, Misinformation, and the Logic of Identity-Protective Cognition
Dan M. Kahan et al., Culture and Identity-Protective Cognition
Flynn, Nyhan & Reifler, The Nature and Origins of Misperceptions
Jonathan Haidt, The Righteous Mind
Joshua Greene, Moral Tribes
