The world is extremely polarized, and it’s been a topic of fascination for me for quite a while now. There’s a lot of finger-pointing going on as to what’s causing polarization, biases, and echo chambers. Social media is one of the primary culprits, with algorithms that create echo chambers of misinformation and bias, but mainstream media and politicians aren’t helping either. But something I’ve been thinking about a lot lately is that maybe the self-deception that comes from believing you and your “side” are right is the most rational thing you can do.
I read a ton of books; as of writing this, I’ve read over 370 non-fiction books in 2021. I love non-fiction because reading books by experts and journalists lets me learn as much as possible and quench my thirst for knowledge. When I get interested in a topic, I read a ton of books on the subject, and it’s extremely rare that one just slaps me in the face with something that shifts my perception of it. Most recently, that happened while I was reading The Bias That Divides Us: The Science and Politics of Myside Thinking by Keith Stanovich.
As you’ve probably guessed, the book is about the bias that comes along with “myside” thinking, and it dives into the insane amount of research that’s been done on the topic. At one point in the book, Stanovich argues that myside bias may be one of the most rational things we can engage in. As soon as I read that, I was like, “Alright. This dude has completely lost his mind.” But as he explained the logic behind his argument, I felt my brain explode because it made so much sense.
The first question we have to ask ourselves is, “What is rationality?”
Most recently, Steven Pinker wrote a book called Rationality: What It Is, Why It Seems Scarce, Why It Matters, which was a great book, but I personally feel like it got way more attention than it deserved. I’m usually reading 5-10 books at a time, and I always make sure one of them reminds me how flawed my thinking can be. So, when I read Pinker’s book, it didn’t bring anything new to the table, but it was well written and sparked a ton of debate about rationality. The problem is that we may be thinking about rationality all wrong.
I’ve been thinking about the topic of rationality even more lately because I’ve been going down the Ayn Rand rabbit hole. I’d heard a lot about her and her philosophy, but I wanted to dive into it for myself. My primary issue with her philosophy is that she seemed to believe she had a monopoly on rationality, logic, and truth. The first book I read was The Virtue of Selfishness, and she opens it with her premise of what it means to be logical and rational. The entire time I was reading it, I was thinking, “Well, she’s wrong about a lot of this based on what science and evolutionary psychology have taught us.”
When you start from a bad premise in a non-fiction book, the rest of your arguments are built on extremely shaky ground.
For most of us, thinking “rationally” means eliminating emotions from the equation and looking at the facts. A lot of books on the subject compare it to thinking like Spock from Star Trek. Even though I’m not a Trekkie, I know that Spock was the rational one and Kirk was the emotional one. Although I fancy myself a utilitarian, I see many of utilitarianism’s flaws and know that at a certain point, emotions and the greater good need to be brought into the equation. This is especially true when you realize that our species would never have survived if we were as “rational” as Ayn Rand thinks we should be. We needed cooperation.
Reading some Ayn Rand made me want to revisit Jon Rauch’s latest book, The Constitution of Knowledge: A Defense of Truth. In it, Rauch argues that truth and knowledge aren’t things any one of us can claim to have or own. Truth and knowledge are something we arrive at as a collective. On the surface, that sounds like some extreme post-modernist thinking, but it’s not. A great example he gives: we may eat something one day and the truth is that it tastes great, but when we’re sick, the truth is that it tastes bitter. Truth and knowledge shift with situations, context, and new information. And this is why I think Ayn Rand needed a bit more humility with her premise.
In one of the first chapters of Rauch’s book, he reminded me of the research of Dan Kahan. For those unfamiliar, Kahan has done some of the best research on polarized thinking, and it’s referenced in just about any good book on the topic of political divides. One thing Kahan studies is what he calls “identity-protective cognition,” a bias we’re prone to that protects our sense of self. But our identities aren’t formed in a vacuum; they’re formed by our social surroundings.
Jonathan Haidt’s fantastic book The Righteous Mind was the first book that really helped me look at the larger picture. When we meet someone from the “other side,” we slap labels on them and make a ton of assumptions without really trying to do any perspective-taking. As someone who grew up on the West Coast in a non-religious family, my identity is going to be much different from that of someone who grew up in a deeply religious part of the South, and I have to remind myself of that on a regular basis. We all have different experiences, upbringings, and social circles.
Although I’ve been curious about group polarization, I initially started by wanting to learn about crowd psychology and groupthink. That interest came after I was canceled on YouTube and watched thousands of people who neglected the evidence and just went along with the group come after me. What I ended up learning was that there’s a very good reason we evolved this way. Even though it seems completely irrational, it might not be.
For thousands of years, we lived in extremely small groups and relied on one another for survival. If you hoped to live and pass along your genes, you had to stick with the tribe, even if it meant believing in some silly stuff.
Think about it for a second. You’re living thousands of years ago, and your group believes in praying to various gods and performing all sorts of rituals. You rely on these people for food and protection. Do you really want to be the person who stands up and says, “Hey guys, I don’t think we have enough scientific evidence that these gods exist”? Everyone would start wondering whether you’re really the right type of person to have in the tribe, and some might even turn against you. That’d be terrible for you, because if they kicked you out of the tribe, you’d be out there fending for yourself.
So, conforming to the group in that context was the most rational thing you could do. Even if you knew there was no evidence for the thing everyone believed, it was beneficial to play along. Even better would be if you actually believed it.
We also need to remember that reasoning is rationality’s close relative. In their book The Enigma of Reason, Hugo Mercier and Dan Sperber argue that reasoning evolved so we could convince and persuade others. And because the best way to persuade others of something is to actually believe it yourself, we evolved a capacity for self-deception. So, even if you didn’t believe what the tribe believed, it was in your best interest to lie to yourself until you did.
I think about this a lot as well because it’s what makes a person’s intentions so hard to know. Twitter, YouTube, mainstream media, and the government are filled with people who spout complete nonsense. Do they actually believe it, or are they just doing it for personal gain? It’s hard to tell, because they may have created a delusion that they actually believe. So, even if you hooked them up to a lie detector and asked them about the false information they spread, they’d pass with flying colors because they truly believe what they’re saying.
But let’s go back to the initial questions. Is self-deception rational? And what is rationality?
Well, that’s where Stanovich’s argument comes into play. Would it be rational to go against the group and possibly die? Probably not. Would it be more rational to deceive yourself and go along with the group so you can survive? It might be.
Stanovich frames rationality here as a simple risk analysis. As we now know, it’s far riskier to go against the group and lose the survival advantage that comes with belonging. So wouldn’t it be more rational to deceive yourself into believing what isn’t true so you can properly conform to the group?
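To make that risk analysis concrete, here’s a toy expected-survival calculation. Every number in it is invented purely for illustration; Stanovich doesn’t put figures on any of this, so treat it as a sketch of the logic rather than anything from the book.

```python
# A toy expected-survival sketch of the "conform vs. dissent" risk analysis.
# All probabilities below are made up for illustration only.

P_EXILE_IF_DISSENT = 0.7   # assumed chance the tribe casts out a dissenter
P_EXILE_IF_CONFORM = 0.1   # assumed chance of exile if you play along
P_SURVIVE_IN_TRIBE = 0.9   # assumed odds of surviving inside the group
P_SURVIVE_ALONE = 0.2      # assumed odds of surviving in exile

def expected_survival(p_exile: float) -> float:
    """Weight survival odds by whether you end up exiled or not."""
    return (1 - p_exile) * P_SURVIVE_IN_TRIBE + p_exile * P_SURVIVE_ALONE

print(f"conform: {expected_survival(P_EXILE_IF_CONFORM):.2f}")  # ~0.83
print(f"dissent: {expected_survival(P_EXILE_IF_DISSENT):.2f}")  # ~0.41
```

Under those made-up numbers, the self-deceiving conformist comes out roughly twice as likely to survive, which is the whole point of the framing: the “irrational” belief wins the risk analysis.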
The truth is that I personally don’t know, but it’s changed how I think about the topic when I look at polarization.
We no longer live in small tribes, but it’s still something to think about. If your entire family, as well as your friends and neighbors, are religious, they may abandon you if you come out as an atheist. Depending on their specific beliefs, even being gay could get you kicked out of the “group.” Then you’re all on your own and had better hope like hell you find a new tribe. We’re social creatures who need human connection for psychological and physical reasons, and our bodies react in intense ways when we feel isolated or alone.
Now, think about the echo chambers of anti-vaxxers or the people who stormed the Capitol on January 6th. For a lot of them, the most rational thing to do was to deny the truth so they didn’t get kicked out of the group. When you try to present them with counter-evidence, identity-protective cognition kicks in, and they get defensive or even aggressive.
Personally, I believe the truth matters above all else, and I’ve been fortunate enough to surround myself with people who care about it. Others might not be so lucky. Lying to themselves and staying in their echo chambers might be the most rational option they have.
This is one of the reasons why I think it’s so important that we offer grace to anyone who leaves a group like QAnon rather than shaming them for the rest of their lives. Leaving a group that’s your entire support system is one of the riskiest things a person can do.
I don’t really know a great way to end this piece, but I wanted to get it out there to offer a better understanding of how we think about rationality and how we view those who seem irrational. I’m not sure what the solution is, but people like Jon Rauch, and many others a lot smarter than me, have some ideas. So, read some books. Learn about why we evolved this way. Then I think we can offer a bit more compassion and hopefully give people more incentive to respect truth and evidence without fearing excommunication from the group.
I’m currently writing a book about how we’re manipulated by the news, social media, technology, advertisers, and each other. It dives into the psychological history of manipulation, our biases, tribalism, and more.
To stay updated, follow me on Twitter and Instagram @TheRewiredSoul and subscribe to the Substack.