You’re probably in an echo chamber
A casual discussion of fear, disagreement and Bayesianism
Like moths to a flame, where online discourse goes, mentions of group disparity follow, be they political, social, or economic. Today's narratives promote a certain divisiveness among their consumers, and I find they have an annoyingly reductive way of plucking false dichotomies from thin air and polarising every concern. Because extremes are lurid and sensationalism invigorates the mind, we are steered into taking sides on issues that are not necessarily mutually exclusive, like containing the pandemic versus economic wellbeing, globalism versus nationalism, science versus religion...
Unfortunately, most issues of human contention, particularly those that speak to emotion, are not as black-and-white (or red-and-blue?) and simplistic as that. But thanks to algorithms, virality, and emotionally charged rhetoric, it's easy to start thinking that way, and before you know it, wham! Herd mentality. Obviously, we feel more comfortable around people who share our opinions and reinforce what we know to be the correct answer. When inundated with the noise of a billion differing views, the inclination towards peaceful unification and consensus only grows, and that warps our perception of the world around us.
The concept of the echo chamber has been around for five or so years now and has already been flogged by plenty of journalists (see: this Forbes article for a better explanation of why it is so endemic). I'm pretty sure it's taught in schools, too. But here's the thing: you're not free from it just because you know it exists. In fact, being educated makes you more likely to grow complacent and fall prey. If you don't think you're in one, you probably are.
Three supporting arguments in PEEL format, and one counter-argument plus rebuttal. Sound familiar? We are indeed taught to identify opposition and recognise biases that differ from our own (albeit in an ineffective, formulaic way). I think most of us will acknowledge that there exist people we do not always agree with — for extremes, think of the boomer auntie sending articles on miracle cures for cancer to your family's WhatsApp group chat, or the SJW friend who appends pronouns to their name every time they write it — and still try to get along with them. But tolerance is far from sufficient.
Merely knowing that there are two sides to every issue does not solve the problem of the echo chamber. In fact, I would even suggest that indifference towards, and avoidance of, those disagreeable aspects (for example, “No point explaining to them, they’ll never change their mind”) is a symptom that you are acting within your own echo chamber.
It’s subtle, but sometimes the very knowledge that the other party is in a different camp from ours is enough to stop us giving them the time of day we would typically give others, even more so if it’s an issue we care strongly about. Avoiding discussion with, or unfollowing, people you disagree with hurts your thinking further, because the lessened interaction means you end up perceiving them only through your existing conviction that their beliefs are incorrect or inferior (again, based merely on what you already know).
Something to chew on: if you genuinely believe your beliefs are robust, even gospel, wouldn’t it make sense to show that they cannot be proven false? Here, I’m borrowing the philosopher Sir Karl Popper’s theory of falsification, which essentially states that we should attempt to disprove theories rather than prove them in order to progress towards the truth.
The Problem II: Your Feelings
Perhaps your rational brain already knows that echo chambers are bad and need confronting, but your monkey brain doesn’t want to internalise it emotionally (I’m sure we’ve all been there). Falsification runs into a great mental barrier here. After all, walking around with the presumption that your beliefs risk being turned to dust is not a nice feeling. Why go out in search of disappointment when it’s much more satisfying to find evidence that proves your intuition correct (confirmation bias)?
Here’s my two cents on why we might feel this way: there’s a certain shame and fear that stems from being wrong, because we subconsciously attach our beliefs to our ego. Associating beliefs with emotion lets our ideals dictate our persona, which eventually forms the self. Because of this, when you ingest some contrarian view, the rational brain does not try to dissect it analytically; rather, it spends its energy defending the monkey brain and protecting the echo chamber the monkey brain has created. This focus on proving oneself right in order to prove the other person wrong reveals the fragility of the very ideals we act so righteously about.
Getting Over It: A Hypothetical Framework
With all that said, how can we learn to think better and ease out of the echo chamber? Based on the above reasoning, we would logically have to remove the emotional attachment to our ideals. Easier said than done, especially if your ideals already define who you think you are. I don’t have the solution for this, but I do think dissociation — calling them ‘theories’ instead of ‘beliefs’ — could work, because it strips away the personal feelings that can blind us to their faults. That means treating your beliefs the way a scientist treats a hypothesis, not the way a parent treats their child.
This leads me to the practice of probabilistic, or Bayesian, thinking. Don’t groan at the math — it’s really more an art than a science! It helps to commit to the idea that believing something does not automatically make it true, and that holding an idea does not mean you are stuck with or defined by it forever. It’s a real shame that people still hold one another to things said in the past, before they knew better, even when they are different people now — but that’s a discussion for another time. With barriers to interaction virtually gone, the world has no business regressing to these tribalistic behaviours.
Think of each belief as a prior: a point on a spectrum of probability, a best guess at how likely it is to be true. As we collect more data by assessing evidence or what others say, that data updates our belief in the direction of whatever is more probable, bringing us closer to the truth. In fact, having our ideas challenged actively contributes to this data-generating process.
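For the mathematically inclined, the updating process above can be sketched with Bayes’ rule. This is a minimal illustration, not a recipe — all the numbers in it are made up for the sake of example.

```python
# Bayesian updating: a belief is a probability, revised as evidence arrives.
# All numbers below are illustrative assumptions, not real data.

def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Apply Bayes' rule to get P(belief is true | evidence seen)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start out fairly confident that some claim is true...
belief = 0.80
# ...then encounter a credible counter-argument: one you'd rarely expect
# to see if the claim were true, but quite often if it were false.
belief = update(belief, p_evidence_if_true=0.2, p_evidence_if_false=0.7)
print(round(belief, 3))  # the belief weakens, but isn't discarded outright
```

Notice that the counter-argument doesn’t flip the belief to zero; it just drags the probability down. That is the whole point: evidence nudges you along the spectrum rather than forcing you to switch camps.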
Having them challenged means seeking out content or people with views that make you cognitively uncomfortable (in the sense that you start to question whether what you know is accurate), forcibly unwinding the confirmation bias rife in your social technology algorithms. It’s worth understanding why they think the way they do, because from there, you can assess for yourself whether they make sense or not based on the evidence, and not based on your judgement of them.
I note that this point might be somewhat controversial, because it requires a hell of a lot of mental fortitude to keep following (and indirectly enabling) people whose opinions you disagree with, particularly if your discomfort stems from trauma or the moral nature of the content or creator (e.g. they’re racist, homophobic, sexual predators, and so on). When approaching new information, try to ask yourself why you feel such a strong aversion to what you’re reading, instead of giving in to reflexive fear. If you can’t justify your disagreement, maybe don’t close your mind off just yet — it could be a chance to learn. Whatever it is, I think it helps to let the rational brain, not primal instinct, decide whether to consume content. (This reminds me of the time I followed Ben Shapiro on Twitter while thinking straight, to the shock and horror of my friends.)
Now, a systematic approach will never perfectly describe a messy world. If it did, there would be no need for me to write this in the first place. It’s hard to discern what is garbage misinformation and what is contributory evidence, which is why I’m merely sharing a framework that I think makes sense, not dictating how to think critically. At the very least, when you see something you don’t agree with, try not to jump to refutations at once. Pause and check whether that’s your rational brain justifying on behalf of your monkey brain. When I do this, I remind myself that I’m not doing it to cater to the other party; I’m doing it to articulate my own thinking, and that takes away most of the indignation.
It’s my hope that we can start to normalise disagreement, especially on online platforms that allow so many common folk to speak up. No two people agree 100% on everything. If you think you know someone like that, chances are they’re just not voicing all of their views. And if they’re not voicing a certain view, chances are it’s not the same as yours. If we hadn’t responded to disagreement so aggressively, we would not have encouraged this divisiveness or spawned new phenomena like cancel culture (not that it is inherently bad, but rather, frequently misused).
Yes, despite having written all this, I’m still occasionally guilty of preferring to talk to people who think the same way I do, judging prematurely, and leapfrogging to conclusions. It’s way too easy to do so — dangerously easy, in fact. Trying to go against this doesn’t make me much fun at parties, but there’s so much more potential in the human mind worth tapping into. So may we make a concerted effort to chase critical thinking and higher minds, rather than indulge our base desires for unity and safety.