The seduction of being right

You’re reading part of The Collapse of Knowledge, a long-form series about what happens when certainty stops working.

These chapters can be read on their own, but they’re also part of a deliberate sequence, starting with how trust in expertise quietly fractures and ending with how to live without needing to be right. If this piece resonates, you may want to begin at the start or explore the full index.

Read the full series here: leehopkinswriter.com/the-collapse-of-knowledge

There is a particular kind of smugness that comes with possessing superior epistemological hygiene.

It is not unlike the satisfaction a cat feels when it has successfully ignored you for precisely the right amount of time: long enough to make you question whether cats actually like humans, or whether they are simply running an extended con involving tuna and warm laps.

Except in this case, the cat was me.

Being right feels good in the body. It produces a small chemical reward, a faint dopamine lift, a subtle straightening of the spine. When someone makes a claim that cannot be falsified and you point this out with calm authority, your nervous system registers a win.

Rightness, as far as the brain is concerned, is rightness. It does not care whether you are correct about statistical methodology or the location of your car keys. The reward is the same.

For decades, I lived on that reward.

At first, the pleasure was legitimate. Using proper methods to discover something true about the world really does feel wonderful. It means you have contributed, however modestly, to the collective human attempt to understand reality.

Then came recognition. Citations. Nods at conferences. The quiet respect that follows people who speak fluently about evidence and uncertainty.

This is where things quietly go wrong.

The human brain is a social organ disguised as a thinking machine. Once it learns that sounding scientifically competent brings approval, status, and belonging, it begins optimising for those rewards rather than for truth itself.

You start looking for opportunities to demonstrate your scepticism. You develop an instinct for spotting bad reasoning. You become fluent in the intellectual eye-roll.

Someone mentions acupuncture helping chronic pain and instead of asking about their experience, you explain the difficulty of blinding trials and the prevalence of publication bias. You are not wrong. You are also not helping.

Truth-seeking has quietly been replaced by identity-maintenance.

This is how scepticism stops being a tool and becomes a costume.

Real scepticism is uncomfortable. It involves admitting uncertainty, changing your mind, and sitting with ambiguity longer than you would like. Costume scepticism is far easier. You just need to know which opinions are socially coded as “scientific” within your tribe.

Climate change is real. Vaccines are effective. Homeopathy is nonsense. All true. All settled.

But the costume fails the moment reality presents something that is not already pre-labelled. When uncertainty is genuine rather than performative, the sceptic costume offers no guidance at all.

It gives you authority without curiosity.

I believed my scientific training had made me immune to bias. That I was rational where others were emotional. Objective where others were tribal.

This, in hindsight, was breathtaking nonsense.

I was tribal. I had simply joined the tribe of people who pride themselves on not being tribal. I was emotional. I had just learned to feel morally superior about my emotions. I was biased. I had simply become better at justifying my biases with footnotes.

The scientific method is powerful. Scientists are still human.

Give a human a reliable way to be right and they will eventually use it to feel righteous.

The academic system does little to prevent this. Grants reward confidence. Conferences reward definitive conclusions. Careers advance more easily for people who sound certain than for people who remain openly uncertain.

We say we value humility. We reward performance.

Social media poured petrol on this dynamic. Suddenly, being clever in public came with instant reinforcement. Thoughtful uncertainty sank quietly. Confident takedowns went viral.

The internet filled with experts performing intelligence for applause.

And I joined in.

The moment I realised something had gone badly wrong came when I noticed I was citing research without checking whether it actually supported my claim. I assumed it did, because I was the kind of person who understood research properly.

This is exactly the kind of motivated reasoning I had spent years teaching others to avoid.

I had started with the conclusion that I was right and worked backwards to the evidence.

The difference between truth-seeking and identity-protection is subtle but devastating. One is open to surprise. The other cannot afford it.

Once your sense of self depends on being correct, learning becomes dangerous. Questions become threats. Doubt becomes personal failure.

And it is exhausting.

Maintaining the appearance of rightness requires constant vigilance. Every opinion must align with your established intellectual persona. Every new piece of information must be filtered not just for truth, but for reputational safety.

Eventually, you stop caring whether you are right at all. You only care whether you look right.

That is when sophisticated ignorance sets in.

Knowing just enough to sound authoritative, but not enough to recognise how much you do not know, is far more dangerous than simple ignorance. It comes with confidence and applause and very few brakes.

I had mistaken intellectual performance for wisdom.

The addiction to being right had served me well. It had built a career, a reputation, and a sense of moral clarity.

But it had also made me rigid, brittle, and quietly incapable of learning anything genuinely new.

The choice, when it finally arrived, was not between truth and falsehood. It was between comfort and honesty.

And comfort, it turns out, is a very convincing liar.
