You’re reading part of The Collapse of Knowledge, a long-form series about what happens when certainty stops working.
These chapters can be read on their own, but they’re also part of a deliberate sequence, starting with how trust in expertise quietly fractures and ending with how to live without needing to be right. If this piece resonates, you may want to start at the beginning or explore the full index.
Read the full series here: leehopkinswriter.com/the-collapse-of-knowledge
Once upon a time, I believed in facts the way other people believe in gravity: utterly, unconsciously, and with the profound confidence of someone who has never been weightless.
This was not a character flaw. It was excellent training.
From the mid-1990s, I operated under the delicious certainty that reality had rules, that those rules could be discovered, and that once discovered, they would stay discovered. Science was not just a method, it was a moral architecture. A cathedral built from peer review, replication, and the quietly revolutionary idea that the universe might eventually stand still long enough to be properly held.
The universe, as it turns out, has other plans.
Picture a thirty-four-year-old version of me discovering Karl Popper’s philosophy of science like other people discover sex or good coffee. The idea that proper scientific claims must be falsifiable felt like the most elegant solution humanity had ever produced.
You couldn’t simply claim something was true. You had to specify how it could be proven false. Astrology failed immediately. Homeopathy collapsed under the slightest scrutiny. Cognitive behavioural therapy for anxiety disorders, on the other hand, offered testable hypotheses, measurable outcomes, and the real possibility of being definitively wrong.
That possibility was intoxicating.
I threw myself into this framework with the zeal other people reserve for religion or CrossFit. Every claim had to survive the falsifiability test. Every theory had to specify its own conditions for defeat. If you couldn’t imagine evidence that would change your mind, you weren’t doing science, you were doing ideology with lab coats.
For decades, this worked beautifully.
I could spot pseudoscience from several suburbs away. Crystals aligning chakras, motivational speakers promising reality creation through thought alone, political certainty disguised as policy. My bullshit detector was exquisitely calibrated, and I was proud of it.
In hindsight, that pride may have been the first crack in the foundation.
If falsifiability was the philosophy, peer review was the practice. The messy, occasionally vindictive, profoundly human process of having other professionals interrogate your work.
I loved it.
No single person could declare truth by authority. Everything had to pass through the gauntlet of trained scepticism. Yes, Reviewer Two was terrifying. Yes, the comments could be brutal. But that brutality was comforting. It meant the system was alive.
When I submitted my first paper in 1997 and received “major revisions required”, I felt relief rather than disappointment. Not rejected. Not accepted. Just human. Fallible. Correctable.
That felt like science.
Replication, though, was where my faith really settled. No single study could overturn reality. False positives would fail to replicate. Signal would remain after noise cancelled itself out.
I loved discovering failed replications, especially when they revealed cultural assumptions embedded in supposedly universal findings. Leadership research that didn’t translate to Australian contexts. Motivation models that collapsed outside North American corporate structures.
Knowledge wasn’t collapsing. It was refining.
We were building something larger than ourselves. Provisional, yes. But cumulative. Something that would outlast careers and egos and still move humanity forward.
Scientific training in the eighties and nineties was systematic indoctrination, and I mean that kindly. We learned not just methods but moral restraint. Statistics became a form of prayer. Correlation does not imply causation. Acknowledge limitations. Always acknowledge limitations.
My limitations sections grew longer with every paper.
This felt like intellectual maturity. Humility sharpened by rigour. Underneath it all was a deeper certainty that the method itself was sound, even when individual studies were flawed.
And then came the dopamine hit.
In 2007, while I was researching workplace stress in Royal Australian Air Force personnel, my data did something unexpected. Senior officers were more stressed than junior staff. According to the literature, this shouldn’t have happened.
I assumed error. I rechecked everything. The data held.
Autonomy, it turned out, wasn’t protective when it came bundled with responsibility for other people’s lives. This wasn’t revolutionary science, but it mattered. It contributed. It nudged understanding forward.
I was hooked.
What no one ever said aloud, but everyone seemed to believe, was that if we followed the method long enough, truth would eventually stand still long enough to be held. Not total truth, but reliable partial truths. Knowledge you could build lives on.
Science promised structure against chaos. Meaning against superstition. A way to live with uncertainty without drowning in it.
And for decades, it delivered.
Vaccines. Surgery. Psychology. Organisational behaviour. Evidence-based progress that improved lives in measurable ways. Why wouldn’t I trust it?
Being a scientist wasn’t just what I did. It was who I was.
I was the one who asked for data. The one who explained why anecdotes weren’t evidence. The one who corrected dinner-party misconceptions with the enthusiasm of a missionary.
I thought I was helping.
What I didn’t see yet was how easily epistemological confidence turns into moral superiority. How method becomes identity. How being right becomes intoxicating.
I believed I had transcended ordinary bias.
That belief, it turns out, was the most dangerous bias of all.
This is where the story begins. Not with ignorance, but with certainty. Not with stupidity, but with intelligence that forgot to doubt itself.
The Collapse of Knowledge – Series Index
- My upbringing in certainty
- The seduction of being right
- The first cracks
- Leaving the West without leaving Western thinking
- When uncertainty becomes livable
- The collapse of trust
- When knowing stops helping
- The exhaustion of vigilance
- Learning to trust differently
- The hunger for authority
- YouTube University
- Identity after certainty
- Performance replaces understanding
- The comfort of camps
- The quiet seduction of certainty
- What survives the collapse
- Living without needing to be right
