Nobody describes it better than Richard Feynman:
> In the South Seas there is a Cargo Cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they’ve arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas — he’s the controller — and they wait for the airplanes to land. They’re doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn’t work. No airplanes land.
Despite this, the cult persisted and kept trying new approaches, never questioning its strategy of mimicry. In the face of failure, rather than saying “maybe we’re misguided in trying to build an airport out of bamboo,” the response was “maybe we just need to wait it out” or “this next thing should do the trick.”
Humans in general seem to fall into this pattern. When something happens that goes against our beliefs, rather than instinctively updating, we double down on protecting those beliefs. Members of an apocalypse cult in Chicago (the group Leon Festinger studied in *When Prophecy Fails*), after the world didn’t flood like they thought it would, only became more fervent and pushed for increased publicity.
But this sort of thing doesn’t just happen in literal cults. Highly intelligent, accomplished people dig themselves deep into mental holes. Charlie Munger criticized an American doctor who was convinced that removing (healthy) gallbladders would cure people of all diseases. Despite mounting evidence to the contrary, the doctor continued his medical malpractice and eventually had to be forcibly ousted from his hospital.
Cargo-cult thinking means only looking for, and only accepting, confirming evidence. When something happens that goes against your belief system, you rationalize and gaslight yourself until your beliefs are safe again.
This mental pattern is to be avoided. At the very least, you’ll waste time doing ineffective superstitious rituals. At worst, you could cause harm to yourself and others.
The problem is twofold: challenging your beliefs feels painful, and your strongest beliefs are so ingrained that they’re invisible to you. It’s easy to see a crack in a building’s wall, but not in its buried foundation.
Understanding cargo-cult thinking is not a solution; the solution needs to be an actionable habit. It’s easy to understand fallacies and biases at an intellectual level. But at a practical level, where the goal is changing our thought patterns and behavior to be less wrong, being able to recite textbook definitions does nothing.
Here’s the actionable habit: ask yourself, regularly, “what would convince you otherwise?” As a thought experiment, let’s imagine how the cargo cult, apocalypse cult, or gallbladder doctor would respond to this question:
- “How could we be wrong? There’s so much to get right. One of us just remembered how the Americans would walk with rifles. So we’re going to make guns out of sticks and march around; we’re optimistic that this is the missing piece!”
- “Ha! My faith in the UFO flood is absolute. The only reason it didn’t happen last time is because they needed us to survive to spread the message further. Can you introduce me to a reporter? I’d like my story to be published.”
- “The gallbladder is a disease-causing organ. I’m a credentialed doctor who’s spent his entire life studying medicine. Are you suggesting that you know more about this than I do, or that my decades of painstaking work were all mistaken? Get out of my office.”
If you honestly can’t be convinced otherwise about something, it’s a sign that you might be cargo-cult thinking. If you’re asked “what would convince you otherwise?” and find yourself floundering, either:
- Figure out what would genuinely convince you otherwise, or
- Overhaul or discard whatever belief is making you flounder; why would you want to act like a cultist?
Personally, I’ve found doing this uncomfortable. Some 80-90% of my thoughts can’t stand up to the scrutiny of asking myself, “what would convince you otherwise?” And when doing something is painful, it’s easy for the mind to stealthily sweep it under the rug.
But in the spirit of this essay, what would convince me otherwise about its main thesis (that you can detect cargo-cult thinking by habitually asking, “what would convince you otherwise?”)?
- One failure mode would be asking this question only in ways that, regardless of the answer, preserved our fundamental beliefs: e.g. if the doctor asked questions like “I believe that this scalpel is better for gallbladder removal than the other one, but what would convince me otherwise?” but never “I’m removing gallbladders because I believe that the gallbladder is a disease-causing organ. What would convince me otherwise?” You need to focus the question not on what you do, but on why you’re doing it.
- Another failure mode that’s easy to slip into is paying the question only lip service: you detail what would convince you otherwise, but when that thing happens, you either ignore it or revise what you said earlier.
- I’m sure there’s a better question to ask or a better overall approach. It’s overwhelmingly unlikely that this first pass at a detection system is the best one. But given how sticky beliefs tend to be, I’d probably only be personally convinced that a better approach exists by a trusted thinker or scientific study.
As a side note, using “what would convince you otherwise?” might be a winning move in internet arguments. You’ll severely fluster most opponents as they realize their arsenal contains only confirming evidence. And then you can hit them with, “nothing can convince you otherwise, so you’re cargo-cult thinking and therefore not believable.”
In a dream world, if usage became so commonplace that the acronym “WWCYO?” popped up everywhere, online debaters would be forced towards broad-mindedness if only to remain competitive. Of course, this is just a fantasy.