Never forget that cognitive bias is real. That applies to you and everyone else. Notice when someone is oversimplifying things or taking mental shortcuts, so you can make sure you're getting accurate information.
In my last essay, I talked about the importance of thinking like a scientist to make sense of our world, drawing on Tim Urban’s amazing Wait But Why. I left you on a bit of a cliffhanger, so let’s get into it.
To think like a scientist, we want to take in as much information as possible from different perspectives. I’ve already talked about diversifying your media diet to gather various opinions. Another great tactic to get new insights? Look for a counterargument. Find someone willing to say the opposite of what you're saying.
Daniel Schmachtenberger talks about the idea of signal versus noise. Noise is all the fluff, while signal is real information. His “Rule Omega” assumes there's always some signal, reason, or truth in what people say. Finding the signal in the noise is really important, and it's useful to get curious and ask questions to gather more information.
Once we've taken in a bunch of information, we need to figure out what we’ll allow through our belief gate. Now, we want a strict bouncer at this gate to ensure only the best information comes in. Here are some ways your bouncer might check if it’s a worthy belief.
Get references
Tell other people about your idea and say, “Hey, I heard this; does this make sense to you?” Find someone who has some lived experience with it and get feedback.
As we're asking other people for their opinions, there are a couple of things to remember about who we should trust and why. We know there's way too much information out there. So we have to decide whether we’re going to listen to this expert or that expert. Who are the best people to trust?
Dunning-Kruger effect
We need to watch out for the Dunning-Kruger effect. This is where people who know very little about something tend to have a lot of confidence in their knowledge. People who are experts in something tend to know that they don't know everything.
If someone is really confident about what they're saying and is sure it's the truth, you might be more skeptical about that person. If someone is speaking with a lot of caveats and nuance, you might assume they actually know what they're talking about.
Look out for groupthink
Another thing to consider is the concept of cultural magnets. I’ve spoken about it before here. We tend to think like the people around us, so we often believe what they do. If you ask people around you, and they all confirm your beliefs, you might be stuck in groupthink.
Sometimes the group's desire for harmony and conformity is so strong that people can’t resist agreeing with each other. Groupthink happens at the cost of critical thinking and rationality. So it’s a great idea to ask for opinions outside your group.
If you’re trying to take in new information and change the way you understand something, you might be fighting against the stream. Curiously trying to find the truth can pit people against you. Groups often eschew dissenting voices, so that’s a risk we take every time we speak against the group’s way of thinking.
It's worth noticing how a person came to think or believe what they do. Are they thinking that way because that's what everyone around them thinks? Or have they actually thought about it, considered good data, and followed a scientific process? Maybe cognitive biases are at play, and they’re not thinking clearly.
How can we check our biases?
I have a rule of thumb when I’m acquiring new information. If I learn something new, and it feels great and super clear to me, that's a cue for me to question why it feels good.
Why do I like this?
Is it too simple?
Is it confirming what I want to hear?
Taking in new information means being open to a little challenge sometimes.
Disillusionment is hard. We can feel disappointed when we discover that the way we see things isn’t true. I like the idea of disillusionment as the undoing of an illusion. Uncovering the truth is something we should celebrate.
Living according to your beliefs
Another thing you need to consider when testing your ideas is whether you’re getting good results. It's really worthwhile for you to update your knowledge when you get new data. But what if you're running into resistance and getting into trouble when you act based on your beliefs? You should probably question whether that's the most accurate way of thinking.
As we consciously manage our wisdom, it becomes a lot easier to take in new information because we can track where our knowledge came from. What we're really aiming for is something Tim Urban calls the humility sweet spot.
The humility sweet spot
As you learn more, you can have more conviction. If you don't know that much, you should have less conviction. If we have more confidence than knowledge, this is where we get arrogant and lose humility. This plays into the Dunning-Kruger effect.
One way to think about humility is being honest about what you do and don't know. What does that look like?
“Hey, this is a hypothesis that I'm working with.”
“I haven't actually done the research on this.”
“As far as I know, this seems to be true.”
“Based on the research I did last week, this is what I believe.”
“I’m not sure where I heard this, so it may not be totally accurate.”
Another way to think about your beliefs is to realize that while you may be 80% sure about something, you don't quite understand the other 20% yet.
And if you’re still wondering why we need to think like this, take another look at the thinking ladder from Tim Urban and see the difference between primitive and higher minds. While we need elements of both, they need to be in balance with each other.
Are we searching for truth or the comfort of being certain about the world?
The scientist is in that sweet spot, and there’s one question that matters to them:
Is it true?
They'll take on any information to figure out if their theory is right. They fight the urge to accept information that feels good to them, and dissent doesn’t trouble them. They seek it.
Do you accept new information easily because it feels good and confirms what you believe? If so, you're probably being skewed by wanting the world to be how you think it should be, rather than trying to find out what's actually going on. That’s only human, but overcoming our biases is vital for true sensemaking, understanding, and connection.
What’s coming up?
The next essay will discuss some caveats about science and the scientific process. The truth is, scientists are human, just like us. So they're also subject to bias, and guess what?
Science isn't right; it's just a process.
Ready to learn more? Don’t miss next week’s essay.
Know someone who’d love to think like a scientist? Why not share this post with them and spread the word?
Prefer to watch your content? Check out my video on the topic of this essay:
Have you subscribed to the Omni-Win Project Podcast yet? Find it on your preferred platform and join the co-creation journey. Don’t miss your chance to make a difference.
Check out the Omni-Win Project website and follow me on other platforms.
Want long-form content, authentic discussions, and weekly videos?
Subscribe to my YouTube channel
You can schedule a call with me here: https://calendly.com/duncanautrey