This Might Be the Best Way to Stop Bullshitting

Have you ever been baffled by how someone can believe a conspiracy theory that is clearly not true? Well, turn that around for a second and think about yourself: do you really understand everything you have strong opinions or beliefs about? Most likely you don't. This is the "illusion of explanatory depth": we think we have a deep understanding of a subject, but if asked to explain it, we hit a wall almost immediately.

What is explanatory depth?

As the article that introduced the concept puts it, "People feel they understand complex phenomena with much more precision, consistency, and depth than they actually do."

This illusion applies, for example, to your views on political or social causes. You may be happy to volunteer or donate, but that is often based more on feelings and opinions than on a full understanding of the issue. In one study, people were asked to explain policies tied to their strong political beliefs. After trying to explain them, most people were less interested in donating to the cause.

It can also refer to technical knowledge: maybe you think you understand how cars work, but in reality you just know how to perform some maintenance procedures and what the main parts are called.

To see this quickly, draw a bike, making sure it has all the main parts that hold it together and make it move. You know what a bike looks like, right? Easy. Now that you're done, compare your drawing with a photo of a real bike. Most likely, you got it wrong.

Another example: think of an exam you passed at school without any real grasp of the underlying concepts. You can say something about the subject, maybe even something correct, but that doesn't mean you understand it. Okay, mitochondria are the "powerhouses of the cell," but why do they exist? How big are they? Do all living things have them? If you wanted to find a mitochondrion, where in your body would you look?

How to break the illusion

If you've heard of the Dunning-Kruger effect, you can see where I'm going: when we know little, it's hard to realize how little we know. As noted in one of the first articles describing this effect, even explaining something to yourself can leave the explanation vague: you simply repeat what you've heard about the subject, and since it convinced you before, you assume you're done.

What helps more is to discuss your explanation with a friend, or at least to imagine the questions someone else would ask. What parts are missing from your explanation? How did the situation you describe come about? How do we know that what you describe is true?

How to use it in conversations

While much of the literature on explanatory depth points out how flawed our own reasoning can be, we often find ourselves talking to someone we're sure doesn't understand what they're talking about.

It's pretty well understood at this point that you can't just explain your point of view to someone and expect them to fully accept it. As the saying goes, you can't convince someone of something they haven't convinced themselves of.

But you can always ask them to explain their theory to you. If you do, don't argue with their every word; that's not the point here. Encourage them to explore their own knowledge of the topic. Who puts 5G chips in vaccine vials? How do we know that everyone with green eyes is a secret lizard? Often a person remembers pieces of an explanation, but when asked to connect its layers (what benefit would this action bring to the person supposedly carrying it out?), they can recognize the gap themselves.

This is not a reliable debate tactic, especially if you approach it as a way to win an argument. Your partner might simply change the subject whenever they feel stumped, or google until they find some bullshit that seems to fill the gap. Instead, use it with someone you're on friendly terms with. Breaking the illusion means admitting ignorance, and that is hard to do under pressure.
