When we learn things, we generally acquire a superficial level of knowledge about them. For example, we learn that leaves turn color in the fall, and perhaps we understand that that’s because they stop their food-making process. The chlorophyll breaks down and since chlorophyll is what makes leaves look green, the leaves now take on a yellow, orange or red hue.
But how many of us truly understand what’s happening – how the chemical reactions inside the plants actually work, or why certain years have brighter colors than other years? Most of us just know that the leaves get prettier and that they eventually fall off in winter.
Or take the motions of the planets around the sun. We know that gravity is at play, and a lot of us can recite Einstein's famous equation, E = mc² — energy equals mass times the speed of light squared — even though that equation describes mass-energy equivalence and has nothing to do with orbits. We don’t really understand exactly what any of it means.
It isn’t important for us to know the details unless we’re botanists or astrophysicists. It’s enough that we have a passing understanding of the overall picture. We don’t need to understand everything at a deep level. In fact, trying to reach that level of understanding would probably reduce me to a quivering blob of anxiety and uncertainty as I wondered whether I actually knew what I think I know.
However, we often think we know more than we do. We assume our understanding of the world is correct, even when we derive our information from somewhat questionable sources. This isn’t generally a problem when it comes to processes that aren’t central to our daily lives. Not understanding the specifics of planetary motion makes little difference to how we conduct our days.
But it can be a problem when we get into things that have a greater effect on how we move through the world. Choosing to believe outliers when it comes to the matter of vaccines, for example, or lies about election insecurity, harms our society as a whole.
Listening to people who think like we think, who say what we want to hear, and assuming they’re right because that’s what we want to believe – that’s dangerous. We haven’t necessarily absorbed knowledge. It’s possible we have, of course. It’s always possible that these speakers have enlightened us. But it’s more likely that we have instead learned something that’s only partly true or maybe even completely false.
And because we have only superficial knowledge about many things related to how the world works, we don’t have the ability to discern whether what we’ve heard is actually true unless we take the effort to dig into the sources a little deeper.
When 3 scientists are telling you climate change isn’t happening and even if it is, it isn’t a big deal, and 97 scientists are telling you it’s real and it’s a big deal, who do you believe? The 3 or the 97?
When 3 doctors tell you Covid isn’t a problem and 97 doctors tell you it is, who do you believe? The 3 or the 97?
Unless those 3 are somehow magically unique, we ought to believe the 97 – especially if the 97 are specialists in the field at issue. But instead, many of us head for social media and Google, convinced we can get all the deep learning we need to reach the decision we want to reach, and then we stick to that decision regardless of any inconvenient “facts” to the contrary.
So, what’s the answer? We need to think like scientists, continually testing what we believe against thoughtful, reasoned opinions to the contrary. We need to examine what the likelihood is that a given position is true based on the quality and amount of the scientific opinions behind it. If we can do that, we can get to a level of knowledge that’s a tiny bit greater than superficial.
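The 3-versus-97 intuition above can actually be made quantitative. Here is a toy Bayesian sketch of it in Python. Everything in it is an illustrative assumption, not a claim from the essay: it pretends each expert judges the question independently and is right 80% of the time, and starts from an even prior. Real expert opinions are far from independent, so treat this as a sketch of the reasoning, not a model of the world.

```python
from math import exp, log

def posterior_true(n_agree, n_disagree, expert_accuracy=0.8, prior=0.5):
    """Toy Bayesian update: how likely is a claim to be true, given
    n_agree experts who endorse it and n_disagree who reject it?

    Assumes (unrealistically) that experts judge independently and are
    each correct with probability `expert_accuracy`. All numbers are
    hypothetical, chosen only to illustrate the update.
    """
    # Work in log space so 97 multiplications don't underflow.
    log_true = (log(prior)
                + n_agree * log(expert_accuracy)
                + n_disagree * log(1 - expert_accuracy))
    log_false = (log(1 - prior)
                 + n_agree * log(1 - expert_accuracy)
                 + n_disagree * log(expert_accuracy))
    # Normalize the two hypotheses back into a probability.
    m = max(log_true, log_false)
    return exp(log_true - m) / (exp(log_true - m) + exp(log_false - m))

print(posterior_true(97, 3))  # overwhelmingly close to 1
print(posterior_true(50, 50))  # an even split leaves us at the prior, 0.5
```

The point of the sketch is not the exact numbers but the shape of the result: under even modest assumptions about expert reliability, a 97-to-3 split is not a close call, and "listening to the 3" requires believing those 3 are something other than ordinarily fallible.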