There is a reliable stream of online commentary about human irrationality and why people refuse to change their minds despite what, to the author, seems like clear evidence. The culprit is variously identified as cognitive biases, social status, conspiracies or even a 'mind-virus'. While these analyses often point to real phenomena, they are almost always too simplistic. There is not some group of more intelligent people who rationally change their minds while everyone else is incapable of seeing the truth. There is something deeper going on.
Put simply, this type of analysis, in line with broader cultural assumptions, is based more on an idealised view of how we hope human belief and knowledge work than on a close understanding of how they work in practice. Once we better understand human knowledge, our behaviour becomes more understandable and less irrational than it first appears. This essay will draw on my two most recent posts, so you may want to look at them first. I will link to them below where appropriate.
An Idealised View
In our current culture, we share a dominant picture of the way humans build, refine and expand knowledge, largely modelled on an idealised scientist, academic or researcher. In this picture, knowledge is a collection of facts and increasingly better theories about the world. It is therefore something separate from us as humans, something we discover ‘out there’ somewhere. Accordingly, the best way to find knowledge is to be a disinterested observer who puts emotion and personal interests aside and rationally follows the facts and evidence wherever they lead.
In this way of thinking, we typically picture knowledge as something written, likely in a book or an article. And the ultimate aim of our attempts to build knowledge is to uncover the truth of the matter.
If we take this as the definitive picture of human knowledge - an ideal that describes what we do when we are doing it well - then the various critiques mentioned above are entirely valid. But this is nothing new: humans have always fallen short of the ideal. Biographies of famous scientists, who should do this well if anyone does, are full of feuds, vanity, political power plays, motivated reasoning and other flawed behaviour.
However, this picture is flawed as a practical description of human knowledge. For each of us individual humans, knowledge is not an abstract set of information separate from us but the stock of reliable information that we act on without a second thought. The primary goal of knowledge is to enable us limited creatures to live (well) in a complex world, which is why we need the information to be reliable.
Truth still matters on this view, as we obviously want the information we take as reliable to be true. But it is a secondary consideration compared to living well and survival. If we have to choose between accepting something that seems reliable enough to act on and deeply investigating whether it is really true, in most situations we go with the former.
Know how versus know that
One sign that the 'disinterested pursuer of truth' picture is not fundamental to the way humans pursue knowledge can be seen in a common philosophical distinction between different types of knowledge. It has long been noted that there is a significant difference between knowing how to do something and knowing that something is the case.1 Knowing how to ride a bike is very different from knowing that turning the pedals makes the bike go or that the angular momentum of the wheels keeps me upright.
To simplify things a little, know how tends to be practically focused and shows we have mastered some kind of skill. We can sometimes translate that knowledge into other kinds of knowledge, e.g. into sentences that can be written in a book. I know how to bake a cake, and I can explain that in a series of know that sentences for someone else. But this is not always possible. I know how to ride a bike, but I can't explain to someone exactly what they need to do. "Sit on the seat, pedal and use your balance to stay upright" is true but insufficient. You cannot know how to ride a bike without ever trying it yourself. It is a real-world skill that cannot be reduced to language.
The ‘disinterested pursuer of truth’ picture takes knowledge that as the primary form of knowledge. By contrast, if we take the view that our knowledge exists to ensure we can live (well), then know how is at least as important, if not more so. Know how includes the instinctive skills we rely on without thinking every day, so it makes sense to count it as part of our stock of knowledge. In these cases, the reliable information we rely on is embodied, instinctive (once learnt) and sometimes inexpressible in language.
Harder to change
The practicality of knowledge for life, as exemplified by know how, explains a key reason why humans are instinctively reluctant to change their minds. Unlike in the 'disinterested pursuer of truth' picture, our practical knowledge isn't like a book we carry around in our heads. So changing our minds, accepting that something we thought we knew is wrong, isn't just a matter of editing a few lines in a book. Because knowledge is typically closely connected to what we do and how we live, changing our knowledge requires us to change what we do and how we live. Change carries considerable personal costs.
A know how example illustrates this dynamic well. Suppose someone is trying to convince me that I don't really know how to ride a bike and have been doing it wrong. Firstly, I would want substantial evidence that they are right before accepting they have a point, for the costs of changing how I ride a bike are significant. Not only will it require retraining myself, but suddenly I will have to think about how I ride while I ride, and my confidence will likely drop. Even if there is a better way of doing it, I have ridden safely for years and years, so is there any real benefit in changing?
Changing my mind, and accepting in this case that I don’t really know how to ride a bike, carries a big cost; if there is little practical benefit, it will take a lot to convince me to do it.
There is a caveat to add. If I am someone who hops on a bike every now and then for a bit of exercise, it may not be that hard to convince me that what I'm doing is wrong. If, on the other hand, I consider myself a cyclist - whether because I commute by bike, train regularly or race - then the costs of changing are more significant. Some kinds of knowledge are more foundational to how we see the world and ourselves in it, and these vary significantly from person to person.
While this whole example might feel a little silly, the dynamic I described should feel familiar. When faced with evidence that what we think of as knowledge might be wrong, our attitudes tend to play out in a similar way. We resist change if we can find a way out, and we often implicitly weigh up the costs. While this sounds like mere stubbornness or laziness, it makes sense once we remember our limitations.
As noted previously, we humans have limited time and energy to spend on our lives' many competing demands. We accept some things as reliable information, or knowledge, as a necessary energy-saving measure, because we often need to act without seriously thinking about it. If I have to take something out of that bucket of reliable information, it will need to be replaced, and I've lost the energy-saving benefit of being able to trust that information.
There are real trade-offs for my life in changing my mind, and they are often practically focused. This means there are circumstances where it can be highly rational not to change your mind when faced with contrary evidence: the energy and time demands of doing so outweigh any practical benefits.
This doesn’t justify being stubborn. Nor does it suggest you need never change your mind when presented with good evidence. The point is to show that change is not a straightforward process that depends only on the comparative evidence between different positions. The underlying dynamic is nicely illustrated by the way we are less willing to change our minds about certain types of knowledge.
To pick an example, I can be quite easily convinced that what I know about the Aztec Empire is wrong. I have nothing invested in the topic, and changing my mind isn't going to change anything practical in my life. Convincing me, however, that what I know about nutrition is wrong is likely to be much harder. For one, I have decades of experience eating different foods. But also, if I change my mind, it will likely have wide practical personal consequences. I may have to change my eating, cooking and shopping habits and relearn a whole list of things. The costs of changing my mind are higher, and so I will likely be more reluctant to do it.
We are practical creatures
To sum up, knowledge for humans is primarily practical, focused on providing the reliable set of information we need for living. We are wired to make decisions about knowledge pragmatically: if the costs of changing outweigh the benefits, calculated in terms of energy and practical impacts on our lives, then we are unlikely to change our minds.2
So next time you are wondering why someone won’t change their mind when they should, try considering the wider effects the change would have on them as a person in their context. The perceived costs, whether in time, energy, social standing, money, relationships or something else, likely outweigh, in their mind, the benefits. This calculation is unlikely to be conscious or considered, but that doesn’t mean it isn’t sitting in the background.
This may, in turn, give you some ideas about how to change their mind. Focusing on these practical aspects of knowledge can shift how we argue and interact with others, but that will have to be a topic for a later post. It is also important to think about how this all applies to each of us personally. We are each subject to these dynamics, so I may not be holding on to my own beliefs for the reasons I thought I was.
1. For completeness, I should note there is a third category of knowledge. To know someone is something different again. If I know Sam, it is something more than knowing lots of things about Sam. It is knowing what they are like, how they will react and what they are likely to do.
2. Of note, social acceptance has a high value for survival and quality of life, so adopting beliefs as knowledge based on your social group can make sense as a pragmatic decision.