Over the last few decades, there has been an explosion of research into weaknesses in human reasoning and knowledge formation. Wikipedia, for example, now lists over 200 cognitive biases and over 150 common logical fallacies. Despite all this interest, it is hard to discern any improvement in the quality of human reasoning or debate, even in more formal research or academic contexts. One problem is that the approach underlying much of the research - identifying and testing for existing and new biases and fallacies - does little to help us reason better. Being aware of a particular bias is no guarantee that you will avoid it, even if you could keep all the relevant biases in mind.
The humble approach to knowledge that I have been describing offers a natural alternative explanation of where human reasoning often goes wrong. On this account, our errors follow from the structure of human knowledge, not from any inherent flaws, and this points to a simple principle that can help us reason better. It does not invalidate the research on biases and fallacies, but it highlights the underlying dynamics that give rise to many of them.
In my previous post on biases, I noted that work on biases and fallacies often presupposes an idealised model of a hyper-rational thinker which is both unrealistic and, at times, unhelpful in the real world. Nevertheless, when we look at both biases and the logical structure of human knowledge, we can see a strong temptation towards hubris. We overrate the reliability of what we think and confuse our models or theories of the world with reality itself.
Our hubris: confusing theory for reality
As a reminder for many readers, there is significant evidence that we do not have direct knowledge of the world. Instead, we form theories, stories, models and pictures of the world (abstract representations) that we then compare to the world. We accept as knowledge those that stack up as reliable descriptions, predictors or explanations of the world - even though they may later turn out to be wrong. This is both the dynamic underlying the scientific method and the way we make sense of daily life.
This account, however, needs a significant caveat. It is more a description of how human knowledge processes should work than of how they typically work in practice. It is fair to say that we are always using and forming theories and pictures of the world. The problem is that the testing or comparison process doesn't always happen.
Very often, and we are all prone to doing this, we accept that a theory or model is accurate and then treat it as a perfectly reliable description of the world without seriously testing it. In other words, we assume our ideas are reliable and take our theories or pictures as perfectly accurate descriptions of reality. We confuse our theories with the world, or mistake the map for the terrain.
We see this dynamic in the endless debates between opposing sides who won't accept any of the evidence the other side provides. To pick one example, debates between socialists and those committed to completely laissez-faire economics exhibit these dynamics. Each side knows that history (i.e. evidence from the real world) has shown the other's views don't work, but can't accept that the same type of argument might apply to their own side. The problem is that they see the world through the lens of their theory and likely cannot imagine that the world is different.
Along similar lines, you can likely bring to mind personal examples where someone you know won't accept information you are giving them, perhaps about a friend, an investment or a decision they are making, because it doesn't fit their picture of what the world is like. And, of course, we have almost certainly done the same thing ourselves; it is just easier to see in someone else.
A confusion easily explained
So, as humans, if we have to choose between trusting a theory we hold and going with evidence about the world we've received, we'll often stick with the theory. This dynamic underpins confirmation bias, hindsight effects, attribution biases and various other cognitive biases. We react to evidence based on what we already believe is the case, rather than judging it for what it is. And while this might seem like laziness or arrogance, there are good philosophical reasons why it happens to everyone.
The first is that we build our knowledge out of theories, models and pictures, not facts. This means that our theories are so foundational to us that we depend on them to conceptualise and interpret the world. In important ways, our theories shape what and how we see, touch and hear. This is most obviously true in language - how our native language categorises colours, for example, shapes the similarities or differences we see between different objects.
But it also happens in a much wider range of contexts. To pick a relational example, it is often incredibly hard to accept that a friend, work colleague or boss you trust is lying or cheating. To do so requires changing how you understand yourself and your place in the world.
Or we can look to a famous example from the history of medical science. Ignaz Semmelweis famously discovered that disinfecting hands and equipment drastically improved survival rates after childbirth, but his discovery was ridiculed and shunned for decades. It was an empirical discovery that required doctors and medical professionals to fundamentally reconsider how they understood disease and the human body - and Semmelweis didn't have a good theoretical explanation for it. So doctors continued to believe their theories over the evidence.
As our theories are foundational to how we experience and act in the world, accepting evidence that our theory might not be true can be profoundly disorienting. It is therefore natural, even if not justifiable, that we avoid changing our theories if we can.
A second reason is that, to survive and succeed in the world, we have to act based on the theories we currently have. It is unreasonable, and at times dangerous, to wait until we have fully confirmed a theory before acting on it. If I'm out hiking and think I've seen a bear on the path ahead, it is probably not wise to go and make sure it really is a bear before taking action.
We cannot live well if we seriously entertain every possible piece of evidence that goes against what we currently think. We would end up questioning everything, all of the time. Practically, it usually makes sense to act as if we have an accurate theory of the world and the world is as we think it is. And in ordinary, day-to-day situations, this is reliable enough to help us live and succeed. Thus there are practical hurdles that evidence needs to overcome before we accept it as requiring a change in view. It is not surprising that we tend to set these hurdles as high as possible.
A third dynamic is that, as noted elsewhere, there is no neutral position from which we can evaluate theories. Very often, we can only test a theory against the world by buying into its truth and seeing how it works out. We have to think and even act as if a theory is true to find out whether it is actually true. This process, which tends to get derided as acting on faith rather than facts, is central to how we test many scientific theories.
Many consequences of, for example, general relativity or quantum mechanics were taken to be true before any experimental confirmation was possible. Or, to switch to a more prosaic example, to work out whether a particular diet will work, I have to take it on faith and live as though it does.
These considerations show that, at a foundational, philosophical level, there are good reasons why we tend to mistake our theories for reality. We experience the world through our theories, and we often have to take them as completely reliable to make decisions or even to figure out if they are, in the end, true. These reasons are, however, explanations, not justifications.
Social dynamics further the confusion
In addition to these philosophical explanations, there are also various social reasons why we tend to buy into the reliability of our existing theories - and confuse our theories for reality.
For one, we are social animals and adhering to group norms has many social benefits. Most people are therefore reluctant to oppose the theories and pictures that their social group accepts, regardless of what the evidence says. Related to this is the way our theories often provide a common language through which social groups communicate and act in the world. If someone rejects or questions a theory the group holds in common, it increases friction in communication and can undermine group cohesiveness, unless the whole group is changing its mind at once.
Another factor is that, in most group and social situations, interpersonal dynamics favour (at least in the short term) people who are confident and certain. Not admitting, or even seeing, contrary evidence gains you social advantages and can easily lead you to mistake your theories for what is real.
While these dynamics are likely universal to human socialisation, there are some particular issues in our current societies. As argued previously, we live in a society whose cultural norms are based on the assumption that epistemic certainty is possible, and often desirable. The claim to epistemic certainty - that my knowledge is perfectly accurate - is in effect a claim that the world is exactly as my theory says it is. This means that our culture encourages us to conflate our theories with the reality of the world, because it assumes that perfectly reliable knowledge is achievable.
A simple principle
At the start of this post, I promised a simple principle for helping us reason better. However, if you are looking for something mind-blowing or that you’ve never heard before, I’m afraid I will have to disappoint you. Our response to this epistemic weakness needs to be the obvious one:
Given that what you think is not the same as reality, regularly and actively check your knowledge and ideas against the world.
This principle is encoded in common wisdom across many cultures and contexts: check your sources; confirm information you are given; rely on multiple witnesses; engage with people who don’t agree with you; and so on. Yet, being simple doesn’t mean it can’t be effective.
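To make the idea of checking a belief against the world a little more concrete, here is a minimal sketch in Python of one formal version of this kind of updating - Bayes' rule - applied to the hiking example above (is that shape on the path a bear?). The function name, the prior and the likelihood numbers are all invented purely for illustration; nothing in this post commits us to a Bayesian framework.

```python
# Purely illustrative sketch: updating a belief as evidence from the world
# comes in, using Bayes' rule. All numbers are invented for the example.

def update_belief(prior: float, p_evidence_if_true: float,
                  p_evidence_if_false: float) -> float:
    """Return the posterior probability of a hypothesis after one piece of
    evidence, using Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Hypothesis: "there is a bear on the path ahead".
belief = 0.30  # hypothetical prior: bears are seen here, but not often

# Evidence 1: a large dark shape between the trees.
# Likely if it is a bear, but shadows and boulders produce it too.
belief = update_belief(belief, p_evidence_if_true=0.8, p_evidence_if_false=0.3)

# Evidence 2: no movement after watching for a minute.
# More likely if it is *not* a bear, so confidence should fall.
belief = update_belief(belief, p_evidence_if_true=0.2, p_evidence_if_false=0.7)

print(f"Belief that it is a bear: {belief:.2f}")
```

The point is not that we should compute probabilities while out hiking, but that the principle has a definite shape: evidence that fits our theory should raise our confidence, evidence that doesn't should lower it, and neither happens unless we actually look.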
For most of this post, I have been listing reasons why we almost inevitably fall into the epistemic trap of mistaking our theories or models for reality. Our response therefore needs to involve regular vigilance and discipline. We have to keep coming back to check our knowledge if and when something new pops up.
But the philosophical dynamics covered above provide some guidance on this. We need to come back to check our thinking fairly regularly but we cannot, and should not, continually be revising all our beliefs. For one, we often need to act and have to make a call. But, more than this, sometimes we need to do something or live out something to be able to check our beliefs. Knowledge acquisition isn’t a purely intellectual exercise we can do in an armchair or in a lab.
If we accept this principle of epistemic humility, and acknowledge that we often do mistake our theories for reality, we will be much more open to updating our knowledge when new evidence suggests we should. In this way, we can reduce our biases and improve our reasoning.