Structures also make it hard to change our mind
Understanding how knowledge is structured reveals more of the costs that come with updating our views
In my previous post, I argued that, if we take a practical, internal perspective on human knowledge, we can better understand why humans resist changing their minds about what they know. Practically, for each of us, our knowledge is the reliable set of information that we can act on without having to worry or think. Changing this set of information can have significant practical consequences and cost us time and energy. It is therefore reasonable, although often not justifiable, for us to resist changing our minds - so long as our knowledge is good enough for daily life.
These practical considerations, however, aren't the only reasons why we resist changing our minds. The structure of our knowledge, and the patterns of justification involved, have similar effects. One core reason is that our knowledge isn't constructed out of discrete facts or pieces of information. Instead, it is built out of collections of theories and abstractions that are necessarily interconnected in various ways. Changing our mind about one thing therefore often has flow-on consequences across other areas of our knowledge, which naturally raises the stakes, and the justification required, for change.
A couple of analogies might help explain this. If our knowledge is primarily a discrete collection of facts, then its structure is rather like an encyclopedia or a dictionary. Updating our knowledge about something simply requires us to update an individual entry or two, and has minimal broader consequences. If our knowledge is, instead, an interconnected collection of theories and abstractions, its structure is more like a large spider web. We cannot easily make a change anywhere without it having consequences for other parts of the web - and so we are naturally more cautious about doing so.
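For readers who like their analogies concrete, here is a minimal sketch of the difference (the belief names and the dependency structure below are invented purely for illustration, not a real model): updating a dictionary-style collection changes one entry and nothing else, while revising one node in a web of mutually dependent beliefs forces us to revisit everything that relies on it.

```python
# Dictionary-like knowledge: revising one entry touches nothing else.
facts = {"hand_washing_matters": False, "miasma_causes_disease": True}
facts["hand_washing_matters"] = True  # one entry changes; nothing else moves

# Web-like knowledge: each belief lists the beliefs that depend on it,
# so revising one forces us to revisit everything downstream.
depends_on_me = {
    "miasma_theory": ["hospital_design", "surgical_routine", "medical_training"],
    "surgical_routine": ["daily_practice"],
    "medical_training": ["professional_identity"],
    "hospital_design": [],
    "daily_practice": [],
    "professional_identity": [],
}

def beliefs_to_revisit(changed_belief):
    """Walk the web of dependents to see how far a single revision ripples."""
    to_check, affected = [changed_belief], set()
    while to_check:
        belief = to_check.pop()
        for dependent in depends_on_me.get(belief, []):
            if dependent not in affected:
                affected.add(dependent)
                to_check.append(dependent)
    return affected

print(beliefs_to_revisit("miasma_theory"))
# -> five downstream beliefs to revisit; the dictionary update above touched only one
```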
The interconnections between different areas of knowledge take a range of forms, each of which provides reasons why we might not change our minds. We will look at four different types of interconnection.
Mutually dependent theories
One obvious way that different parts of our knowledge are interconnected is that many fields of knowledge rely on other fields. Many sciences rely on mathematics. Modern political analysis relies on statistics (via polling). Counselling and psychology often rely on nutrition (food affects mood). We could expand the list of examples indefinitely. Because the theories we count as knowledge rely on other theories, changes to one field can ripple across many others. This increases the costs of change, especially if we are uncertain about the consequences, and feeds our reluctance.
Let's consider a historical example that I have looked at previously. In the 1840s, Ignaz Semmelweis discovered, and demonstrated, that hygiene measures like washing hands and sterilising equipment significantly reduced death and illness during childbirth and operations. However, it took decades for this discovery to be taken seriously and adopted.
This is often told as a story about the irrationality of experts and our stubbornness in the face of new information. However, it is worth looking more closely at the dynamics at play, not to justify the decisions, but to understand where and how we might be subject to the same pressures. If we can understand why people resisted new knowledge in the past, we will hopefully be more open to it ourselves.
To simplify, the dominant theories of disease in the 1800s held either that disease originated from ‘humours’ in the body or that it came from ‘miasmas’, roughly ‘bad air’. These theories had thousands of years of theory and practice behind them, and their insights had been incorporated into everything from medical practice to architecture and city planning. The discovery that physical hygiene, like hand washing, affected disease did not make sense within either of these paradigms. Accepting Semmelweis' results would have required upending a vast, connected set of theories of health and life. Such a consequential change in knowledge would have been literally unimaginable for most medical practitioners - it simply would not have made sense. For those who could imagine it, such an extraordinary change required extraordinary justification.
Because so many of the interconnected theories governing how medical practitioners thought about health and disease would have had to change if Semmelweis was right, it took decades for his results to be taken seriously. While it is easy from our vantage point in the future to criticise the medical establishment for thousands upon thousands of preventable deaths, we should instead be wondering which of our own theories are so deeply ingrained and interconnected that we wouldn't be able to give them up if there was evidence against them.
Interconnections of justification
While the connections between theories or fields of knowledge that explicitly rely on each other are often fairly clear, there are also areas of our knowledge that are connected only via common sources of justification. The most obvious way this happens is when we accept a range of different knowledge from a single authority - it might be a parent, a teacher, an expert, a book, or an authoritative web resource.
All of us have had the experience of discovering that an authority figure we trusted didn't know nearly as much as we thought they did, and that we couldn't rely on everything we learnt from them. This can be highly unsettling, especially if we had relied on the same source for many different areas of knowledge. Suddenly our justified loss of trust has flow-on effects across a series of otherwise unrelated areas of our knowledge - we have to question and double-check all sorts of information.
The same dynamic likely played out in nineteenth-century medicine. If Semmelweis was correct, then all the medical schools and medical textbooks were wrong on one important point. Taken seriously, that would have opened a Pandora's box of questions: how much of what had been taught by these authorities was actually correct? Or was it all similarly flawed? For individuals, and communities, who had built lives and careers around the knowledge and advice provided by these authorities, questioning them would often have been unthinkable. Suddenly, it could have felt like they couldn't trust anything that they knew.
Again, instead of instinctively judging people, it is worth thinking about how hard it would be to accept that something we had been authoritatively taught since school, and on which we have founded our careers, turned out to be wrong. We'd all like to think we would follow the evidence, but if we are honest, we are more likely to act like the doctors of the nineteenth century.
Embodied practice
Another common set of connections between different fields of knowledge lies in our daily practices. If you stop and reflect on your regular routines, you will notice that you have incorporated a range of insights from fields like nutrition, medicine, psychology, religion, physiology, economics, and chemistry (e.g. cooking). All of these have been tied together, along with a range of things you have worked out for yourself, into a fairly cohesive, often mutually reinforcing, set of practices. If you were to discover that one of these insights was wrong, it could have ripple consequences across all of your routines and practices - but changing routines is hard. What else that I'm doing might be wrong? Was I doing the right thing for the wrong reason?
The embodied, routine aspects of medical practice were directly confronted by Semmelweis' discovery. If he was right, doctors would have had to dramatically change what they wore, their routines, and how they interacted with people, and would have had to relearn a lot of practical things. This is not only hard to do individually but would have the same sort of ripple effects we have identified elsewhere. It's much easier to change your mind if there are no practical consequences. If it requires relearning and reconfiguring all your daily practices, that turns out to be a big barrier to change.
Social dynamics
One more set of interconnections between different areas of knowledge arises from the social nature of humans, and therefore of what we count as knowledge. As humans, we belong to various social groups and consider ourselves to have various identities. Each of these comes with requirements for what we know - sometimes for good reason, other times less so.
For example, if you are part of the social group that supports a particular football team, you cannot really belong to that group unless you have sufficient knowledge of both football and the team. Likewise, if someone belongs to the group of people who fly airplanes, we rightly expect there to be a body of knowledge common to that group. Often the range of knowledge common to a social group is diverse and even eclectic.
Pilots will all share a range of common information about physics, bureaucratic procedures, human physiology under pressure, and how the natural world looks, that differs from what other people know. Social and identity groups typically use knowledge tests as a factor in deciding whether to include or exclude people from the group. This is often important and can play a critical role in standards and safety. If a group of pilots discovers someone who doesn't seem to know much about flying a plane, it is better for everyone that they don't simply treat them as a pilot.
These dynamics, however, also explain why individuals and groups often find it hard to change their minds despite good evidence. Doctors in the nineteenth century had a body of knowledge that determined whether someone was a responsible medical professional. Much of this was about medical practice, but it also included expectations about how a doctor should behave, charge for their work, and occupy their broader place in society. Casting people who didn't adhere to this as irresponsible, or as quacks, was, implicitly, important for quality control and for keeping the standards of medicine high. As understandable as all of this is, it had many negative effects when the body of knowledge being relied on turned out to be wrong. The social dynamics at play, which often have a positive reason for existing, made change and improvement impossible for many decades.
It is hard to change our minds
There are many factors at play in our knowledge that make it hard to change our minds. Some of these have important, positive impacts across much of our lives, but in certain circumstances they get in the way. This is not to justify refusing to change our minds, but to help us understand the dynamics involved and become more self-aware about why we might be going wrong. Understanding our human tendencies, and being humble about our ability to avoid them, is always a useful start.
Importantly, our reasons for not accepting some new piece of information are likely due to a range of factors beyond that specific information. We rightly connect many of our different theories and fields of knowledge: the real world is messy and complicated, and our knowledge has to reflect that. However, this means that changing our mind in one area requires changes to theories and knowledge across a range of other fields. Changing our mind also sometimes requires us to question authorities and types of reasoning we have trusted in the past, often across unrelated fields of knowledge. It is not just about the information in front of us now.
More broadly, our knowledge gets embodied and encoded into individual and social routines and practices. We rely on, and often reinforce, existing knowledge to ensure practical success, including quality and safety considerations. Where our knowledge is reliable, these are efficient, practical ways of helping ensure success. But they often prevent us from changing or updating our knowledge when we have something wrong - a dynamic that has long been studied in the sociology and philosophy of science. It underlies Thomas Kuhn’s theories of paradigm change and Max Planck’s observation that “Science progresses one funeral at a time” - old ideas often only die when their proponents do.
However, we are not doomed to always repeat these same dynamics. Understanding where they come from is a good start for avoiding the traps. I’ll write more on better epistemic practices in future posts.
Nice to know that Douglas Adams was right about the fundamental connectedness of all things. But you are now making me question whether he was right about being able to power a space ship with the energy created from settling a restaurant bill. Unhelpful.
Does this come back partly to the strengths and weaknesses of science, and the competing views of science as a voyage of discovery versus science as testing hypotheses?
Your early point about how much knowledge matters in day-to-day life is really profound. Could it be one of the reasons why some societal-level problems simply drift unresolved? For any individual, the consequences do not appear serious enough to justify compromising their beliefs. It doesn't matter whether they are actually serious enough, just that they do not appear serious enough in their day-to-day living.
I also wonder whether some baser instincts are at play. Our place in society, and our power within it, often relies on us sitting comfortably within the belief spectrum that currently exists. So-called disrupters are not disrupting knowledge but are disrupting the status quo within the existing range of beliefs.
We will never know, but I wonder what might have happened if Semmelweis had described his discovery as an application of miasma theory rather than a rejection of it - the miasma affecting the hands through the agency of bacteria, and so on. I guess the question is whether there are more and less effective paths for changing knowledge, given our general rejection of the 'eureka' moment of understanding so beloved of scientists.