How to harness group dynamics for epistemic good
What can societies and organisations do to ensure they keep expanding what they know?
My writing here has tended to follow a typical philosophical path when trying to understand knowledge - the focus has been on how knowledge works for individuals. However, this can only ever be part of the story, as we humans live as social creatures and most of what we count as knowledge has a social origin - we learn it from someone else. These social dynamics often amplify the best and worst of our human tendencies. As individuals, we have lots of reasons to be reluctant to change our minds, and these reasons are multiplied once group and social dynamics come into play.
To simplify a little, any society, organisation or group of humans needs some uniformity of beliefs, outlooks and/or goals for it to function even as a minimally cohesive group. Uniformity promotes stability, improves a group's ability to function coherently, and often increases the efficiency of effort towards goals. So there are strong incentives for groups to monitor and enforce some set of orthodoxies - which naturally increases the resistance to the discovery of new knowledge and information.
At the same time, ensuring that what we think we know is, in fact, accurate is important for us in the long run to survive and thrive. The less accurate what we believe is, the more likely we will be to get something wrong and have things go badly. This holds whether we are talking about individuals, families, organisations or societies. So we should want our organisations and societies to encourage people to build and discover knowledge.
Both of these goals are important, and there will always be a tension between them. My interest here at Humble Knowledge is in improving our knowledge and therefore ensuring we have strong epistemic practices. The challenge, however, is that social dynamics, combined with the many reasons why people are naturally not inclined to change their minds, mean that the default position for groups is to coalesce around an orthodoxy and not be open to building knowledge. Given this, what can and should societies and organisations do to remain open and committed to improving their knowledge? The analysis I have been building offers some useful places to start. Most of these are not new, but the reasons for them are often not well understood.
Foundational principles
The starting point for ensuring a society or organisation will encourage the successful pursuit of improved and new knowledge is to remain humble about what we currently know. One of my core arguments here has always been that it is important to accept our limitations and recognise we often get things wrong. But remaining humble also provides a powerful incentive to try to know more.
If we are certain, or even highly confident, that our knowledge is correct, then there is no reason to try to improve - and many reasons to enforce the social or organisational orthodoxy. On the other hand, if our epistemic attitudes fall to the other extreme - a deep skepticism where we don't think genuine knowledge is possible - then there is no incentive to look for better knowledge. In response, we will again tend to prioritise group dynamics and coherence, as these still matter even if knowledge doesn't. So epistemic humility - a genuine acknowledgement that what we know might be wrong - provides a real drive to improve our knowledge, and is the only foundational epistemic attitude that does so.
One core reason why we need to be humble is that there is always a gap between what we think (our theories, beliefs and assumptions) and how the world actually works. This means that organisations need to be externally, or reality, focused and ensure that what happens in the real world determines what beliefs and theories we accept. This might sound obvious but is far from easy to do in any group of people.
It is very common for people to let core beliefs, methods or proxies decide what they accept as knowledge. Often we let our core beliefs determine what counts as a fact. Or we end up focused on achieving a metric, and forget to check whether the metric accurately reflects the real world. To pick another common example, the fact that something is published in a top peer-reviewed academic journal is not a reliable indicator that it is true, yet we often take it to be.
Encourage different thinking
The challenge we have is that these foundational principles tend to undermine group coherence and uniformity. To ensure they have strong epistemic practices, organisations and societies need to embed strong norms, habits and culture that embody humility and enable people to be externally focused. Given the inherent instability and tension at play, none of this can be successfully encoded in rules or processes alone.
The most famous relevant principle is to encourage a broad freedom of thought and speech, for the reasons set out in J.S. Mill's classic argument. Allowing, and exposing ourselves to, competing views ensures we don't take our knowledge for granted and that it gets regularly tested. Even if our views are correct, having to face challenges helps ensure we improve our arguments and our understanding of all the issues. We can't be lazy when there are live competing views being taken seriously.
However, formally allowing freedom of speech often has limited effect on the genuine range of views we encounter. Social and organisational dynamics ensure that many views are hard to express, or even formulate. Ensuring we get the advantages that Mill articulated requires some additional practices.
One useful practice is to actively build tests or opposition into your system. We have set up parliaments and legal systems on an adversarial model because it is meant to ensure all the arguments, knowledge and ideas are subject to genuine debate and critique. We take many decisions to committees representing different views and experience for the same reason. If need be, it can be worth setting up 'red teams', 'Devil's Advocates' or deliberate contrarians in your organisation for the same reasons. All of these have costs for group coherence and stability, but are important for ensuring our ideas and knowledge are tested and more accurate. The benefit is that group dynamics will ensure everyone works to a higher epistemic standard, but it only sticks if organisations actively reward people for taking a different view and prosecuting it.
Similarly, it is often important to tolerate or even encourage some eccentrics in your group. The more uniformity you have in your society or organisation, the more likely you will be to get trapped in a false orthodoxy. So making space for and encouraging unorthodox thinking and views will often have benefits for improving our knowledge, even if the eccentrics are always wrong (although they often get some things right).
Care and civility
The principles and norms articulated so far may seem to encourage chaos and risk splitting societies and organisations through various forms of conflict. However, taking a humble attitude to knowledge seriously, and embedding it as a norm, has important, and positive, consequences for group dynamics. If everyone recognises they could be wrong about what they know, then it changes our attitudes to people who disagree with us. If there is even a remote chance they are right, then I should give them a hearing and take their ideas seriously.
Importantly, remaining reality focused gives us a common basis on which to discuss and decide who is more likely to be right - what matters in the end is whether our knowledge matches the real world. Genuine humility therefore encourages respect for others and civility in how you relate to them. Unfortunately, this is often not natural for humans and we cannot design a system on the assumption that everyone will be genuinely humble. Instead, we want to establish norms that embody this attitude, even if individuals don't.
One good approach is to design for, and model, civil debate. Whether it is in a parliament, an academic conference, or around the dinner table, we want to encourage people to debate ideas in a reasonable and civil way so that our knowledge and ideas are tested. When debates are focused on the ideas, and how well they match reality, rather than on attacking people, we have a much greater chance of uncovering truth and improving our knowledge. It might sound old-fashioned to some, but manners and decorum are important for any organisation that wants to function effectively in building better knowledge. What counts as acceptable manners that take the focus off people and onto ideas may differ substantially across organisations, but they matter - even on social media.
A second good practice is to ensure people take time for proper consideration before making decisions or taking action. Epistemic humility reminds us that our ideas, especially our initial ideas, are often flawed and need testing before we act. So taking time helps to improve the quality of our thinking, even if it feels like it is for the sake of appearances or adhering to process. People often feel they don't have the time to reflect as they just need to get on and do things, but taking some time for this always matters.
For example, there is a well-known decision-making approach that came out of the US Air Force called the OODA Loop (Observe, Orient, Decide, Act). This initially captured, in shorthand, the steps pilots should rapidly take in aerial dogfights. The essence of the OODA Loop is to take time to think (OOD) before you act - which is a practical statement of epistemic humility. If there is time to do this in the middle of aerial combat, then there is time to do it anywhere.
Given that the core challenge to good epistemic practices is the enforcement of group orthodoxies, it might seem like we should prioritise ideas that are new and promote change. However, remaining humble about what we know doesn't just mean being humble about the things we have known for a long time, but also about the things we have just discovered. We have probably all had a brilliant new idea that we thought solved something, only to discover later that it wasn't actually very good.
So another important norm that embeds practical humility is to respect tradition and experience, but not be trapped by it. Those who have come before us have normally faced the same questions we have and found a set of knowledge and practices that reflected the world well enough and worked for them. This may include embedded knowledge that is reflected in practices without ever being recorded or clearly articulated.
Being humble about our knowledge means we need to take the knowledge and ideas of the people before us seriously and seek to understand them, rather than just throwing them out in favour of our latest idea or discovery. Intellectual fads and fashions are just as much an expression of false epistemic confidence as not changing anything and sticking to the existing orthodoxy. What matters are the arguments, evidence and details about whether the old or new ideas and knowledge are a better description of reality.
Few of these principles and norms evolve naturally in groups and organisations. They need to be modelled, encouraged and rewarded. I'll be interested in readers' analysis of our current situations against these principles in the comments. Moreover, this isn't an exhaustive list of the important principles and norms that naturally flow from taking epistemic humility seriously. Do let me know if there is anything you think I have missed, or where I have got things wrong.
Comments
Excellent. The principles have the beauty and power of simple common sense.
The OODA example is provocative. The training of pilots presumably focuses on undertaking the thinking steps very quickly in circumstances where a lot is at stake. In broader life, people who do this type of thinking really well are often said to have good instincts. It is almost as if there is some 'magic' involved in the thinking-acting process that others cannot replicate.
To get this result, I wonder if OODA-type models involve two thinking dimensions. One involves 'preparation thinking'. This is the thinking and training needed to build good fast-paced OOD. My understanding is that training is used to develop a set of heuristics along the lines of 'if this, then that'. This creates a programmatic element to the thinking, which is needed to create quick links between OO and D and A. My feeling is that preparation thinking represents an effort to create certainty from uncertainty, rather than to encourage humility.
The other dimension involves 'override thinking', which is done 'in the moment'. This is the thinking needed to make good decisions when the heuristic leans one way, but something doesn't quite add up. This too involves trained reactions, but it has more humility as a base (i.e. an acceptance that things do not always add up).
OODA is an example of individual thinking. The pilot does the thinking, makes the decision and takes the action. The process has strong collective backing (training plus information being fed to the pilot) but it is not really a collective process. While useful, I am not sure that it quite makes your point.