A diversity of epistemic communities
An initial typology of how groups function around building and spreading knowledge

In my previous article, I started looking at the social dynamics around knowledge through the concept of second-hand knowledge. This is the knowledge we get from others, rather than what we build ourselves. However, that article only took a small step into those social dynamics, as it still focused on an individual knower. This post takes the next step and looks at groups and their dynamics.
To do this, we want to consider different groups of people and look at how a group might function as an epistemic community with regard to how it builds shared knowledge. As with the previous post, the primary aim here is to explore the relevant dynamics rather than to make any judgements about what is, or isn't, desirable. This means we'll focus on understanding a few different types of epistemic community and see what we can learn from them.
Different epistemic communities
The previous article identified one critical variable in the acquisition of second-hand knowledge: trust in the source. The same concept is obviously relevant to groups of people, as the levels of trust between members are critical to how a group functions as an epistemic community. Where there is widespread high trust (on matters relating to knowledge), a group will form a cohesive epistemic community, as everyone is 'on the same page'. On the other hand, a group with low trust will not be cohesive, but it need not be ineffective as an epistemic community. Groups like this will likely have ongoing, sometimes vicious, debates and arguments, which can be effective epistemic tools.
A second important variable is the way information flows within a group. To simplify, we can focus for now on one key parameter: whether information is simply transmitted to other members of the group, or whether there are strong two-way flows of information (or feedback mechanisms) between members. Both ends of this spectrum can create viable epistemic communities. Note that the decisive factors shaping the type of information flow are often technological (e.g. books vs talking), a matter of capacity (e.g. one person can only sustain two-way information flows with a limited number of others), or a function of social and power structures (e.g. expectations on a teacher are different to those on peers).
These two variables - trust and information flow - create a natural set of axes, leading to the consultant's favourite tool: a two-by-two quadrant matrix. You may have to forgive me for resorting to this tool, though it is perhaps appropriate as I am now earning money as a consultant philosopher. As is standard, both in consulting and in philosophical arguments, we'll push each variable to the extremes and consider the four combinations. See below for a diagram.
1) Extended Mind: High trust within the group and uninhibited two-way flow of information between members
An epistemic community like this will be highly collaborative, on the same page, and highly cohesive. Members will share and receive information regularly, and trust what they receive without much effort. This means they will function much like one mind expanded over a group. It could look like a strong research team whose members trust each other's work as they answer their research questions together. It could also look like a team of enthusiasts feeding off each other and egging each other on to crazier and crazier views.
2) Expertocracy: High trust but a limited two-way flow of information
This sort of epistemic community, where there is high trust but information is largely transmitted without feedback in the other direction, will naturally be one where the transmitters are treated as experts in their domain. It might be highly hierarchical, like a lecture, or multi-dimensional, where different people hold different expertise. Either way, it looks like one, or many, teacher-student relationships in which those receiving information trust the expert. The ability of this sort of community to increase the knowledge of the whole group is only as strong as the expert(s) involved: it can be students learning from a wise lecturer, or members of a cult blindly following their leader.
3) Debate Club: Low trust but high two-way information flow
If a group has low trust in others' credibility about knowledge but lots of interaction and information flowing in both directions, then the epistemic community will invariably be structured around debates or (hopefully only intellectual) fights. This sort of epistemic community can be highly productive in the search for knowledge; it is the default structure of scientific conferences. But it also characterises the toxic point-scoring and MOBA dynamics that are common on social media.
4) Anti-community: Low trust and low two-way information flow
It might seem that a group whose members don't trust each other, and where information is only received as a one-way transmission, cannot function as any sort of coherent epistemic community. It would look like a parent lecturing a teenager who has stopped listening, or an authoritarian leader making proclamations that no-one believes anymore. However, while a group like this is clearly not well set up to increase knowledge, it can still function as an epistemic community in some situations.
A humorous example can be found in the online financial meme sometimes referred to as the Inverse Cramer Effect: the idea that American financial commentator Jim Cramer's recommendations are so bad that a great investment strategy is to do the opposite of whatever he recommends.1 Here, the way to good information is to deliberately mistrust an 'expert'.
While this is more meme than reality, it does point to something interesting. We often build knowledge and information from people whom we distrust and don't believe by considering why they said something and what might be the case that led them to say it. It is the same dynamic we use to make sense of propaganda or spin: we try to read the subtext of what must actually be true if that person is saying that at this time.2
We can put all of these onto a typical quadrant diagram:3
Pay attention to what type of community you are in
This schematic typology provides an interesting framework for analysing different groups and societies. Focusing on the types of epistemic community involved may give insights into how a group functions and how it might be shifted. It should be noted, though, that one group of people can function as different types of epistemic community in different circumstances, so any conclusions will always be context-dependent. One good example is that when time pressures increase, or a crisis occurs, information flows contract and epistemic communities shift rightwards on the axes above. We'll consider some initial observations here but will come back to this in future posts.
The first is that different concepts of authority or leadership create different types of epistemic community. A classic authority-based, or 'command and control', system sits clearly in the 'Expertocracy' model. This is often appropriate for high-stress or dangerous situations, such as the military, where it is frequently deployed. However, it is a stark contrast to a typical democratic approach to decisions, which assumes the group sits more in the Debate Club quadrant, where we need to debate things out to come to better knowledge and a decision. Both forms are legitimate in different circumstances, but we need to be aware that these epistemic communities will operate differently and have different norms.
A second point is that most people would, I expect, assume that the Extended Mind is the most productive type of epistemic community. Having everyone trusting each other and communicating freely sounds like the ideal set-up for doing amazing things. However, these communities can sometimes be dangerous and counter-productive. We have seen an explosion of these types of communities across the world in the last few decades, but often, as they congregate in specific social media groups or forums, they become obsessive 'fandoms', breed various 'stans', or self-radicalise in different ways.
This leads into an important observation about this taxonomy. Each of the categories has examples that provide a positive environment for building knowledge, and examples that are toxic and dangerous. Neither trust nor information flow is sufficient to guarantee any real development of knowledge. There are, in my view, two related factors that characterise the difference between epistemic communities that are genuinely learning new knowledge and those that aren't. One is where the community, as a whole, sits in the list of epistemic attitudes. A group that thinks it already knows things with certainty is more likely to end up as one of the negative examples.
A second factor, drawing on the dynamics by which humans build knowledge, is whether the group is regularly testing what it thinks it knows against reality in some way. Without that ongoing testing, everyone starts losing touch with reality and 'knows' things that aren't actually true.
Finally, this taxonomy provides an interesting framework for looking at social changes over the last few decades. The rapid growth of communication and information transmission online means that epistemic communities (both in fact and as a norm) have shifted towards greater two-way communication. However, it is not clear that our mindsets, habits, methods of organisation or management styles have yet adjusted. This is a particular issue for academia, as I have written about elsewhere.
1. This concept was widespread enough that one company set up an investment fund based explicitly on this strategy.
2. This fits with my suggestion in the last post that we need to track information alongside the content of any statement if we want to understand knowledge dynamics.
3. Feel free to make suggestions on how to improve this, including the names of the quadrants.
Comments
Lots to think about within this matrix format: between dualities.
Nevertheless, standing back a bit, I see 'groups' have actual and potential timelines. When I worked both across and within some civil service structures, we sometimes referred to 'institutional memory', which often provided a continuity of 'behaviour/knowledge', mostly in the form of an accepted or defined 'role' (i.e. 'context') and an 'accepted expertise', along with 'assumptions/attitude'. Membership included internalising and/or contributing to the institutional memory.
Given that groups often have historical timelines, I would value identifying the many strands of philosophical assumptions that are commonplace, often dating back to antiquity, and perhaps identifying their modern glosses; thus, for example, 'utility', 'determinism', or what MacIntyre called 'emotivism', along with logical positivism et al. (I am not a philosopher, please forgive me spraying terms around.)
Similarly, 'information' is not always a useful category because it usually comes heavily contaminated with plausible factoids (this is a 'scientific' age), built into an 'accepted' or 'acceptable' group knowledge base?
Likewise, 'attitude' to other groups' motives is often critical in directing group knowledge acquisition, and we are often enough presented with 'information' in something more like a complex interactive Venn structure than a quadrant?
This is great. Two dimensions represent a useful starting frame. But I wonder if it is too abstract. Is there another dimension needed around 'purpose' (that is, what the information is going to be used for)?