Misinformed about misinformation?
Understanding the factors driving a popular concept
One of the intellectual growth industries of the last five to ten years has been the concept of 'misinformation'. Not only has the term exploded in popularity, but a growing number of research projects are devoted to understanding and combating it - often, but not only, through fact-checking units. Misinformation (and related concepts like disinformation and malinformation) has been relied on to explain a wide range of social and political phenomena and events - from Brexit and the election of Donald Trump as US President, to the spread of the covid pandemic and the reluctance of many to get vaccinated (whether with traditional vaccines or against covid).
Whether or not you find these explanations convincing (and many don’t), it is helpful to understand the conceptual and cultural factors that account for the appeal of the concept of misinformation. Why is it such a big issue now, when lies, rumours, false accounts and conspiracy theories - the concerns tackled as misinformation - have been around for as long as humans have communicated in social groups? As Jonathan Swift wrote (and similar lines are attributed to many others), “Falsehood flies, and the Truth comes limping after it.”
Why misinformation and why now?
In one, almost trivial, sense the answer is simple: the internet. The internet has fundamentally changed the way information is produced, accessed and digested - creating new and unexpected information and societal dynamics. Most simply, pretty much anyone can record and share information that is then available to everyone - with no clear indicators of accuracy or reliability.
Previously, information was transmitted either through existing social networks (friends, family, acquaintances, etc) or through more formal organisations like the media, schools or universities, churches, businesses, governments and the like. We could evaluate the information we received based on previously established levels of trust in more or less known entities. Now, with the radical democratisation of the means of knowledge production and access, the information we receive is often unmediated by traditional markers of trustworthiness, and it is harder to know what to trust. Many have been deceived and suffered greatly as a result.
However, people being misinformed on a wide scale is nothing new. The history of political campaigns, government propaganda, con artists and even newspapers (especially in the 19th century) is full of examples of misinformation and deliberate disinformation. Why is it, then, that misinformation is today considered a threat to our democratic systems, way of life or civilisation?
One clue is in the etymology of the word: I can only conclude that I was misinformed about something if I now confidently know what the correct information was. More broadly, it only makes sense to worry about misinformation, if we are confident that we know the correct information. To use the concept outlined in a previous post, misinformation can only be considered an important social issue when we have significant epistemic confidence: we are confident that we can and do know what is true on a wide range of issues.
Losing our epistemic humility
Culturally, there have been a range of important trends that have boosted our epistemic confidence, both practically and theoretically. While the history of human knowledge is very often a parade of wrong ideas, at least four important factors have converged over the past decade or two to convince many of us - at a societal level - that we no longer need epistemic humility.
The first is a practical psychological impact of the internet, and in particular the radical democratisation of access to knowledge it has created. Those of us old enough to remember a world without ubiquitous internet access and smartphones will remember that debates about the facts of a particular matter were common amongst friends or family. There was no immediate recourse to Google or Wikipedia to answer a question, so we had to rely on memory or find a reference book. Occasionally someone had the right book around, but definitively resolving a question of fact would often involve significant effort, say a trip to a library. This was rarely worth it, and so we lived without knowing or agreeing on the answer. At the individual and immediate social levels, we lived with an unavoidable epistemic humility, as knowledge was often hard to establish.
Today, any questions of fact are usually answered on the spot by someone with a smartphone. This creates a huge psychological shift: questions of fact seem easy to establish and accurate information is easy to find. Whether the information we are finding is true - completely or even partially - may not be obvious but that doesn't change the psychological - and therefore sociological - experience. We have flipped from epistemic humility to a practical epistemic confidence.
A second, more social, factor is the widely discussed phenomenon of 'filter bubbles'. While they have always existed to some extent, modern digital technology exacerbates the dynamics, so that it is now possible - perhaps even common - to never encounter any points of view that differ from our own unless we actively look for them. In a world where everyone we interact with and the news we read agree with our own views, confirmation bias kicks in and the confidence that we are right and our views are true necessarily skyrockets. The epistemic humility we used to learn through forced interactions with intelligent and informed people who disagreed with us has disappeared. Unsurprisingly, filter bubbles and similar dynamics boost our epistemic confidence, even if that confidence is unjustified.
Importantly, there are also strong intellectual trends that justify an increased epistemic confidence. Over the last century, the major philosophical traditions within Western culture have converged remarkably on epistemic confidence. I will necessarily over-simplify the story here, grouping it into two traditions and covering each very quickly.
One dominant philosophical tradition through the twentieth and twenty-first centuries can be broadly defined as confidence in facts and in the reliability of science and scientific methods to definitively provide those facts. Academically, this tradition runs through logical positivism and various branches of analytic philosophy, while in broader culture it can be seen in a common disdain for anything considered unscientific, or in calls to 'trust the science'. Whether or not you agree with it, this intellectual tradition has provided a robust cultural epistemic confidence: we can know what is true, and we need only consult science to do so.
The other major intellectual tradition within Western culture has been what is known in philosophy as the Continental tradition. Broadly speaking, this line traces through a range of movements, from phenomenology and existentialism through post-structural and post-modern thought to critical theory and deconstructionism of various forms. Concepts such as privilege and structural racism belong to this tradition. Despite an aversion to grand theories or meta-narratives within post-modern thought, this intellectual tradition has also reached a position of robust epistemic confidence. For the sake of brevity here, the interesting story behind this is in a different post. However, it is clear in current public debates that critical theorists, or those who are often called the ‘woke’, have absolute confidence in their conclusions, and quickly dismiss anyone who disagrees.
Remarkably, these two intellectual traditions - which are radically opposed in many of their positions - agree that identifying the facts or what is true can be confidently achieved and there are known ways to do it. This epistemic confidence is reinforced by our lived experience of just being able to look up facts on our phone and rarely coming across genuine differences in beliefs.
Epistemic confidence drives the focus on misinformation
If we grant that this epistemic confidence is justified, we are faced with the challenge of explaining the common real world phenomenon of different people (and groups) having strongly held but wildly different beliefs about what the facts are or what is true. How can we explain this without abandoning our epistemic confidence?
The charitable explanation is that those who disagree with what we see as clearly true are misinformed. (If we are less charitable, we might think they are deliberately malicious, plain stupid or the victims of others.) Given that so many people are, on these assumptions, clearly misinformed, correcting that misinformation becomes an altruistic and important social duty. It is therefore unsurprising that significant intellectual, financial and cultural resources have been directed to this task, most commonly through fact checkers.
Unfortunately, the track record of fact checkers and those countering misinformation is patchy. In short, the thoroughgoing epistemic confidence that underpins their operation is misplaced, especially for many of the important questions we need to answer. A good illustration is the challenge faced by various fact-checking organisations through the pandemic, whose conclusions on issues like the origins of covid had to change as the evidence shifted.
Establishing the facts about any situation, but especially about a novel virus that suddenly emerged without any existing research, is very difficult and labour intensive. Those providing fact checks or combating misinformation cannot do all this work themselves and so they rely on organisations that are perceived to have resources and don't have conflicts of interest: usually government organisations, NGOs and global bodies. Thus the facts about covid were narrowed down to what, typically, the WHO or the CDC had decreed.
Unfortunately, but unsurprisingly, these organisations changed their positions repeatedly (often, but not always, with new information), disagreed with each other and often seemed to act politically. Even messier was the state of the academic research on covid, which remains over-supplied in quantity and under-supplied in quality. In this context, and under pressure to produce a definitive fact check (both from editors and from their own epistemic confidence), the temptation is always to trust the authority that backs up what you think anyway: it's easier in many, many ways.
This just shifts the debate about the facts to a debate about which authorities to trust, and so efforts to correct misinformation often harden the very filter bubbles they were hoped to break down. Those in different filter bubbles accept different core assumptions and treat different authorities as definitive. Thus fact checks within a bubble rely on the authorities those in the bubble trust, and existing divisions are hardened.
While these challenges arise in practice, they rarely dent the confidence in treating these issues as a problem of misinformation. This is because misinformation, as an explanatory concept, is a powerful way to make sense of entrenched differences over various social and political issues while maintaining our assumed epistemic confidence. It allows us to keep our confidence that the facts are easy to find - and that we are therefore largely right in our views - while preserving a broadly charitable view of others, who can be seen as merely misinformed.
For those who don’t find the concept of misinformation convincing, it is important to understand the mindset of those who do - and the underlying epistemic confidence that is central to it. For those who are focused on countering misinformation, the practical difficulties in establishing definitive facts, as illustrated through the covid pandemic, should be a reason for reflection. The challenge is not a lack of people who are misinformed - clearly many are - but that the overwhelming epistemic confidence with which many seek to tackle misinformation often undermines the task.