This is a different sort of post that I’m trying out. Rather than anything detailed, it is a short reaction to something I’ve seen. I’m interested in thoughts on whether these types of posts are worth doing.
The WSJ has published an article on disinformation that has attracted some attention and overlaps with our interests here. The author, Barton Swaim, traces the current interest in disinformation back to people who “sincerely mistake their own interpretations of the facts for the facts themselves.” In other words, they assume they have epistemic certainty: that their knowledge is definitively correct. This strongly echoes my previous argument that an attitude of epistemic certainty lies at the heart of concerns about misinformation.
For a newspaper article with a partisan spin, I think his core argument is solid. However, Swaim misunderstands the influence of postmodernism. He sees a discontinuity between its epistemic relativism and the current cultural epistemic certainty. He misses both the substitution of epistemic values for moral values that has taken place within critical theory and the belief that language constructs reality. If we take these seriously, especially since language is knowable in ways that an independent reality isn’t, then postmodernism or critical theory leads naturally to epistemic certainty.
Quibbles aside, the article is worth a read, and this link should give you access (even though it is normally behind a paywall).
To answer your question: yes, I think so. It is a nice way of testing and applying your thinking. The article itself is interesting. It is notable that people only tend to be concerned about the inputs to a decision when they feel the actual decision being taken is wrong. If the decision is wrong, the inputs must be wrong.