6 Comments

Hmmm, an interesting thought. My sense is that your longer line of argument would suggest that decision making in all forms of government relies (unreasonably) on epistemic certainty. I wonder whether the problem of totalitarianism (in this sense) is less about the presence and level of epistemic certainty than about the lack of contest between different schools of thought (all of which might have high levels of certainty within their own constructs). The advantage of democracy is not that it is less epistemically certain but that it allows contest between these epistemically certain schools of thought, and therefore avoids the groupthink problem endemic to totalitarianism.

author

In my mind, the fact that democracy allows contest means that the design of the system as a whole assumes a level of epistemic humility, even if the players in it tend towards certainty (or at least confidence). Although that does raise the question of whether it is democracy itself that matters more here, or the structures we have put in place to encourage contest (like the separation of powers and Cabinet decision making in Westminster systems).


Yes. I had the structures supporting decisions in mind. My sense is that human beings are biased towards being certain (or at least confident) about their own judgements. I know many more people who are confident about their judgements on the world than people who are genuinely epistemically humble. This adds to the problem of power corrupting (i.e. it is more than self-interest asserting itself).


Thanks for the kind wishes - much appreciated!

You put it succinctly – a dominant 'world view' can't envisage that it is wrong. This links with systems of education and training that assume, I hazard a guess, not so much that 'we' know everything, but that there is a credible method, the 'objective' method, for obtaining and propagating 'knowledge'. This has put a huge premium on relying on that curious place, 'the future'. A friend long ago once said he thought The Future had become a modern Religion.

Very recently Eric Topol has written a professional appraisal of the arrival of AI in medicine. The AI machinery/tech facility is growing exponentially, doubling as often as every six months, led initially by huge corporate entities in the West but with significant innovation in government-supported institutions worldwide.

Leaving aside the common tendency to ignore the inevitable trajectory of any 'exponential growth', we see an 'arms race' across all sorts of different fields, all looking at the role of 'knowledge' and focussed on methods of 'control' for the future, which in my view must relate to governance. The limit in the medical example for AI pattern recognition is seen to be the quality of large databases. For this AI approach to succeed, data acquisition in medicine and more generally will need to be 'totalitarian', which lends credence perhaps to Kingsnorth as well as your thesis?

Hagens talks about a scattered archipelago of discussion, and perhaps a different kind of knowledge, even wisdom. I agree with you about epistemic certainty and trouble. To my mind any certainty, particularly about the 'objective' method, means trouble. We already have an existing and worsening disaster under way.


I can only contribute a few notes in the margin of a massive subject. Even so, please forgive the length. I have been stuck on the sofa with my first ever sciatica, which I hope is some excuse. Smile.

Humankind has a long history that predates the coming and going of civilisations; Western thought has a comparatively short one. Governance has seen many attempts and attracted meta-theory. Chinese thought, for example, according to translation by Pound the poet, allowed a successful dynasty to claim foundation on a 'Great Sensibility'. China's agrarian population had devised heritable skill sets to maintain a broadly reliable, aka 'continuing', biophysical base, and early enough created a coherent Civil Service.

Very recently, in response to the growth of a large urban population, an external Empire and the rapid replacement of an agrarian society by an industrial one, Britain, itself a fairly recent construct, created a Civil Service based on rules similar to Old China's. It was staffed competitively from a highly educated ('classically educated') small class of 'generalists'. Politically, Britain was internally conflicted, but was able to ride the high horse of industry, mass population, innovative commercial interest and a changed food base. Externally, Britain imposed, or attempted to impose, rule-based control over many diverse polities and traditional, agrarian, largely craft-based societies. Commercial entities were given special licence to operate competitively. Indeed, ingenious if ruthless rules and financial constructs allowed complex ventures to emerge based on shared systems for prioritising resource allocation.

Time has moved on. Britain was largely displaced. Kingsnorth the novelist and essayist argues that the now larger 'we' has created an out-of-control, self-organising machine that could/will fully externalise the human knowledge base, and which already recruits humans, using the 'sciences' of persuasion, in the service of the Machine. On the other hand, Hagens the ecologist calls the sprawling organisation an Amoeba, it having no meta-understanding, being organised entirely with a mind to exploit the one-off 'bag of sugar', aka the buried ancient carbon stores of energy, plus the resources to mobilise that energy. Hagens' conversation last week with the research physicist Tom Murphy discussed the nature of human knowledge, particularly the collective externalised knowledge we have acquired historically, its biology and its limits and likely lifespan. Murphy: "It's not in our DNA". In my view they present, with commendable humility, a profound reassessment of possible roles for governance. https://t.co/sxJWW9BvYz

author

I hope your back, and now your sciatica, are on the mend! Thanks for the link. To connect my thinking here to Kingsnorth and Hagens: one reason the machine/amoeba is out of control is that it assumes epistemic certainty - there is no option in its operation that what it is doing could be wrong. The real world, whether we are looking at ecology, sociology or physics, has a habit of surprising us and showing up how we have got things wrong - which we don't like to hear.
