The fallibility of the mind

In his book Thinking, Fast and Slow, Daniel Kahneman suggests that our minds are fundamentally lazy. Thinking requires attention and effort, and these capacities are easily depleted. We prefer the comfort of the familiar and coherent to the discomfort of unfamiliarity and uncertainty. What matters is the narrative, and the best narratives are simple and coherent.

As a consequence, we tend, without being aware of it, to substitute easier questions that we can answer for harder questions that we can’t. This unconscious substitution is the basis of the heuristics and biases model that Kahneman developed with Amos Tversky. In their original paper they identified three such heuristics and enumerated the biases to which they lead. In his book Kahneman shows how the idea can be developed to uncover the illusions this substitution creates and the over-confidence it generates.

This account of the way we think leads to a kind of fallibilism rather than skepticism as such. It’s not that we can’t know anything, but that discovery is difficult and the tools and techniques we use are error-prone.

Kahneman here identifies two of the components of fallibilism: those concerned with the mind and with the tools and techniques it has available. I would like to add a third. I don’t know that it has a name, but it might be called the opacity of the real world. This is the idea that reality doesn’t show its structure or function on the surface; these have to be dug out. There is a double layer of archaeology required: not only an archaeology of knowledge, in Foucault’s phrase, but also an archaeology of the terrain.

This means that there are three components that go towards the construction of fallibilism:

i. The fallible mind

ii. The limitations of the mind’s tools and techniques

iii. The opacity of the real world

Kahneman is primarily concerned with the weaknesses of fast intuitive thinking, but slow deliberative thinking has its own limitations.

In an earlier post I argued that rational thinking was also based on the construction, development, application and interpretation of heuristic models. These models are heuristic in the sense that they are designed to accomplish a limited but useful task without making a claim to be unequivocal or complete.

In science and technology these models tend to be quantitative and mathematical. In other domains they tend to be built out of concepts articulated through language. Maps, graphs, charts, tables, diagrams and scale models are alternative methods of rendering a model. In every case the purpose is to build a working model of a target domain.

We therefore shouldn’t overplay the differences between fast and slow thinking. As Kahneman concludes in his book, intuitive thinking gets more right than it gets wrong. Intuition has access to a vast repertory of skills. However, although it can register cognitive ease, it has no warning system to detect that it is reaching its limits and becoming unreliable. There is no easy way to distinguish an intuitively skilled response from a merely heuristic one. Little can be done except to get better at detecting the situations in which fast thinking will go wrong.

Attentive deliberative thinking, on the other hand, is who we think we are. It is rational, but it is lazy, has a limited capacity for attention and limited knowledge, and it makes mistakes. However, while both modes are error-prone, slow thinking has the error detection and correction capabilities that fast thinking lacks.

Although organisations are better than individuals at avoiding cognitive illusions, the institutional framework brings its own difficulties: the problems of funding models, bureaucratic hierarchies and the inevitable politics and polemics of institutions; the dangers of groupthink and a lack of diversity; and, at the same time, the dangers of over-specialisation, that is, of diversity without constructive engagement.

In this three-part model the first component, the fallibility of the mind, covers the ambiguities and errors of perception, introspection and memory, while the second covers the failures of method, of rationality and of testimony.

Presentations of epistemological and psychological limitations tend, I think, to forget the third element, which is more ontological. Reality doesn’t carry its structure, or the manner in which it evolves, on its surface. There is no particular affinity between the mind and the world that would allow the mind to bypass its own fallibility or the limitations of its methods, tools and techniques.

Our experience is inherently noisy. The perspectival nature of experience and thinking means that the neutral point of view has to be constructed. And, just like any other construction, that takes time, cooperation and coordination, and trial and error.

Where this kind of fallibilism is sceptical is with regard to over-confident claims to knowledge. These can come from many different directions. For example, in contemporary debates, we often have on the one side a scientific and technological over-confidence and, on the other, the certainties of intuition and testimony that underpin, in so many cases, religious thinking.

On both sides there is a reluctance to acknowledge the opacity of the world. The belief in intuition as reliable knowledge often goes with the idea that there is a reality beyond or below the appearances of the world, a reality with which the mind has an affinity and which it can know intuitively because, in some way, it is identified with it.

This is explicit in what Aldous Huxley called the perennial philosophy and its identification of the ground of being with consciousness. My feeling is also that what the Roman Catholic Church criticises as relativism is really this kind of fallibilism, the denial of the possibility of absolutely reliable knowledge. It is also inherent in the idea of the vita contemplativa, the ideal of knowledge as something acquired by stilling the activity of the mind. As Wordsworth expressed it in “Expostulation and Reply”, from Lyrical Ballads:

The eye it cannot chuse but see,
We cannot bid the ear be still;
Our bodies feel, where’er they be,
Against, or with our will.

Nor less I deem that there are powers,
Which of themselves our minds impress,
That we can feed this mind of ours,
In a wise passiveness.

Think you, mid all this mighty sum
Of things for ever speaking,
That nothing of itself will come,
But we must still be seeking?

Similarly, scientific and technological confidence is often grounded in something that Erwin Schrödinger called the hypothesis of the understandability of nature: the idea that, in principle at least, a complete and unequivocal understanding of nature is a realistic objective for science. This is the underlying assumption of scientific realism.

From my perspective these look like different versions of the same error. If Kahneman is right, the reach for certainty is a weakness of the fallible mind.

Rationality can’t simply be seen as a means to counter ignorance, prejudice and superstition. It also has to be critical, reflexive and conscious of its own limitations if it isn’t to fall into the same trap of over-confident assertion.


Daniel Kahneman, Thinking, Fast and Slow (Penguin, 2011).