Jumping to conclusions

Intuitive reasoning is unreflective reasoning, thinking that happens without the intervention of the critical mind. In his book Thinking, Fast and Slow, Daniel Kahneman calls this fast thinking, tending to reserve the term intuition for instinctive expertise; the choice of terms reflects a differing assessment of the rationality of these capabilities. Without trying to come to a view on this question, it seems to me that we can regard them as modes of the same ability: our capacity to make judgements and decisions without conscious reflection, judgements and decisions which, for that reason, we may not be able to explain to ourselves.

With Amos Tversky, Kahneman developed the heuristics and biases model of unreflective thinking. Such thinking is carried out through the association of ideas. It works through a set of heuristics grounded in resemblance, proximity and the availability of information. At the same time, these heuristics contain inherent biases that lead to cognitive errors. In their original paper, Tversky and Kahneman identified three such heuristics and the biases associated with each. In his book Kahneman shows how the model has been extended and developed to understand the illusions that fast thinking creates and the over-confidence it generates.

Fast thinking is primarily a storytelling mode which automatically hunts for causes and intentions and has little understanding of logic and probability. What matters is the narrative, and the best narratives are simple and coherent. Associative reasoning works by skating over gaps in information without realising that they are there, something Kahneman refers to as “what you see is all there is”. It is a machine for jumping to conclusions.

The underlying premise is that our minds are fundamentally lazy. Thinking requires attention and effort and these capacities are easily depleted. As a consequence, we prefer the comfort of the familiar and coherent to the discomfort of unfamiliarity and uncertainty. We tend, without being aware of it, to substitute easier questions that we can answer for harder questions that we can’t. It is this unconscious substitution that is the basis of the heuristics and biases model.

The heuristics and biases model challenged the then prevailing assumptions that human beings are mainly rational and that emotions such as fear, affection and hatred explain most of the occasions when people depart from rationality. Although Kahneman concedes that emotion (liking and disliking with little deliberation or reasoning) looms larger in his thinking now than it did originally, the heuristics and biases model demonstrates that simply mastering one’s emotions would not by itself be sufficient to avoid cognitive error. There are unavoidable mistakes in the way the mind works, just as vision is subject to unavoidable optical illusions.

Despite this, the idea that fast thinking is irrational shouldn’t be overplayed. Intuition gets more right than it gets wrong. Further, although fast intuitive thinking is error prone, it is error prone in a systematic fashion. If you know what the heuristics are, and the biases they are prone to, you can correct for them. The terrain of cognitive error has been extensively mapped and a set of diagnostic labels developed that can be used for detection and correction.

One alternative to the heuristics and biases model is the naturalistic decision making model. This approach takes a more positive view of the possibility of unreflective rationality. In particular, it argues that the intuitions of experts are better explained by prolonged practice. Herbert Simon characterised this kind of intuition as recognition:

The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.

The model also makes use of Simon’s notion of satisficing, the idea that a solution may be good enough without being the best possible. Firefighters, for example, must respond quickly to developing situations, and prolonged deliberation may not be possible. With expert intuition there is a high probability that a good enough solution will be arrived at without prolonged deliberation over multiple options.

In the right environment it is possible with training to generate skilled intuitions and therefore execute skilled intuitive responses. Kahneman engaged in an adversarial collaboration with Gary Klein, a leading advocate of naturalistic decision making, to try to map the boundary between the successful acquisition of expert intuition and the flawed intuitions of associational thinking. They agreed that what is needed is an environment that is both sufficiently regular to be reasonably predictable and one where there is the opportunity to learn these regularities through prolonged practice accompanied by immediate and accurate feedback.

Even in the case of expert recognition though, the basic question of error detection remains. How do you find the limits of the validity of intuitive judgments and decision making when intuition is unreflective thinking carried out below the level of conscious introspection?

Intuition may always be unreflective, but it is not necessarily damagingly so. The heuristics of associational thinking are corrigible and, in the right circumstances, the intuition of experts can lead to better judgements and decisions than a more self-conscious deliberative process.

There is another account of intuition as recognition but in this version, intuition is essentially passive rather than unreflective. As expressed by Wordsworth in Lyrical Ballads:

Nor less I deem that there are powers,
Which of themselves our minds impress,
That we can feed this mind of ours,
In a wise passiveness.

What underlies this way of thinking is the idea that fundamental reality is something that can only be received by the mind, not something that can be actively discovered, that the movement to understanding originates in the object rather than the subject, and that therefore only when action and thought have been stilled can this reality be apprehended. In this reading, cognition obstructs recognition.

The validity of this way of thinking depends on the validity of the metaphysical assumption that there is an active principle in reality, that nature is trying, in some way, to make itself known to human understanding, and that the human mind has a supporting role in this drama and can only get in the way and obscure the truth when it tries to do something more. Heidegger was probably the most influential proponent of this idea in modern philosophy.

I don’t think this is a tenable idea today but, shorn of the metaphysics, I can understand the appeal. It is a reminder that there is a value to patient observation, to ensuring that we don’t let ourselves get in the way of the view or impose our own subjectivity on nature.

Fast thinking draws no clear boundary between theory and practice, between judgement formation and decision-making. Recognising such a separation requires more conscious reflection.

Errors of judgement lead to errors in decision making. Following on from their work on judgement, Kahneman and Tversky developed prospect theory as a descriptive psychological theory of decision making. The prevailing model of the rational agent, used particularly in economics, is based on postulated axioms of choice rather than observation. Behavioural economics is the outcome of attempts to bring psychological insights to economics, to take account of worry, regret, blame, anticipation, disappointment and so on in decision making.

The most significant contribution of psychology to behavioural economics is the concept of loss aversion. The idea is that, when directly compared, losses loom larger than gains in our mental accounting. Linked to this concept is the idea that the basis of evaluation is a reference point or adaptation level, which may be the state now or an expected future state. Loss aversion functions as a gravitational force that holds our lives in place near the reference point.
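The shape of this idea can be sketched with the prospect theory value function. The parameter values below are the median estimates from Tversky and Kahneman’s later (1992) cumulative prospect theory study, not figures given in this essay; x is a gain or loss measured from the reference point, not total wealth:

```latex
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\
-\lambda\,(-x)^{\alpha} & \text{if } x < 0
\end{cases}
\qquad \alpha \approx 0.88, \quad \lambda \approx 2.25
```

With a loss-aversion coefficient of around 2.25, a loss of 100 weighs roughly as heavily as a gain of 225, which is why most people refuse a 50–50 bet at even stakes.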

In the axiomatic rational agent model, outcomes are weighted by their probability. However, this turns out to be poor psychology. Emotion and the vividness of potential outcomes influence judgements of probability, leading to an excessive response to rare events when they are vivid and to neglect and inattention when they are not.
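In prospect theory this is captured by a probability weighting function that replaces the objective probability p with a decision weight w(p). A commonly cited form is given below, with the parameter as estimated for gains in Tversky and Kahneman’s 1992 study (an illustration, not a formula from this text):

```latex
w(p) = \frac{p^{\gamma}}{\left( p^{\gamma} + (1-p)^{\gamma} \right)^{1/\gamma}}, \qquad \gamma \approx 0.61
```

The function has an inverse-S shape: small probabilities are overweighted (a 1% chance receives a decision weight of roughly 5%) while moderate and large probabilities are underweighted, consistent with an excessive response to vivid rare events.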

Salience is also affected by how the decision is framed. Axiomatically rational agents make decisions about their preferences that are not affected by the words used to describe them or the context in which they are presented. Framing effects are the unjustified influences on decisions of the formulation of beliefs and preferences, leading to decisions that become choices from description rather than choices from experience.

As with judgements, the irrationality of intuitive decision making can be overstated. It is not unreasonable, for example, to allow decisions to be influenced by emotional experience and by a wider conception of well-being than the assembly of a consistent set of preferences.

Behavioural economics was developed in opposition to a particular conception of the rational agent that is prevalent in economic theory. Was it perhaps easier to call into question this model by suggesting that real life judgements and decisions were irrational rather than by arguing that the concept of rationality being deployed in economics was mistaken?

The more important issue, in my view, is this: if intuitions are frequently mistaken, what is the significance of intuitive thinking for the idea of the integrity of the self, and what risks does it create of alienation from the reality of existence?

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin.