I recently read the international bestseller “Thinking, Fast and Slow” by the Nobel laureate Daniel Kahneman. As I was reading the book, a number of long-running, intractable issues regarding resistance to the uptake of evidence-based decision making started to dissolve with each page of fascinating insight into how we think and make decisions. The book reveals that our minds are susceptible to systematic errors through the overriding influence of what the author describes as “System 1”, or fast thinking. System 1 thinking represents comfort, familiarity, suppression of doubt, confirmation of existing beliefs, intuitive judgement and impulsive likes and dislikes, and operates within the paradigm each person has created for themselves. System 2 thinking, on the other hand, involves a greater level of ‘thought effort’ and is responsible for functions such as doubting, disbelieving, performing more complex mathematical calculations and giving deeper consideration to answers.
Much of the thinking we do, the responses we give to questions and even the questions we ask are driven by an innate desire to minimize effort. This “law of least effort” for cognitive and physical exertion may be an evolutionary legacy, but it is a vestigial and potentially harmful one in a complex world where we often need to process a lot of information in order to make the best decision. This is particularly relevant when we have the responsibility of making decisions with long-term implications on behalf of society, i.e. decisions in environmental management. My experiences over the last ten years indicate that beyond the lip service paid to using ‘best available evidence’ and adopting ‘systematic evidence-based’ approaches, there is little desire to invest the additional effort required for the practical application of evidence-based decision making. Could this be an artifact of our System 1 thinking?
Jumping to conclusions where there is little or no evidence of causality is an example of System 1 thinking that we see frequently. There is a thunderstorm at your train station and the train is late – the automatic assumption is that the thunderstorm has caused the lateness of the train. Establishing causality through quick inference requires much less cognitive effort than an analysis of one or several observations. Evidence from observation, however, is much more reliable than evidence from inference.
In environmental management I often observe some of the common diagnostics of System 1 thinking. One of these is the tendency, when asked a difficult question, to substitute one or more easier questions for it. For example, when asked about the likely causes of the increase in shark attacks on the Australian coastline, we often substitute questions such as “how do I feel about shark attacks?” or “how can I reduce my chances of being taken?” It is critical that we ask the right, and often hard, questions in environmental management and not substitute them with questions for which we already have the answers or that we know will receive a favorable response.
The lazy control of System 1 may even be a driver for the non-believer or skeptic. If you truly believe in something then you feel a compulsion to take some action; for example, if you believe in anthropogenic climate change then you may feel the need to reduce your energy consumption in some way. The non-believer or skeptic, however, feels no compulsion to change anything – it’s just business as usual: no change, no guilt, no worries.
Another attribute of System 1 thinking is that it is much easier to search for and accept information that confirms our existing beliefs than to challenge those beliefs with new evidence. Again I have witnessed (and been a party to) the “we have always done this so let’s keep doing it” scenario. We have always grazed sheep in this way, controlled feral rabbits like this, run community environmental education campaigns like that, or provided funding to these organizations – common scenarios in which it requires much more effort to challenge the status quo. There is a body of theory around decision making in “chaotic” systems (where there is no consistent behavior in how the system operates) suggesting that prior experience is really the only way of trying to predict the outcomes of inputs to a system such as environmental management actions. Many people consider that environmental systems operate in this chaotic manner, and studies such as that of Pullin and Knight (2004) show that the primary source of knowledge used by conservation planners (many would consider conservation planning to operate in a chaotic system) across the UK and Australia was “accounts of traditional management”, with only 11% using primary scientific literature. Kahneman suggests that there is a strong tendency to accept information that confirms our existing beliefs – this creates a very worrying bias for evidence-based decision making.
So can it be that the evolutionary traits of our thinking processes – the very traits that contributed to our success by enabling us to rapidly assess the presence of danger, to make fast judgements based on instinct and intuition, and to conserve cognitive and physical energy – are now the traits putting us in danger by encouraging an apathetic approach to decision making in a complex world? How do we encourage a generation of System 2 thinkers who make evidence-based decisions?