When I was told that we were starting a book club at DiUS, I was extremely excited. Almost hilariously so. I am a voracious reader; it’s just about my favourite thing to do. And while I don’t need to share my opinions as part of my reading experience, I was attracted to the prospect of a DiUS book club because it connects to one of my other passions: learning new things.
For the inaugural get-together of the DiUS book club, we were not going to be delving into fanciful narrative. That would have been fine, and extremely enjoyable, but instead we voted to go straight to neuroscience. We were to read, and then discuss, the bestseller “Thinking, Fast and Slow” by Daniel Kahneman: what I like to call an ‘important book’. And I knew the discussions about what we thought of the book and what we learnt would be amazing, because of the people at our book club.
One of the things I like about working at DiUS is that people are interested in trying different approaches, sharing opinions and having random conversations about something they’ve come across and think is cool, or not cool as the case may be. Sometimes we agree, sometimes we do not. But something is always learnt. People are respectful. Nothing is dorky; everything is interesting.
We all have dual-purpose brains
“Thinking, Fast and Slow” was a dense book. No sugar coating: it was a tough read. Not because it was dry; in fact, it was an interesting meander through Daniel Kahneman’s Nobel Prize-winning career, viewed through a nostalgic lens of research projects and experiments, the people he worked with, and his subsequent discoveries about human decision making. It was a tough read because it was complicated. Neuroscience-complicated. You did have to take a break occasionally and digest.
The book introduces us to the inner workings of our brain and how we all have two types of systems that drive the way we think and make choices. On the one hand, we have the intuitive, emotional and always-on System 1, fast thinking, that makes the majority of our decisions. On the other hand, there is the more effortful and logical System 2, slow thinking, that makes judgements on the basis of critical examination of evidence.
System 1 drives our instinctual, subconscious responses, the primitive survival lizard brain thinking that in days past saved our lives. It’s behind our ability to recognise faces and understand speech in a fraction of a second. System 1 tends to be our default and although it does have access to a big store of memories that it uses for decisions, judgements are made on the basis of “what you see is all there is”.
Critical and thoughtful System 2 is engaged for more complex tasks: maths, filling out forms, reverse parking. As System 2 requires more effort and consumes more resources, we only make the mental effort when we really, really need to. The level of concentration required to engage System 2 also dims down our ability to notice things in our immediate environment (see here for a great example), which can be dangerous.
As an aside, I now completely understand why I can’t have anyone in the car talking while I am turning into oncoming traffic. My effortful System 2 thinking is engaged, consuming all my resources, and I simply cannot cope with any other distractions.
We think we’re smart, but we’re easily fooled
Kahneman identifies about twenty cognitive biases that influence our thinking and our decisions, whether we are employing System 1 or System 2 thinking. The ones that particularly struck a chord with me were in the book’s section on heuristics and biases: the mental shortcuts or rules of thumb we all use to help us make decisions faster.
The ‘anchors’ bias describes the notion that when estimating, we are influenced by the number we heard before. For example, if you consider how much you should pay for a house, you will be influenced by the asking price. ‘The law of small numbers’ explains our willingness to make sweeping generalisations from an extremely small sample size, without even thinking about whether these conclusions are reasonable. ‘Causes trump statistics’ highlights that being confronted with hard statistics often does nothing to change our view of the world or the decisions we make.
These heuristic biases also highlight my propensity to blindly accept statistics at face value: because I’m really bad at maths, even when I invoke System 2, effortful thinking is not enough to unpick how a number was derived.
As the book works through these biases, you gain a deeper understanding of how our implicit psychological vulnerabilities can be exploited for others’ gain or, ultimately, ‘to sell more stuff’. However, Kahneman also raises how these vulnerabilities could be used in better-informed government policy formation and implementation, with the goal of ensuring justice, fairness and accountability within society. He references nudge economics, the somewhat controversial notion that the decision making of individuals and groups can be influenced to achieve non-forced compliance as effectively as legislation or enforcement. This led to an interesting discussion about how your decisions and choices have an impact on others and, indeed, how there can be a case for making decisions for everyone for the ‘good of the many’, e.g. society as a whole.
And our dual-purpose brains are filled with self-delusions
For me, it was the notion that System 1 and System 2 are not independent that was most impactful. You are not one or the other. You are not a logical person or a snap-decisions person. You are both. Somewhat pessimistically, the book reveals that we are innately instinctive and highly susceptible to influence from our environment, and these inbuilt biases distort our decision making and our ability to assess risks, whether we are in fast or slow thinking mode. “System 2 articulates judgements and makes choices, but it often endorses or rationalises ideas or feelings that were generated by System 1”.
So you can make a completely biased decision even when you think about it more deeply, if the decision or the biases at work make logical sense to you. Does this mean that we are all deeply flawed and the dice are loaded from the start? Well, yes, human reason definitely has some limitations. And it does explain human irrationality in the face of hard evidence, and why we are so bad at risk assessment. It also upholds my long-held secret belief that random factors and luck have a lot to do with success.
But there’s still hope …
However, the conclusion our book club reached was more optimistic: like Kahneman, we decided you can work against these unfortunate limitations to make better decisions. Awareness is key, of yourself as well as of all the biases; just as important is learning critical thinking through education and practice. Was this our bias at work? Well, I guess we’ll see.
A final observation is that reading the book on your own versus reading it at book club can perhaps be viewed as a sort of System 1 and System 2 interaction. When you read the book on your own, you are limited to your own judgement and you are more likely to engage System 1. However, when you discuss the book, I feel you are wholly using the more effortful and critical System 2 thinking.
DiUS book club definitely lived up to its potential; our discussions were a lot of fun and I learnt a lot. The complexity of the book meant I definitely got more out of tackling it as a group than if I had read it on my own. I would recommend “Thinking, Fast and Slow” to anyone who is interested in a deeper understanding of how biases are formed and decisions are made. But give yourself a lot of time to read it; the concepts require further thinking, a lot of System 2 thinking.