During tumultuous economic times, when important decisions need to be made, all leaders face the same reality: from a behavioural economics perspective, we’re fallible, easily confused, not that smart, and often irrational. We’re more like Homer Simpson than Superman when it comes to making decisions.
At least, that’s according to Dan Ariely, professor of behavioural economics at Duke University. Professor Ariely is one of many in academia, psychology and business to recognise that, by nature, we’re poor decision-makers. Psychologist and Nobel laureate Daniel Kahneman supports this thesis. In his 2011 book Thinking, Fast and Slow, Kahneman points to the many cognitive biases* that distort our decisions, from ‘Confirmation Bias’ (seeking information to confirm our preconceptions) to the ‘Gambler’s Fallacy’ (thinking future probabilities are changed by past events).
Expand Awareness to Improve Decision-Making
Behavioural economists have developed many theories, combining economics and psychology and applying psychometric testing to better understand how we make decisions. However, rather than immerse ourselves in complex behavioural models, there’s a simpler (albeit not necessarily easier) way.
Taking the time to reflect on our life and leadership allows us to expand our awareness and uncover the barriers holding us back. Expanded awareness enables us to understand why we’re prone to self-sabotage in decision-making. And by better understanding ourselves, we can make better decisions at pivotal moments.
Leaders who want to position themselves to make sound decisions when it matters should explore questions like:
- Am I aware of my biases and how they affect my behaviour and economic decisions?
- Once I am aware of my biases, how can I avoid being sabotaged by them?
- Do I have meaning in business and life, and a goal whose vision empowers both me and the people I lead?
The aphorism ‘act in haste, repent at leisure’ is cold comfort after bad decision-making. At Atosú, we understand the often lonely world of a key decision-maker. And we’re here to accompany you on your journey to better judgement.
*Below are some examples from Wilke & Mata’s paper “Cognitive Bias.” [1]
- Confirmation bias: The tendency to selectively search for or interpret information in a way that confirms one’s preconceptions or hypotheses
- Conjunction fallacy: The tendency to assume that specific conditions are more probable than a single general one
- Endowment effect: The tendency to demand more to give up an object than one would be willing to pay to acquire it
- Fundamental attribution error: The tendency to overemphasise personal factors and underestimate situational factors when explaining other people’s behaviour
- Gambler’s fallacy: The tendency to think that future probabilities are changed by past events when, in reality, they are unchanged (e.g., a series of roulette wheel spins); a short simulation after this list illustrates this
- Halo effect: The tendency for a person’s positive or negative traits to extend from one area of their personality to another in others’ perceptions of them
- Hindsight bias: A memory distortion phenomenon by which, with the benefit of feedback about the outcome of an event, people’s recalled judgments of the likelihood of that event are typically closer to the actual outcome than their original judgments were
- Hot-hand fallacy: The expectation of streaks in sequences of hits and misses that are, in fact, independent (e.g., coin tosses, basketball shots)
- Illusory correlation: The tendency to identify a correlation between a certain type of action and effect when no such correlation exists
- In-group bias: The tendency for people to give preferential treatment to others they perceive to be members of their own group
- Mere exposure effect: The tendency by which people develop a preference for things merely because they are familiar with them
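The independence claim behind the gambler’s and hot-hand fallacies is easy to check empirically. Below is a minimal Python sketch (our own illustration, not from Wilke & Mata; the function and parameter names are invented for the example) that simulates a fair coin and compares the overall frequency of heads with the frequency of heads immediately after a streak of three heads.

```python
import random

def simulate(n_flips=1_000_000, streak_len=3, seed=42):
    """Simulate fair coin tosses and test whether a streak of heads
    changes the probability of heads on the next toss (it doesn't)."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads

    # Overall frequency of heads across all tosses.
    p_overall = sum(flips) / n_flips

    # Frequency of heads on tosses that immediately follow
    # `streak_len` heads in a row.
    after_streak = [
        flips[i]
        for i in range(streak_len, n_flips)
        if all(flips[i - streak_len:i])
    ]
    p_after_streak = sum(after_streak) / len(after_streak)

    print(f"P(heads) overall:               {p_overall:.4f}")
    print(f"P(heads | {streak_len} heads in a row):  {p_after_streak:.4f}")

if __name__ == "__main__":
    simulate()
```

Both estimates settle at roughly 0.5: a streak carries no information about the next toss, which is precisely what both fallacies deny.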