“I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”
The Adventures of Sherlock Holmes – A Scandal in Bohemia, by Sir Arthur Conan Doyle
System 1 and System 2 thinking
We use two different modes of thinking throughout our daily life. The psychologists Keith Stanovich and Richard West referred to them as System 1 and System 2 thinking.
- System 1 is an autopilot mode that occupies the vast majority of our thinking time. A car is rapidly approaching? System 1 makes you jump aside to avoid the danger. Asked to compute 3×2? System 1 provides the answer with no effort. It uses our models and beliefs to rapidly interpret reality.
- System 2 is a different beast. It is a conscious effort to apply your reasoning to a specific matter. Examples are solving a system of equations, or computing 32749/5172. Here our beliefs are tested in making sense of the world.
|System 1|System 2|
|---|---|
|Fast, emotional|Slower, effortful|
|Automatically activated|Consciously activated|
|More prone to biases|Rational, analytical thinking|
|WYSIATI (What you see is all there is)|Explicit selection and evaluation of facts|
|Large majority of our thinking time|Small part of our thinking time|
|Fight-or-flight response|Logical, pondered reactions to threats|
System 1 is more prone to biases. The most common and relevant cognitive biases in agile decision-making are:
- Anchoring. Relying exclusively on a self-selected subset of data
- Confirmation bias. Judging based on our existing beliefs and discarding contrary evidence
- Loss aversion. The disutility of giving up something (e.g. a mental model) is greater than the perceived utility of acquiring a new one
- WYSIATI (What you see is all there is). The assumption that we already know all the needed information
- Optimism bias. Underestimating the probability of undesirable outcomes and overestimating that of favorable ones
- Not invented here. Aversion to models and opinions coming from the outside
- Groupthink. The fear of disagreement leads to irrational decision making
- Halo effect. Previous judgments of a person in one area influence our judgments of them in another area
See the Nobel laureate Daniel Kahneman's Thinking, Fast and Slow for more details on the two thinking modes, and for a more comprehensive list of cognitive biases.
Mindsets and behaviors
We can consider our mindsets – the set of our values and assumptions – as the operating systems driving our behaviors. Our mindset helps us make sense of reality, conditions our behaviors, and thereby shapes the results we achieve.
While the vast majority of our thinking involves a mix of the two systems, the two modes nonetheless foster two different mindsets. In challenging situations, we tend to default to System 1 without even being aware of it. We often publicly espouse a System 2 mindset, but we act from a unilateral-control one. That's why it's difficult to consciously change our mindset. The good news is that others can recognize the mindset we are acting from, and they can help us become aware of it.
System 1 at work: it starts from facts and observable data; we then select some details and interpret them coherently with our beliefs. Finally, we act accordingly.
System 1 thinking happens unconsciously, in fractions of a second. Only the start and the end are visible to others. The core of the process – the selection and interpretation steps – remains unseen, unquestioned, untested.
The missing ingredients: Advocacy, Inquiry, and Practice
Biases exist even in agile organizations and individuals that espouse openness and transparency as foundational values. When these biases are activated, they generate a defensive reasoning mode: we become reluctant to discuss issues, and we don't say openly what we are really thinking. Mental models are no longer challenged, and flawed, untested decisions drive organizations down unwanted paths. When a win-lose mindset pervades communication and conversations between team members, trust, learning, accountability and the generation of value are prevented, blocked or diminished.
So why don’t we simply choose the mindset we want? The problem is that a) people are typically not aware of the mindset they are in, and b) switching to a new mindset requires effort. The challenge is not peculiar to agile teams: it is estimated that outdated, untested mental models cause 87% of the stalls in growth of Fortune 100 organizations.
Now, why is this relevant in an agile transformation context? We should be aware that in challenging situations we commonly switch to System 1 thinking and defensive reasoning, which limits openness, mutual understanding and learning. Instead, we should learn to recognize this, slow down our automatic System 1 thinking, test its underlying assumptions and simultaneously explore other points of view.
Agile transformation programmes should foster the adoption of an advocacy-and-inquiry-based mindset, alongside building different, more agile processes. For agile teams to be successful, we need to be transparent and explicit in order to test and share our hidden assumptions, allowing others to view and understand them. To do so, explicit practices must be in place.
We must dedicate time, effort and skillful training to practice and grow a transparency-based mindset that drives the appropriate behaviors, recognizes biases and fosters learning in individuals, teams and the overall organization.
Thus, as scrum masters and agile coaches, we should hold ourselves accountable for supporting teams by creating the kind of environment in which transparency and curiosity lead to a durable and successful transformation.
It requires time and practice. Teams and individuals must explicitly dedicate time to train and practice the new mindset to make it understood, accepted and utilized. But it’s worth the hassle.
Let’s look forward to 2021 as a year of joyful and healthy practice and improvement.