Kahneman's two thinking systems
Two thinking systems in short
As all decision-making emerges from uncertainty, so do project management decisions. When trying to describe the dynamics of assessing risk and reward, the lack of understanding of the underlying rationale becomes apparent. Daniel Kahneman, an Israeli-American Nobel Prize-winning psychologist, elaborates on the existing literature on uncertainty in decision-making by introducing a new perspective. This perspective takes its starting point in two systems used when evaluating a decision or a piece of information, called system 1 and system 2, or the fast and the slow thinking system respectively.
System 1, or the fast thinking system, is characterized by quick responses, automation and irrational thinking. System 2, or the slow thinking system, is characterized by thought-through responses and rational thinking.
Kahneman describes the two systems as interdependent. System 2 is a slave to system 1 because of the cognitive biases that our everyday experiences and impressions have imprinted in the unconscious.
When Kahneman's two systems are applied to project management, they can be used to accommodate and explain uncertainty in decision-making. This can yield a better foundation for understanding systematic errors and reveal flaws in a project. The description of the two systems also acknowledges how difficult these systematic errors are to alter, as system 2 is mainly engaged when system 1 cannot handle the choice or cannot provide a good enough answer.
Kahneman also argues that system 2 is brought to work when a decision is out of scope for system 1. This is supported by the estimation that system 1 is used for approx. 98 % of our decisions and system 2 for only approx. 2 % [1]. By this logic, the uneven balance implies that humans are irrational by nature and that system 1 handles unimportant decisions and low-effort situations.
Kahneman's theory also has limitations, as the two thinking systems are to some extent unsupported, or at least not well documented. Letting the theory stand alone is also problematic, since Kahneman's isolated experiments are somewhat flawed. Errors in Kahneman's early work on the theory, such as small sample sizes and the "watching-eyes effect", should be taken into consideration when using the systems for decision-making [2].
Origin and relevance
Daniel Kahneman is an Israeli-American Nobel Prize-winning psychologist who has investigated human decision-making. From his studies, a book and a theory of the same name, "Thinking, Fast and Slow", have emerged[3]. According to Kahneman, the central ideas of the theory occurred back in 1969, when he began interacting with Amos Tversky, with whom he later formed a close working relationship. In their discussions they revealed how both of them were far too willing to accept research based on inadequate evidence. From this realization, and from their many subsequent conversations, experiments and studies of how decision-making is biased, the theory came to life. The idea offered another view of what Kahneman saw as the two broadly accepted ideas of the 1970s. The first was that people are generally rational. The second was that emotions such as fear, hatred and affection explained most occasions where deviations from rationality happened.
To investigate the understanding of uncertainty, Kahneman's two thinking systems and his work on cognitive biases will be explored and described theoretically and with examples in the following.
When managing projects, programs or portfolios, uncertainty is certain for all levels of management[4]. Ideally, all project risks and uncertainties need to be evaluated, sometimes with risk-anticipating tools and other times with an understanding of the pitfalls and challenges of the decision-making phase as a foundation for further handling or engagement of the team. Uncertainty shows itself in many different ways, some easier to quantify than others. Statistical uncertainty can be quantified based on historical numbers and forecasts, but decisions about new solutions are not as easy to quantify and document. The result is a lack of understanding and evaluation of these more qualitative decisions in projects, which gives a higher level of uncertainty if they are not explored. Kahneman's studies are mostly based on experiments on actual people and how they interpret and evaluate specific tasks in a closed environment, without the many influencing circumstances of everyday tasks. This is both the strength and the weakness of Kahneman's theory. The closed environment shows the effect of changing one type of information and observing the response. Contrarily, it could be argued that it does not make sense to remove a decision from the environment in which it is taken, and that the effect would shrink when more inputs are available. Either way, decision-making is given a new layer of thinking with the two thinking systems.
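As noted above, the more statistical kind of uncertainty can be estimated from historical data. A minimal sketch of what that could look like is shown below; the cost-overrun numbers and the normal-approximation interval are purely illustrative assumptions, not taken from Kahneman or the PMI standard.

```python
import statistics

# Assumed historical cost overruns (in %) for comparable past projects,
# used only to illustrate how the quantifiable side of uncertainty can be handled.
historical_overruns = [4.0, 12.5, 7.0, 9.5, 15.0, 3.5, 8.0]

mean_overrun = statistics.mean(historical_overruns)
spread = statistics.stdev(historical_overruns)

# Rough forecast interval assuming roughly normal behaviour:
# mean +/- 2 standard deviations covers about 95 % of expected outcomes.
low, high = mean_overrun - 2 * spread, mean_overrun + 2 * spread

print(f"Expected overrun: {mean_overrun:.1f} % "
      f"(approx. 95 % interval: {low:.1f} % to {high:.1f} %)")
```

The qualitative decisions discussed in this article rarely come with such data, which is exactly why the two thinking systems and their biases become relevant.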
The theoretical framework of the two thinking systems
Daniel Kahneman describes decision-making in the human brain by two systems, the fast thinking system and the slow thinking system. These also go by the names system 1 and system 2 respectively. The two systems represent separate ways of analyzing and evaluating all possible choices. These choices can vary greatly in complexity, from something as simple as taking another step to solving highly advanced equations by hand. The complexity of the decision is essential for understanding the two systems. As we as humans unconsciously make on average 35,000 decisions every day[1], it sounds reasonable that we are not able to make informed and rational decisions in every task; this would be too high a workload for our brain and take far too long. System 1 is used for all fast decisions and impressions based on earlier experiences and the biases accumulated in our mind. Furthermore, system 1 is effortless and works by automation, without us noticing when the impression, decision or uncertainty is introduced. System 2 is contrarily used for the more complex tasks, when system 1 is not able to take a sound and fast decision due to lack of experience or because the specific challenge demands effort. The differentiation of the systems is essential for understanding how they work together when delegating and solving the tasks at hand, and thereby for understanding Kahneman's theory. Digging a bit more into the interdependencies of the two systems, the balance between them reveals a very uneven relation. System 1 accounts for approx. 98 % of our daily decisions and system 2 only for approx. 2 % [1]. This uneven balance gives rise to the thought that we are not rational as human beings. Few people without knowledge of Kahneman's studies would accept that they make only 2 % rational, thought-through decisions a day. This apparent flaw is partly explained by the interlinkage of the systems. When system 1 gets a task as input, it performs a fast search for a reasonable answer. If this answer seems plausible and resembles something often seen before, it is accepted by system 2 without the subject engaging in slower thinking. Only when the task is found to be too complex or to need more attention is system 2 fully activated to analyze and evaluate more comprehensively.
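With the cited figures, roughly 0.98 × 35,000 ≈ 34,300 of the daily decisions would be handled by system 1 and only about 0.02 × 35,000 ≈ 700 by system 2. The delegation between the systems can be caricatured in a short sketch; the complexity scale, the effort threshold and the notion of a "familiar" task below are illustrative assumptions, not part of Kahneman's model.

```python
from dataclasses import dataclass

@dataclass
class Task:
    description: str
    complexity: float   # assumed scale: 0 = trivial, 1 = very demanding
    familiar: bool      # True if similar tasks have been handled before

def decide(task: Task, effort_threshold: float = 0.7) -> str:
    """Caricature of Kahneman's delegation: system 1 answers by default,
    and system 2 is only engaged when the fast answer is not good enough."""
    if task.familiar and task.complexity < effort_threshold:
        return f"System 1: fast, automatic answer to '{task.description}'"
    return f"System 2: slow, deliberate analysis of '{task.description}'"

print(decide(Task("take another step", complexity=0.05, familiar=True)))
print(decide(Task("multiply 17 by 24 by hand", complexity=0.8, familiar=False)))
```

The point of the sketch is only the asymmetry: the slow branch is reached as an exception, not as the default.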
When looking further into the interactions of the systems, Kahneman also describes the term cognitive biases. Cognitive biases are systematic errors or deviations from the rational decision-making of system 2. They are the result of heuristic-like shortcuts in the way system 1 assesses and evaluates information. These shortcuts of prioritizing and rationalizing often yield decisions or answers that are good enough, but sometimes they yield answers very different from system 2's rational decisions. This is the reason that humans sometimes deviate from rational decisions. Many of these cognitive biases have been named through studies of specific decision-making situations; some of the most common ones are briefly explained and used in context in the following to give a more detailed view of how these biases work.
Confirmation bias [5] is the tendency to search for the answers one expects to find. This means that we are more likely to reject an answer if it goes against our first impulse.
Confirmation bias can easily lead to systematic errors inside a company if a given task and its outcome are interpreted in the usual way without any performance indicators. When left without acceptable indicators, one may be more selective about the research to fit the expected answer and actively seek information that confirms the initial thought.
Another common bias is the anchoring effect [5]. Under the anchoring effect, one is prone to focus on a single aspect and exclude others. This could be when management only focuses on the cost of a product and does everything possible to maintain or decrease cost, neglecting areas such as security, quality and time. This can affect the way we interpret uncertainty both inside and outside the scope of a project or company.
The ambiguity effect [5] is also common in decision-making. When assessing a task or decision, one will often favor the choice with a known outcome instead of taking a chance on a choice with unknown probabilities. This can lead to opportunities being missed without a proper foundation of evidence and evaluation of the initiative.
As cognitive biases are at work many times a day, it is inevitable that we sometimes make mistakes in the way we perceive information. More important is to recognize the need for these heuristics and the way they help us get through the day without overloading the brain with every little piece of information.
When discussing the relationship between the two systems, another two terms are introduced by Kahneman [3]. The first is the concept of cognitive ease, which is described as a way to ease a person into thinking that a proposed idea is true. Kahneman gives the following example:
Adolf Hitler was born in 1892 (presented in bold, high-contrast type)
Adolf Hitler was born in 1887
Both statements are false, as Hitler was born in 1889. Through his experiments, Kahneman found that the first statement was more likely to be believed. Other small changes, such as high-quality paper and high contrast between the characters, also proved to push the mind towards accepting a statement. By fairly simple means the mind can be tricked into perceiving a message with a less critical view, which reveals flaws in the way we assess the information given. This knowledge can be used in many different settings, including project management, when trying to point the team in a wanted direction. The intended perception can be used to manipulate the likelihood of new ideas being accepted. By the logic of cognitive ease, it can be shown that we are susceptible to priming from different tools. These could be small, unconscious things done before a decision that make one more prone to reacting positively or negatively to the information given, such as forcing a smile or furrowing the eyebrows, which unknowingly primes a person in a specific direction.
Secondly, this gives rise to the opposite term, cognitive strain. Cognitive strain is described as interruptions in the way we easily assess information, alerting system 2 and the slower thinking. Kahneman presents the example of two identical statements, where one is blurred and harder to read and the other is clear with sharp contrast. Kahneman showed that the blurred statement was more likely to be rejected, as the blurred lines made participants examine the statement more critically by activating system 2. This indicates that the way a message or task is presented is more important than previously presumed. By not taking these simple measures into account, the likelihood of a message being accepted can drop and work against the initial intention.
With this new way of exploring and evaluating the rationality of decision-making, Kahneman's research changed the view of how choices are made inside and outside of project management.
Application of the two thinking systems
As mentioned earlier, uncertainty is very common in all decisions and in the management of them. The risk involved in a portfolio, program or project can determine whether it becomes a success or a failure. Without risk and uncertainty management, the needed understanding and preventive initiatives cannot be developed.
When starting a new project, one of the first things to look for after the project definition is the uncertainty inside and outside the project framework. This can result in risks in many different areas, seen from different perspectives. When constructing a cause-and-effect diagram to initialize the systems thinking, Kahneman's theory also helps in understanding how the different aspects interlink [6].
These interdependencies can be both enhanced and reduced. Another one of these uncertainty areas is the people involved. This covers all types of stakeholders with different perspectives on the project: customer reactions, employee investment and sponsor satisfaction. Needless to say, all these uncertainties can vary greatly from stakeholder to stakeholder. For a project manager, prioritizing, assessing and preventing these risks is very important to ensure the success of the project, which is why an extended understanding of the uncertainty aspect is welcome. What makes the two thinking theory easily adaptable is the fact that it cannot be limited to a single aspect of portfolio, program or project management; it gives another view of how the human brain chooses in certain circumstances. This is also why there is no guide on how to use it specifically. Even though it is not a specific tool, it can be used to support specific tools or to understand complex decision-making, both for the individual and for groups.
One example could be the creation of a risk matrix[7] and the placement of the different risks within it. A project manager who understands how the two thinking systems affect the way we interpret information and decide in general would likely give more attention to uncertainty in decision-making. The project manager could change the attitude towards risk[8] in a project and thereby the prioritizing. If many of the key performance indicators are based on how people, both inside and outside the project, react to the aim of the project, the manager would favor a more conservative attitude towards risk with the knowledge of the cognitive biases and everyday heuristics.
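A minimal sketch of such a risk matrix is shown below. The probability and impact scales, the example risks and the idea of flagging intuition-based scores for a second, slower look are illustrative assumptions, not prescribed by the PMI standard or by Kahneman.

```python
# Simple probability-impact classification; levels, risks and flags are assumed examples.
LEVELS = ["low", "medium", "high"]

risks = [
    # (name, probability, impact, scored mainly by gut feeling?)
    ("Key supplier delay",        "medium", "high",   False),
    ("Stakeholder resistance",    "high",   "medium", True),
    ("New technology immaturity", "low",    "high",   True),
]

def cell(probability: str, impact: str) -> str:
    """Place a risk in the matrix by combining its probability and impact rank."""
    score = LEVELS.index(probability) + LEVELS.index(impact)
    return ["green", "yellow", "yellow", "amber", "red"][score]

for name, probability, impact, gut_feeling in risks:
    note = "  <- revisit with system 2 (possible bias)" if gut_feeling else ""
    print(f"{name:28s} {cell(probability, impact):6s}{note}")
```

The interesting column is the last one: placements that rest mainly on fast, intuitive judgement are exactly where the cognitive biases described earlier are most likely to distort the picture.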
It could also affect the preventive initiatives, by shifting the balance between the fast and the slow thinking system in favor of the wanted outcome. By using the concept of cognitive ease, a certain attitude towards uncertainty can be primed, creating new possibilities within a project. This can be explained and accommodated by recognizing the human tendency to solve a task the way it is normally done. When new structures or tasks are implemented, the leaders of the change can help the transition by highlighting the improvements. This can also help management stay on the new path, as the resistance is expected and therefore does not cause the change to be doubted and discarded early in the process.
Limitations and reflections in the theoretical and applied framework
When looking with a critical eye at the two thinking systems, a few thoughts arise. In the experiments conducted in closed environments, it was clear that altering single attributes changed the perception of the information presented. Whether these experiments are representative of reality is harder to confirm. It could be argued that the context and other attributes all influence a decision and enhance or reduce the effect in general. This dual behavioral reasoning has been questioned for lacking applicability and validity, challenging the accuracy of the simplified two thinking systems[9].
In portfolio, program and project management, the two thinking systems can be used in many places concerning people and the perception of information. But as the theory mainly contributes to a general understanding of uncertainty and to supporting other specific management tools, it can easily be overlooked and neglected. Furthermore, it is hard to see the two thinking systems as the only applied theory, as not all uncertainty is measurable and explainable with it.
Annotated bibliography
Thinking, Fast and Slow [3].
Kahneman's view on the two thinking systems is presented with examples of his methods and experiments. The development of the systems and the cognitive biases involved is described to elaborate on the way the systems interact. The biases used, and the effects they have on decision-making for the individual, are explained for use in real-life examples. With Kahneman's focus on actual decisions, comparison to real-life decisions is also available for further investigation of the systems in field work.
References
1. Groenewegen, Astrid. "Kahneman Fast And Slow Thinking Explained". Behavioural Science. https://suebehaviouraldesign.com/kahneman-fast-slow-thinking/#system1-2
2. Engber, Daniel. "The irony effect". Slate Group, 2016. https://slate.com/technology/2016/12/kahneman-and-tversky-researched-the-science-of-error-and-still-made-errors.html
3. Kahneman, Daniel. "Thinking, Fast and Slow". Penguin Books, 2011. https://www.amazon.com/Thinking-Fast-Slow-Daniel-Kahneman/dp/0374533555
4. Project Management Institute, Inc. (PMI). (2019). "Standard for Risk Management in Portfolios, Programs, and Projects", p. 7-12. Retrieved from https://app.knovel.com/hotlink/toc/id:kpSRMPPP01/standard-risk-management/standard-risk-managemen
5. Shewan, Dan. "5 Cognitive Biases & How to Overcome Them On Your Landing Pages". WordStream, 2018. https://www.wordstream.com/blog/ws/2014/05/22/landing-pages-cognitive-biases
6. Züst, Rainer and Troxler, Peter. "No More Muddling Through", p. 61-63. Springer, 2006.
7. Wikipedia. "Risk matrix". 2021. https://en.wikipedia.org/wiki/Risk_matrix
8. Project Management Institute, Inc. (PMI). (2019). "Standard for Risk Management in Portfolios, Programs, and Projects", p. 8-9. Retrieved from https://app.knovel.com/hotlink/toc/id:kpSRMPPP01/standard-risk-management/standard-risk-managemen
9. Grayot, James. "Dual Process Theories in Behavioral Economics and Neuroeconomics: a Critical Review". Springer, 2019.