Biases in Project Management


Abstract

The human mind is an effective and powerful tool. However, it is not without faults and has limitations which cause biases. Biases occur in everyday activities and decisions; they are the brain's way of dealing with uncertainty and complexity, often in flawed ways. Many seemingly trivial decisions are made on the basis of biases, whether it is choosing a cola drink or deciding which car to buy. In this article cognitive biases are examined, with most emphasis on optimism bias since it is a very important factor in project management, specifically in risk assessments. Cognitive bias also covers other topics such as Gender Bias, Stereotyping and Information Bias. The notion of biases has evolved through time and the understanding of them has been steadily increasing. These biases are very important in a team setting and therefore fall under the realm of project management. They can be found in project management literature when team building is discussed, e.g. in the Guide to the Project Management Body of Knowledge (PMBOK® Guide) where Interpersonal and Team Skills or Expert Skills are mentioned. [1] Project managers have a tendency to overestimate benefits and underestimate costs, i.e. to be too optimistic. This is known as “Optimism Bias” and is widely accepted as a key reason for overruns in projects, especially in large infrastructure projects. [2]

Being aware of these biases is crucial for all project managers in order to offset them. By acknowledging biases and applying appropriate measures, it is possible to counter their effects.

In this article, biases related to project management are examined in more detail. It is presented how these biases can be seen in project management, which measures can counter them, and how and when those measures can be applied. Finally, some limitations are considered and topics for further reading are recommended.

The Big Idea

What is bias?

The definition of bias in the Oxford dictionary is split into four meanings, two of which are relevant in project management and will be addressed in this article:

  1. “a strong feeling in favour of or against one group of people, or one side in an argument, often not based on fair judgement.” [3]
  2. “the fact that the results of research or an experiment are not accurate because a particular factor has not been considered when collecting the information.” [3]

The first definition is tied to people and to communication between team members or with stakeholders. The latter is more related to uncertainty and risk management.


Limited information, limited time and limited minds are three causes of bad decision making. In many cases we rely on heuristics instead. Heuristics are approximate methods used to simplify decision making, often choosing a satisfactory solution instead of the optimal one. Sometimes heuristics lead to good decisions, but they can also lead to inaccurate conclusions. [4]


Cognitive Bias

The concept of cognitive bias was coined by Amos Tversky and Daniel Kahneman in the 1970s. Cognitive biases are a result of the mind trying to simplify things when it is forced to make a decision and deal with complexity or uncertainty. This can lead to incorrect judgement and, as a result, a miscalculation of the situation or project at hand. Kahneman and Tversky found that these errors were often systematic rather than random, which is why they are sometimes referred to as Systematic Biases. Systematic Biases are frequent distortions in the human mind, often contrary to rational thought. [5] Cognitive biases can be an advantage when a quick response is more valuable than an exactly right solution. In some situations, for example life-threatening ones, it is crucial to make decisions quickly. Project managers are usually not faced with such situations in their professional life, and accuracy is generally preferred over quick responses, so biases are more often seen as disadvantages in project management.

Wikipedia has a page dedicated to different types of biases titled “List of cognitive biases”, which contains 185 different types. Only a few will be mentioned in this article. The following section introduces these biases with some basic definitions. [6]

Gender Bias: Quite self-explanatory; people are misjudged based on gender. Typically, women are not recognized as equals to men in fields that have historically been male dominated. [7]

Confirmation Bias: When people look for confirmation or evidence to support what they already believe and ignore information that contradicts it. [4]

Misconceptions of chance / The Gambler's Fallacy: A gambler who has been on a losing streak feels he is due to win soon, even though each game is independent of the others (the short simulation after this list illustrates this). [5]

Overconfidence effect: When the team or an individual team member is overconfident without any evidence supporting their belief. [8]

Recency illusion: Too much emphasis is put on recent data, even though older data is often more relevant. [8]

Groupthink: Occurs when team members think alike and do not accept evidence that proves otherwise. [8]

Conservatism: When team members will not take new information or negative feedback into consideration. [8]
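To make the Gambler's Fallacy concrete, the short Python sketch below simulates a large number of independent games and compares the overall win rate with the win rate observed immediately after a losing streak. The win probability of 0.5 and the streak length of three are arbitrary assumptions chosen purely for illustration; for independent games the two rates come out essentially the same, which is exactly what the fallacy ignores.

```python
import random

def estimate_win_rates(p_win=0.5, streak=3, n_games=1_000_000, seed=1):
    """Simulate independent games and compare the overall win rate with
    the win rate observed right after `streak` consecutive losses."""
    rng = random.Random(seed)
    results = [rng.random() < p_win for _ in range(n_games)]

    overall = sum(results) / n_games

    # Outcomes of games that immediately follow `streak` losses in a row.
    after_streak = [results[i] for i in range(streak, n_games)
                    if not any(results[i - streak:i])]
    conditional = sum(after_streak) / len(after_streak)
    return overall, conditional

overall, after_losses = estimate_win_rates()
print(f"P(win)            ~ {overall:.3f}")
print(f"P(win | 3 losses) ~ {after_losses:.3f}")  # essentially identical
```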

Optimism Bias

A large number of recent projects have had cost overruns and/or demand shortfalls. It is widely accepted that Optimism Bias is to blame for these miscalculations. Project managers tend to be too optimistic when estimating the benefits of projects and to downplay the costs. This is most evident in large infrastructure projects, especially in the public sector where politics play a big role. [2]

Bent Flyvbjerg of Oxford University wrote a paper about megaprojects in 2014 in which he lists many past projects with cost overruns and how large each overrun was. Five of these projects have a cost overrun of over 1,000%; the highest is the Suez Canal with a cost overrun of 1,900%. These numbers are unacceptable since the methods and technology to predict costs more accurately are available. Project managers must put more emphasis on cost and benefit estimation. [9]

Application

How do these biases appear in project management, and what can managers do to combat them? Mackie and Preston documented 21 sources of error and bias in transport project appraisal, e.g. double counting, interactions that are not taken into consideration in models, unclear objectives, and incorrect definitions of the study area, base case or assumptions, to name a few. All of these factors contribute to the last one, Optimism Bias: benefits are sometimes counted more than once, quantifiable costs are excluded, and the asset life is overestimated. In their paper they call it Appraisal Optimism and say it is the greatest problem of all. They suggest three solutions: an in-house group to ensure honest appraisals, more transparency towards the public, and extra emphasis on ex-post evaluation. [10]

Dealing with Biases in Teams

Failure rates among projects are too high and pose a concern, which is why studying why failures happen is becoming increasingly common and important. How a project turns out is the result of many factors, e.g. leadership, cultural and behavioral factors, where biases and human emotions play important roles. Culture relates to the values and beliefs of a group, adopted at early stages of life and therefore hard to change. Individual cultures and leadership both influence the organizational or project culture. Project managers must try to foster a project culture where everyone feels safe and respected. Biases can have a huge effect on the project culture and team spirit. Gender bias, Conservatism and Groupthink can influence people's actions and feelings, and these types of biases can be toxic in a team setting. It is up to the project manager to acknowledge these biases and make sure the team is not affected by them, for example by finding mutual ground among team members at the beginning of a project. [11]

Purvis et al. (2004) proposed eight tactics to battle biases and minimize the probability of failure: [12]

  1. Hold a formal kick-off event where previous projects are discussed: what went wrong and how can it be prevented from happening again?
  2. Make sure decision processes are based on objective data.
  3. Clearly specify the methods used for planning.
  4. Look at both the positive and negative sides of the project.
  5. Inform team members about the project management process, specifically how to address unexpected events.
  6. Construct more than one alternative approach to the project.
  7. Institute a committee to oversee the project.
  8. Formulate procedures intended for positive feedback to employees.


Dealing with Optimism Bias: Reference Class Forecasting and corresponding uplifts

Reference Class Forecasting is a method to improve the reliability of project cost estimates. It originated with the aforementioned psychologists Tversky and Kahneman, who felt this method could compensate for the cognitive biases of decision makers. Kahneman developed the theoretical framework, which earned him a Nobel Prize in economics. Reference Class Forecasting focuses on including historical data as a reference point and thereby taking an “outside view”. Instead of only looking at the specific problem at hand, similar projects are analyzed and information from them is transferred to the current problem. The data collected covers how well the projects delivered the planned benefits and whether they were on time and on budget, and if not, then by how much. By doing this it is possible to learn from past mistakes, and the estimates become more realistic and accurate. Though the theory was originally introduced by Kahneman, he did not develop the practical use of the method. Bent Flyvbjerg, a Danish economist and professor at Oxford University, in association with COWI, developed the practical method for use in planning projects. The paper “Curbing Optimism Bias and Strategic Misrepresentation in Planning: Reference Class Forecasting in Practice” documents the application process and …

The American Planning Association (APA) recommended the use of Reference Class Forecasting for project planning in 2005, and it has since grown in popularity.

The Reference Class Forecasting method has three steps. First, a relevant reference class needs to be found. Secondly, a probability distribution needs to be established for the reference class, which means finding data for other projects in the same reference class and using it to characterize the outcomes of projects in that class. Finally, the new project has to be compared to the distribution from the reference class and its estimate adjusted. Flyvbjerg examined 260 infrastructure projects and put them in a database. He documented similarities between projects and classified them into reference classes. The three main groups were Roads, Rail and Fixed Links, the latter of which includes bridges and tunnels. [13]
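A minimal sketch of the first two steps, under illustrative assumptions, could look like the following Python snippet. The project records, class names and overrun percentages are made-up placeholders (not Flyvbjerg's actual dataset); the point is only that historical projects are grouped into reference classes and that each class's observed cost overruns form an empirical distribution.

```python
from collections import defaultdict

# Step 1: group historical projects into reference classes.
# These records are illustrative placeholders, not real project data.
historical_projects = [
    {"name": "Rail project A", "ref_class": "rail", "cost_overrun_pct": 45},
    {"name": "Rail project B", "ref_class": "rail", "cost_overrun_pct": 60},
    {"name": "Road project C", "ref_class": "road", "cost_overrun_pct": 20},
    {"name": "Bridge D", "ref_class": "fixed_link", "cost_overrun_pct": 34},
    # ... in practice each class needs a large sample of comparable projects
]

# Step 2: collect the observed overruns per class; sorted, they describe
# the empirical probability distribution of cost overrun for that class.
overruns_by_class = defaultdict(list)
for project in historical_projects:
    overruns_by_class[project["ref_class"]].append(project["cost_overrun_pct"])

for ref_class, overruns in sorted(overruns_by_class.items()):
    print(ref_class, sorted(overruns))
```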

By examining former projects and their cost overruns, a probability distribution was created for each class (the distributions are presented in Flyvbjerg's paper [13]). Each class has a required uplift that is used in step 3 to adjust the new project's estimate to the distribution. Managers and project owners need to establish the accepted risk of overrun and find the corresponding uplift; the lower the accepted risk, the higher the uplift. That is, if the project owners are willing to accept a 30% chance of an overrun for a railway project, the uplift would be 51%, whereas if the allowable risk is only 10%, the uplift required according to RCF is 68%.
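Step 3 can be read as a quantile calculation: the uplift is chosen so that the share of reference-class projects that would still exceed the uplifted budget equals the accepted risk of overrun, i.e. the uplift is the (1 − accepted risk) quantile of the class's overrun distribution. The Python sketch below continues the previous snippet under the same illustrative assumptions; the overrun figures and resulting uplifts are made-up numbers, not the actual rail uplifts cited above.

```python
def required_uplift(overruns_pct, acceptable_risk):
    """Return the uplift (in %) such that roughly `acceptable_risk` of the
    reference-class projects would still exceed the uplifted estimate.
    This is the empirical (1 - acceptable_risk) quantile of the overruns."""
    data = sorted(overruns_pct)
    idx = min(int((1.0 - acceptable_risk) * len(data)), len(data) - 1)
    return data[idx]

# Illustrative reference-class overruns in percent (not Flyvbjerg's data).
rail_overruns = [5, 10, 15, 20, 25, 30, 40, 45, 55, 70, 85, 110]

base_estimate = 1000.0  # raw cost estimate, e.g. in million DKK
for risk in (0.5, 0.3, 0.1):
    uplift = required_uplift(rail_overruns, risk)
    budget = base_estimate * (1 + uplift / 100)
    print(f"Accepted overrun risk {risk:.0%}: uplift {uplift}% -> budget {budget:.0f}")
```

The lower the accepted risk, the further into the tail of the overrun distribution the quantile lies, which is why the required uplift grows as the accepted risk shrinks.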

Limitations

There will always be some biases that we are not able to prevent, and we can never be completely free of biases, but being aware of them can diminish their effects. Project estimations will never be completely accurate, but they could be a lot better. Some events cannot be predicted and are therefore never included in estimates, for example pandemics. Many projects have had delays and unexpected costs in relation to COVID-19, but that is inevitable in these unprecedented times.

The main limitations of Reference Class Forecasting lie in acquiring a useful and reliable dataset for the reference class and in choosing the right class for the project. [13] Tim Neerup Themsen (2019) suggests that the Reference Class Forecasting method is not as good as people believe. He bases his research on a Danish megaproject, the Signalling Programme, which did not deliver the promised results despite Reference Class Forecasting being used for the estimates. The project had cost overruns, was delayed and had to reduce its scope. Themsen believes that the estimation experts showed signs of biases and that RCF does not prevent optimism bias. No outside stakeholders questioned the application of RCF because they were made to believe it was superior to all other methods. Themsen poses the question of how many projects have to fail before people start questioning RCF. He does not propose that people stop using RCF, but rather that they focus on the conditions under which the estimates were made and involve an impartial reviewer. It must be noted that this was the first big Danish public project to use the RCF method, so perhaps it was not carried out correctly, and future projects could be executed better. [14]

Optimism bias has almost exclusively been researched in regard to transport and infrastructure projects. It is possible to transfer these findings onto other types of projects, but it might deliver different results. Researching optimism bias in a more general project management context could be a relevant topic for further research.

Annotated bibliography

The following resources are the key resources used for this article, and they can provide a basis for further and deeper studies on the topic.

1. Leleur, S., Salling, K.B., Pilkauskiene, I. and Nicolaisen, M.S. (2015). Combining Reference Class Forecasting with Overconfidence Theory for Better Risk Assessment of Transport Infrastructure. The European Journal of Transport and Infrastructure Research, 15(3), 362-375.

This article highlights the importance of risk assessment in infrastructure projects, since a large number of projects have had cost overruns. It explains the concept of Optimism Bias and provides measures to combat it. The article recommends using Reference Class Forecasting, applying overconfidence theory and interpreting expert judgements about benefit and cost estimation, and it provides instructions on these methods.

2. Tversky, A. and Kahneman, D. (1974) Judgement under Uncertainty: Heuristics and Biases. Science, New Series, 185(4157), 1124-1131.

This article was written by two psychologists in 1974 and was revolutionary within the field of psychology. It introduced the idea of Heuristics and Cognitive Biases and provided basic examples. Tversky and Kahneman wrote many other articles on this subject, all of which are interesting and relevant to biases. The article has many examples but focuses on three heuristics employed in decisions under uncertainty, namely Representativeness, Availability of instances and Adjustment from an anchor. The article explains these heuristics and biases in very simple terms and is therefore a good read for those interested in the psychological aspect of biases.

References

  1. Project Management Institute, Inc.(PMI). (2017). Guide to the Project Management Body of Knowledge (PMBOK® Guide) (6th Edition). Retrieved on February 9th 2021 from https://app.knovel.com/hotlink/toc/id:kpGPMBKP02/guide-project-management/guide-project-management.
  2. Leleur, S., Salling, K.B., Pilkauskiene, I. and Nicolaisen, M.S. (2015). Combining Reference Class Forecasting with Overconfidence Theory for Better Risk Assessment of Transport Infrastructure. The European Journal of Transport and Infrastructure Research (EJTIR), 15(3), 362-375. Retrieved on February 10th 2021 from https://www.researchgate.net/publication/275213953_Combining_Reference_Class_Forecasting_with_Overconfidence_Theory_for_Better_Risk_Assessment_of_Transport_Infrastructure_Investments
  3. Oxford University Press. (2021). bias noun. Retrieved on February 9th 2021 from https://www.oxfordlearnersdictionaries.com/definition/english/bias_1?q=bias
  4.
  5. Tversky, A. and Kahneman, D. (1974). Judgement under Uncertainty: Heuristics and Biases. Science, New Series, 185(4157), 1124-1131. Retrieved on February 10th 2021 from http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=4F92E2FFA38970D381524DF81AF1D10F?doi=10.1.1.207.2148&rep=rep1&type=pdf
  6. Wikipedia. List of cognitive biases. Retrieved on February 10th 2021 from https://en.wikipedia.org/wiki/List_of_cognitive_biases
  7. Pinto, J. K., Patanakul, P. and Pinto, M. B. (2017). “The aura of capability”: Gender bias in selection for a project manager job. International Journal of Project Management, 35(3), 420-431. Retrieved on February 11th 2021 from https://www.sciencedirect.com/science/article/pii/S0263786317300297?casa_token=eTCIzCrkRNcAAAAA:fNB7WE_DQurrkYOPS6EukGS3VC7Uelk63TKAOgGiuEtawgXtKMq4mZacbm8nvoq9178g0MnnvA#bb0225
  8. Shore, B. (2008). Systematic Biases and Culture in Project Failures. Project Management Journal, 39(4), 5–16. Retrieved on February 14th 2021 from https://journals.sagepub.com/doi/pdf/10.1002/pmj.20082
  9. Flyvbjerg, B. (2014). What You Should Know About Megaprojects and Why: An Overview. Project Management Journal, 45(2), 6-19. Retrieved on February 12th 2021 from https://journals-sagepub-com.proxy.findit.dtu.dk/doi/abs/10.1002/pmj.21409
  10. Mackie, P. and Preston, J. (1998). Twenty-one sources of error and bias in transport project appraisal. Transport Policy, 5(1), 1-7. Retrieved on February 10th 2021 from https://www.sciencedirect.com/science/article/pii/S0967070X98000043
  11. Shore, B. (2008). Systematic Biases and Culture in Project Failures. Project Management Journal, 39(4), 5–16. Retrieved on February 14th 2021 from https://journals.sagepub.com/doi/pdf/10.1002/pmj.20082
  12. Purvis, R. L., McCray, G. E. and Roberts, T. L. (2004). Heuristics and Biases in Information Systems Project Management. Engineering Management Journal, 16(2), 19-27. Retrieved on February 14th 2021 from https://www-tandfonline-com.proxy.findit.dtu.dk/doi/pdf/10.1080/10429247.2004.11415245?needAccess=true
  13. Flyvbjerg, B. (2006). Curbing Optimism Bias and Strategic Misrepresentation in Planning: Reference Class Forecasting in Practice. European Planning Studies, 16(1), 3-21. Retrieved on February 15th 2021 from https://www-tandfonline-com.proxy.findit.dtu.dk/doi/full/10.1080/09654310701747936
  14. Themsen, T. N. (2019). The processes of public megaproject cost estimation: The inaccuracy of reference class forecasting. Financial Accountability and Management, 35(4), 337-352. Retrieved on February 17th 2021 from https://onlinelibrary.wiley.com/doi/full/10.1111/faam.12210?casa_token=CL_6AsSd0xMAAAAA%3AcAI44_N_LdQKMR_GSb0xH7OFo_-JLzl1KbevboJSGMqBpBFwZdQ_2PagUKMTxZD2Ksitur9dI9oSFCs