Robust Decision Making: better decisions under uncertainty
Abstract
Robust Decision Making (RDM) comprises a set of ideas, methods, and tools that use computation to support better decision making under conditions of significant uncertainty. It integrates Decision Analysis, Assumption-Based Planning, Scenario Analysis, and Exploratory Modelling to simulate multiple possible futures, with the aim of identifying policy-relevant scenarios and robust adaptive strategies. These RDM analytic tools are frequently embedded in a decision support process referred to as "deliberation with analysis," which fosters learning and agreement among stakeholders [1]. This article reviews the current state of the art of RDM in project management, including its key principles and practices, such as the importance of data gathering and analysis, the consideration of different options, and the involvement of stakeholders. It also examines the benefits, challenges, and limitations of RDM in project management and outlines directions for future research. Its aim is to provide project managers with a deeper understanding of the principles and practices of RDM, along with insights on, and an example of, how to implement RDM correctly in project management. Ultimately, the article seeks to contribute to more effective and efficient approaches to project management and decision making by promoting the use of RDM in project management.
Big Idea: Robust Decision Making under uncertainty
Brief history
Robust Decision Making (RDM) has its roots in the 1950s and 1960s, when the RAND Corporation developed a framework to evaluate the effectiveness of nuclear weapon systems [2] [3]. The approach was designed to address the uncertainty and ambiguity inherent in strategic planning, and it evolved to include simulation techniques, sensitivity analysis, and real options analysis. In the 1990s and 2000s, as the complexity and uncertainty of projects increased, RDM gained wider acceptance in project management and was applied to fields such as infrastructure, software development, and environmental management. Today, RDM is an established approach in project management, recognized for its ability to help project managers make well-informed and confident decisions, anticipate and manage uncertainty, and continuously monitor and adapt. RDM has also been applied in contexts beyond project management, such as climate change policy and disaster risk reduction [4] [5] [6] [7].
Literature review
According to Donald Rumsfeld, there are different types of knowledge: known knowns, known unknowns, and unknown unknowns. Known knowns are things we know for sure; known unknowns are things we know we do not know; the most challenging category is the unknown unknowns, the things we do not know we do not know [8] [9]. Knight elaborates further and proposes a distinction between risk and uncertainty. The former indicates situations in which the unknown can be measured (through probabilities) and therefore controlled; the latter indicates situations in which the unknown cannot be quantified and therefore cannot be measured [10]. Building on Knight's distinction, academics have differentiated several levels of uncertainty in decision making, ranging from complete certainty to total ignorance [11] [12] [3]. These levels are categorized according to the knowledge assumed about various aspects of a problem, including the future state of the world, the model of the relevant system, the outcomes from the system, and the weights that the various stakeholders place on those outcomes. The four intermediate levels of uncertainty are defined as Level 1, where historical data can be used as predictors of the future [13]; Level 2, where probability and statistics can be used to solve problems; Level 3, where plausible future worlds are specified through scenario analysis; and Level 4, where the decision maker only knows that nothing can be known, due to unpredictable events or a lack of knowledge or data [14] [15].

For issues characterized by the deepest level of uncertainty (Level 4), more sophisticated and in-depth data gathering is of little help. Decision making in such situations is referred to as decision making under deep uncertainty (DMDU) [3]. Instead of a "predict and act" paradigm, which attempts to anticipate the future and act on that prediction, DMDU approaches follow a "monitor and adapt" paradigm, which aims to prepare for unknown occurrences and adjust accordingly [12]. In order to make decisions about unpredictable occurrences and long-term changes, this "monitor and adapt" paradigm "explicitly identifies the deep uncertainty surrounding decision making and underlines the necessity to take this deep uncertainty into consideration" ([1], p. 11). This article explores RDM under uncertainty, an approach belonging to the family of DMDU methodologies.
According to the "monitor and adapt" paradigm, RDM refers to a collection of ideas, procedures, and supportive technologies intended to rethink the function of quantitative models and data in guiding choices in situations affected by uncertainty. Models and data become tools for systematically exploring the consequences of assumptions, expanding the range of futures considered, creating innovative new responses to threats and opportunities, and sorting through a variety of scenarios, options, objectives, and problem framings to identify the most crucial trade-offs confronting decision makers. This is in contrast to the traditional view of models as tools for prediction and the subsequent prescriptive ranking of decision options. This means that, rather than improving forecasts, models and data are used to facilitate decision makers in taking robust decisions [16]. As argued by Marchau et. al., robustness of decisions is, therefore, guaranteed by iterating several times the solution to a problem while straining the suggested decisions against a wide variety of potential scenarios. In doing so, RDM endure the decision-making process under deep uncertainty [3].
Although several examples of practical applications of RDM in project management can be found in the literature, the theoretical support for applying this framework in project management practice remains limited. The remainder of the article therefore concentrates on the fundamental principles of RDM, guides the reader through the methodology, illustrates how RDM has been successfully used in a large-scale project, and discusses the benefits and limitations of the approach.
Foundations of Robust Decision Making
RDM is grounded in four key notions, from which it both inherits a legacy and offers a fresh expression: Decision Analysis, Assumption-Based Planning, Scenario Analysis, and Exploratory Modelling.
Decision Analysis (DA)
The discipline of DA provides a framework for creating and using well-structured decision aids. RDM exploits this framework but focuses specifically on finding trade-offs and describing vulnerabilities in order to craft robust decisions based on stress testing of plausible future paths. Both DA and RDM seek to improve the decision-making process by being clear about goals, using the best available information, carefully weighing trade-offs, and adhering to established standards and conventions to ensure legitimacy for all parties involved. However, while DA seeks optimality through utility frameworks and assumptions [17], RDM seeks robustness, treating uncertainty as deep and probabilities as imprecise [18], and highlighting trade-offs between plausible options.
Assumption-Based Planning (ABP)
By expanding awareness of how and why things could fail, RDM uses the ideas of stress testing and red teaming to lessen the harmful effects of overconfidence in current plans and processes [19]. This approach was first implemented in the so-called Assumption-Based Planning (ABP) framework. Starting from a written version of an organization's plans, ABP identifies the explicit and implicit assumptions made during the formulation of that plan which, if untrue, would result in its failure. These sensitive assumptions can be identified by planners, who can then create backup plans and "hedging" strategies in case those assumptions start to fail. ABP also relies on "signposts": patterns and events that are monitored to spot any faltering assumptions [1].
Scenario Analysis (SA)
In order to deal with deep uncertainty, RDM builds upon the idea of SA [20]. Scenarios are defined as collections of potential future occurrences that illustrate various worldviews without explicitly assigning relative likelihoods [21]. They are frequently employed in deliberative processes involving stakeholders and do not include probabilities of occurrence. The objective is to broaden the range of futures taken into consideration and to communicate a wide variety of futures to audiences. As a legacy of SA, RDM divides knowledge about the future into a limited number of distinct situations to aid in the exploration and communication of deep uncertainty.
Exploratory Modeling (EM)
According to Bankes, Exploratory Modeling (EM) is the tool that allows the integration of DA, ABP, and SA within RDM [22]. Without prioritizing one set of assumptions over another, EM maps a wide range of assumptions onto their consequences. This makes EM particularly useful when a single model cannot be validated because of a lack of evidence, insufficient or conflicting theories, or unknown futures. By lowering the demands for analytic tractability on the models employed in the study, EM offers a quantitative framework for stress testing and scenario analysis and allows the exploration of futures and strategies. As EM favours no base case or single future as an anchor point, it allows for genuinely global sensitivity studies.
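To illustrate the exploratory mindset in a simplified way, the sketch below (in Python, using hypothetical models and parameter ranges not drawn from any published RDM study) runs two rival, equally plausible model formulations across a sampled range of an uncertain input, instead of committing to a single validated model and base case. Full-featured tooling for this kind of analysis exists, for example the Exploratory Modeling Workbench [24].

```python
import numpy as np

rng = np.random.default_rng(1)
growth = rng.uniform(0.0, 0.05, 1000)  # deeply uncertain annual growth rate

# Two rival, equally plausible model structures: neither is privileged.
def demand_linear(g, years=20, base=100.0):
    return base * (1 + g * years)

def demand_compound(g, years=20, base=100.0):
    return base * (1 + g) ** years

# Map every assumption onto its consequences, with no base case.
outcomes = {"linear": demand_linear(growth), "compound": demand_compound(growth)}
for name, values in outcomes.items():
    print(f"{name:9s} model: demand after 20 years ranges "
          f"from {values.min():.1f} to {values.max():.1f}")
```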
Application
Theoretical framework. RDM is a learning process based on so-called "deliberation with analysis". The framework requires that the decision-making parties discuss their goals and alternatives, that analysts use system models to provide information relevant to the decision, and that the parties then review their goals, options, and problem framing in light of the quantitative results. This approach is especially recommended in settings with a variety of decision makers who must act in ever-changing environments and whose objectives may change as a result of their collaboration with others [23].
[Figure 1]
As illustrated in Figure 1, the RDM methodology follows five major steps, described in the following paragraphs.
Step 1: Decision framing. The RDM process starts with a decision-framing workshop in which stakeholders brainstorm and define the key factors in the analysis. These include the decision makers' goals and criteria, the potential courses of action they may choose to accomplish those goals, the uncertainties that might affect the link between actions and results, and the connections between actions, uncertainties, and goals. Once this information has been gathered, it is organized in a framework known as "XLRM" [20] [24], where:
- “X” stands for exogenous variables (factors not under the control of the decision makers)
- “L” stands for policy levers (policies that affect the system to achieve goals)
- “R” stands for relationships (relevant variables needed to correctly evaluate and benchmark policies)
- “M” stands for measures of performance (metrics, not necessarily quantitative, provided by stakeholders to evaluate policies)
The output is a set of potential robust strategies.
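To make the XLRM framing concrete, the following minimal Python sketch collects the four categories in a simple data structure. All entries (the uncertainty ranges, levers, and measures) are hypothetical placeholders chosen for illustration, not the output of an actual framing workshop.

```python
from dataclasses import dataclass, field

@dataclass
class XLRMFrame:
    """Outcome of a decision-framing workshop, organized by XLRM category."""
    exogenous_uncertainties: dict = field(default_factory=dict)  # X: factors outside decision makers' control
    policy_levers: dict = field(default_factory=dict)            # L: candidate actions or strategies
    relationships: list = field(default_factory=list)            # R: models linking X and L to outcomes
    performance_measures: list = field(default_factory=list)     # M: metrics used to evaluate policies

# Hypothetical, illustrative frame for a water-supply planning problem.
frame = XLRMFrame(
    exogenous_uncertainties={"demand_growth": (-0.01, 0.04),
                             "supply_decline": (0.0, 0.3)},
    policy_levers={"conservation_rate": [0.0, 0.05, 0.10],
                   "added_supply": [0.0, 10.0, 20.0]},
    relationships=["simple_water_balance_model"],
    performance_measures=["shortfall_frequency", "total_cost"],
)
print(frame.performance_measures)
```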
Step 2: Evaluate strategies. Following the ABP approach, RDM then uses simulation models to assess the strategies proposed in Step 1 across each of many plausible paths into the future. The generation of strategies may rely on a variety of techniques, ranging from optimization methods to public debate [25]; in practice, however, strategy evaluation usually combines several of them [26].
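The sketch below illustrates this evaluation step under strong simplifying assumptions: a toy water-balance relationship (not the CRSS model discussed later, nor any published system model) is run for every combination of candidate strategy and sampled future, producing a strategy-by-future performance table that the later steps can analyse.

```python
import numpy as np

rng = np.random.default_rng(0)
N_FUTURES = 2000

# Sampled futures (X): hypothetical demand growth and supply decline ranges.
demand_growth  = rng.uniform(-0.01, 0.04, N_FUTURES)
supply_decline = rng.uniform(0.0, 0.3, N_FUTURES)

# Candidate strategies (L): hypothetical (conservation rate, added supply).
strategies = {"do_nothing": (0.00, 0.0),
              "conserve":   (0.05, 0.0),
              "expand":     (0.00, 20.0),
              "combined":   (0.05, 20.0)}

def shortfall(growth, decline, conservation, added_supply,
              years=20, base_demand=100.0, base_supply=120.0):
    """Toy relationship (R): unmet demand after `years` in one future."""
    demand = base_demand * (1 + growth - conservation) ** years
    supply = base_supply * (1 - decline) + added_supply
    return max(demand - supply, 0.0)

# Performance measure (M): shortfall of each strategy in each future.
performance = {
    name: np.array([shortfall(g, d, c, a)
                    for g, d in zip(demand_growth, supply_decline)])
    for name, (c, a) in strategies.items()
}
for name, values in performance.items():
    print(f"{name:10s} fails in {np.mean(values > 0):5.1%} of futures")
```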
Step 3: Vulnerability analysis. Data analytics and visualization techniques are then used to search for and characterize vulnerabilities of the strategies under consideration. Specifically, statistical methods are used to find the critical variables that best distinguish the futures in which these strategies succeed from those in which they fail [27]. The output of this step is a multitude of scenarios, which are then clustered based on the identified vulnerabilities.
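The following sketch gives a deliberately simplified illustration of this "scenario discovery" step. Each sampled future is labelled as a success or failure for one hypothetical strategy, and a shallow decision tree is fitted as a simple stand-in for the PRIM-style rule-induction algorithms commonly used in RDM [27] [24]; the thresholds it reports identify the critical uncertain variables and ranges that characterize the vulnerable scenarios.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
N = 2000

# Hypothetical uncertain inputs describing each sampled future.
demand_growth  = rng.uniform(-0.01, 0.04, N)
supply_decline = rng.uniform(0.0, 0.3, N)
X = np.column_stack([demand_growth, supply_decline])

# Toy outcome of one candidate strategy: 1 = failure, 0 = success.
shortfall = 100 * (1 + demand_growth) ** 20 - 120 * (1 - supply_decline)
failure = (shortfall > 0).astype(int)

# A shallow tree yields readable threshold rules on the critical variables.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, failure)
print(export_text(tree, feature_names=["demand_growth", "supply_decline"]))
print("Relative importance of the inputs:", tree.feature_importances_)
```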
Step 4: Trade-off analysis. The scenarios generated underlie the evaluation of trade-offs between strategies. This step gives insight into what the future would look like if a given strategy were chosen.
Step 5: New futures and strategies. The results of Step 4 ultimately aid in identifying and appraising alternative solutions, allowing decision makers to narrow down the most robust strategies or to propose new, more robust ones. Sometimes the identification and appraisal process relies on expert opinion [26] [20]; at other times, optimization approaches are used instead [28]. The procedure ends either when no more robust strategies can be generated or when the strategies already identified are considered sufficiently satisfactory.
To appraise and quantify trade-offs between strategies, RDM exploits both absolute and relative performance indicators. The former are useful when specific objectives are to be met (e.g., maximisation of profit); the latter are useful when decision makers want to evaluate different strategies across different possible futures and search for the most robust ones.
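As a worked illustration of a relative performance indicator, the sketch below computes regret, i.e., how much worse a strategy performs in a given future than the best strategy available in that same future, and then selects the strategy with the smallest worst-case regret (minimax regret). The cost figures are purely hypothetical.

```python
import numpy as np

# Hypothetical cost of four strategies (rows) in five plausible futures
# (columns); lower cost is better.
strategies = ["do_nothing", "conserve", "expand", "combined"]
costs = np.array([
    [10, 12, 30, 45, 50],   # do_nothing: cheap in benign futures, costly later
    [15, 16, 22, 30, 35],   # conserve
    [25, 25, 26, 27, 28],   # expand
    [28, 28, 28, 29, 29],   # combined
])

# Regret: extra cost incurred relative to the best strategy in each future.
regret = costs - costs.min(axis=0)

# Robust choice under minimax regret: smallest worst-case regret.
worst_case = regret.max(axis=1)
best = int(np.argmin(worst_case))
print("Worst-case regret per strategy:", dict(zip(strategies, worst_case.tolist())))
print("Most robust strategy (minimax regret):", strategies[best])
```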
Example: Application to Water Planning
Introduction and background. The Colorado River is the main source of water in the southwestern United States, supplying irrigation to 4.5 million acres of agriculture as well as power and water to about 40 million people [29]. Under the terms of the 1922 Colorado River Compact, the four Upper Basin States (Colorado, New Mexico, Utah, and Wyoming) and the three Lower Basin States (Arizona, California, and Nevada) share 15 million acre-feet (maf) of water per year, 7.5 maf for each basin. The system's reliability is under increasing pressure due to deep supply uncertainty and rising demand. The Colorado River Basin Study was started in 2010 by the seven Basin States and the US Bureau of Reclamation to:
- ensure a 10-year running water flow from the Upper to the Lower Basin of at least 7.5 maf/year (Upper Basin reliability),
- maintain Lake Mead at a minimum pool elevation of 1,000 feet (Lower Basin reliability).
Reclamation evaluated DMDU methodologies in a pilot study, adopted them to support ongoing planning, and used RDM to frame the vulnerability and adaptation evaluations for the Basin Study. The study results were used to specify a robust, adaptive management plan.
Decision framing and current vulnerability analysis. As illustrated in Figure 2, XLRM was used as the tool to evaluate a total of 23,508 different futures. To arrive at this figure, the Basin Study analysed a plethora of future hydrologic conditions in combination with six demand scenarios and two operating scenarios. The Colorado River Simulation System (CRSS), Reclamation's long-term planning tool, was used to assess the system's performance across this wide range of potential futures. The analyses concentrated on two major goals: keeping Lake Mead's pool elevation above 1,000 feet and ensuring that the water flow from the Upper to the Lower Basin reaches or surpasses 7.5 maf per year, measured over a period of ten years. In addition, CRSS evaluated alternative water management options such as desalination, wastewater reuse, and municipal, industrial, and agricultural conservation.
A vulnerability analysis was carried out as part of the decision-framing step. The current water management system was simulated across thousands of different scenarios; scenario analysis approaches were then used to identify significant vulnerabilities, with CRSS modelling Basin outcomes. The Lower Basin is at risk if the long-term average streamflow falls below 15 maf and an eight-year drought occurs with average flows below 13 maf. The Basin Study also identified a vulnerability for the Upper Basin, characterized by streamflow traits that are only projected to occur in futures with declining supply.
[Figure 2]
Based on the above framework, the project team created portfolios of individual management options that could either boost supply or decrease demand for the Basin states. Stakeholders developed four portfolios, each with a unique set of investment possibilities. Portfolio B (Reliability Focus) and Portfolio C (Environmental Performance Focus) represented different approaches to managing future vulnerabilities. Portfolio D (Common Options) was defined to contain only those options included in both Portfolios B and C, while Portfolio A (Inclusive) was defined to include all options appearing in either Portfolio B or C. Options were ranked by cost-effectiveness, calculated by dividing the average yearly yield by the total project cost. CRSS then modelled these portfolios as candidate strategies by simulating the investment choices a basin manager would make under various simulated Basin conditions.
Evaluate strategies. The most robust strategy was defined as the one that minimizes regret across a plethora of conceivable future scenarios. In this case, regret indicates the additional water supply needed to keep Lake Mead at its minimum required level: the more additional supply needed, the greater the regret. It is worth noting that regret is not eliminated; throughout the simulation, it was observed that some strategies pursued large investments in the wettest futures and little to no investment in the driest ones.
Trade-off analysis. Each strategy performs differently in terms of cost and reliability measures. In futures where supply is lower, Portfolio A was the strategy with the highest likelihood of preventing water delivery vulnerability, at the expense of the highest cost. Portfolio D (a subset of A) had lower costs, but also a lower probability of preventing vulnerabilities. It was concluded that Portfolios A and B were associated with the lowest number of years with critical water levels, but the highest costs, while Portfolio C had slightly more years with critical water levels, but lower costs.
New futures and strategies. Based on statistical analysis and the primary vulnerabilities found, probability thresholds were determined that would indicate, with sufficient certainty, that a given vulnerability is likely to occur and that a consequent management action should be taken. This information was used to create an adaptive plan for the whole Basin that directs investment in additional water supply yield and demand reductions. The sample paths (dashed lines) in Figure 3 illustrate one of many potential implementation routes. If the Basin were to follow a trajectory consistent with the vulnerability "Below Historical Streamflow During Extreme Drought", the example paths illustrate how Basin managers would provide additional supply. The figure shows decisions taken by Basin managers up to 2030. It then highlights a decision point in 2030 where, by evaluating scenarios, it might turn out that the future is no longer consistent with the same vulnerability. If conditions are consistent with the "Severe Declining Supply Scenario", then Basin managers should increase the net supply by more than 3.6 maf between 2031 and 2040. The same decision approach is applied in the following years.
[Figure 3]
Main Benefits & Limitations
When used with project management frameworks, RDM can offer a number of advantages. First, RDM may help project managers make informed and data-driven decisions. It can assist managers in identifying more reliable and adaptable plans by considering a variety of uncertainties and their possible impacts. Second, RDM can help project managers identify and assess various risks associated with a project, including both known and unknown risks. This can enable managers to develop contingency plans and other risk mitigation strategies to address potential issues. Finally, RDM can facilitate stakeholder engagement and participation in the decision-making process. By considering the perspectives and preferences of various stakeholders, RDM can help managers develop solutions that are more acceptable and feasible.

However, there are also some limitations to using RDM in a project management framework. First, RDM can be resource-intensive, requiring significant data collection, analysis, and modeling. This can be particularly challenging for smaller projects or those with limited resources. Second, complexity and uncertainty can make it challenging to apply RDM effectively, particularly in cases where there are significant data gaps or limited information available. Third, models and simulations are only as good as the data and assumptions that underlie them. This can lead to errors or biases in the decision-making process.
References
- ↑ 1.0 1.1 1.2 Vincent A. W. J. Marchau, Warren E. Walker, Pieter J. T. M. Bloemen, Steven W. Popper (2019). Decision Making under Deep Uncertainty: From Theory to Practice.
- ↑ https://www.rand.org/pardee/methods-and-tools/robust-decision-making.html
- ↑ 3.0 3.1 3.2 3.3 Lempert, R. J. (2019). Robust Decision Making (RDM). In: Decision Making Under Deep Uncertainty: From Theory to Practice, pp. 23-51.
- ↑ Lempert, R. J., & Collins, M. T. (2007). Managing the risk of uncertain threshold responses: Comparison of robust, optimum, and precautionary approaches. Risk Analysis, 27(4), 1009-1026.
- ↑ Ramanathan, R., & Ganesh, L. S. (1994). Group preference aggregation methods employed in AHP: An evaluation and an intrinsic process for deriving members’ weightages. European Journal of Operational Research, 79(2), 249-265.
- ↑ Whang, J., & Han, S. (2009). Optimal R&D investment strategies under uncertainty for the development of new technologies. Journal of Business Research, 62(4), 441-447.
- ↑ Xu, Q., Zhang, L., & Zhang, X. (2013). The application of robust decision-making in the emergency evacuation of large-scale events. Safety Science, 57, 141-146.
- ↑ Donald Rumsfeld, Department of Defense News Briefing, February 12, 2002.
- ↑ "The Johari Window", http://wiki.doing-projects.org/index.php/The_Johari_Window, 27 February 2021
- ↑ Knight, F. H. (1921). Risk, uncertainty and profit. New York: Houghton Mifflin Company (republished in 2006 by Dover Publications, Inc., Mineola, N.Y.).
- ↑ Courtney, H. (2001). 20/20 foresight: Crafting strategy in an uncertain world. Boston: Harvard Business School Press.
- ↑ 12.0 12.1 Walker, W.E., Marchau, V.A.W.J., Kwakkel, J.H., 2013. Uncertainty in the framework of policy analysis, in: Walker, W.E., Thissen, W.A.H. (Eds.), Public Policy Analysis: New Developments.
- ↑ Hillier, F. S., & Lieberman, G. J. (2001). Introduction to operations research (7th ed.). New York: McGraw Hill.
- ↑ Taleb, N. N. (2007). The black swan: The impact of the highly improbable. New York: Random House.
- ↑ Schwartz, P. (1996). The art of the long view: Paths to strategic insight for yourself and your company. New York: Currency Doubleday.
- ↑ Popper, S. W., Lempert, R. J., & Bankes, S. C. (2005). Shaping the future. Scientific American, 292(4), 66–71.
- ↑ Morgan, M. G., & Henrion, M. (1990). Uncertainty: A guide to dealing with uncertainty in quantitative risk and policy analysis. Cambridge, UK: Cambridge University Press.
- ↑ Walley, P. (1991). Statistical reasoning with imprecise probabilities. London: Chapman and Hall.
- ↑ Dewar, J. A., Builder, C. H., Hix, W. M., & Levin, M. H. (1993). Assumption-based planning—A planning tool for very uncertain times. Santa Monica, CA, RAND Corporation. https://www.rand.org/pubs/monograph_reports/MR114.html. Retrieved July 20, 2018.
- ↑ 20.0 20.1 20.2 Lempert, R. J., Popper, S. W., & Bankes, S. C. (2003). Shaping the Next One Hundred Years: New Methods for Quantitative, Long-term Policy Analysis. Santa Monica, CA, RAND Corporation, MR-1626-RPC.
- ↑ Wack, P. (1985). The gentle art of reperceiving—scenarios: Uncharted waters ahead (part 1 of a two-part article). Harvard Business Review (September–October): 73–89.
- ↑ Bankes, S. C. (1993). Exploratory modeling for policy analysis. Operations Research, 41(3), 435–449.
- ↑ National Research Council (NRC) (2009). Informing decisions in a changing climate. National Academies Press.
- ↑ Jan H. Kwakkel, The Exploratory Modeling Workbench: An open source toolkit for exploratory modeling, scenario discovery, and (multi-objective) robust decision making, Environmental Modelling & Software, Volume 96, 2017, Pages 239-250, ISSN 1364-8152, https://doi.org/10.1016/j.envsoft.2017.06.054. (https://www.sciencedirect.com/science/article/pii/S1364815217301251)
- ↑ Hall, J. M., Lempert, R. J., Keller, K., Hackbarth, A., Mijere, C., & McInerney, D. (2012). Robust Climate Policies under uncertainty: A comparison of Info-Gap and RDM methods. Risk Analysis, 32(10), 1657–1672.
- ↑ 26.0 26.1 Popper, S. W., Berrebi, C., Griffin, J., Light, T., Min, E. Y., & Crane, K. (2009). Natural gas and Israel’s energy future: Near-term decisions from a strategic perspective. Santa Monica, CA, RAND Corporation, MG-927.
- ↑ Lempert, R. J. (2013). Scenarios that illuminate vulnerabilities and robust responses. Climatic Change, 117, 627–646.
- ↑ Lempert, R. J., Groves, D. G., Popper, S. W., & Bankes, S. C. (2006). A general, analytic method for generating robust strategies and narrative scenarios. Management Science, 52(4), 514–528.
- ↑ Bureau of Reclamation (2012). Colorado River Basin water supply and demand study: Study report. United States Bureau of Reclamation (Ed.). Retrieved July 11, 2018 from http://www.usbr.gov/lc/region/programs/crbstudy/finalreport/studyrpt.html.