Robust Decision Making: better decisions under uncertainty
Abstract
Robust Decision Making (RDM) involves a set of ideas, methods, and tools that employ computation to facilitate better decision-making in situations of significant uncertainty. It integrates Decision Analysis, Assumption-Based Planning, Scenario Analysis, and Exploratory Modelling to simulate multiple possible future outcomes, with the aim of identifying policy-relevant scenarios and robust adaptive strategies. These RDM analytic tools are frequently embedded in a decision support process referred to as "deliberation with analysis," which fosters learning and agreement among stakeholders [1]. This article reviews the current state of the art of RDM in project management, including its key principles and practices, such as the importance of data gathering and analysis, the consideration of different options, and the involvement of stakeholders. Furthermore, this article examines the benefits, challenges, and limitations of RDM in project management and provides insights into future directions for research in this area. Its aim is to provide project managers with a deeper understanding of the principles and practices of RDM, along with insights into, and an example of, how to correctly implement RDM in project management. Ultimately, this article aims to contribute to the development of more effective and efficient approaches to project management and decision making by promoting the use of RDM in project management.
Conceptualising Robust Decision Making at times of Uncertainty
Origins
Robust Decision Making (RDM) emerged in the 1980s, when analysts of the RAND Corporation, a California-based think tank affiliated with the U.S. Government, developed a framework to evaluate the effectiveness of nuclear weapon systems [2] [3]. Designed to mitigate the uncertainty and ambiguity experienced by U.S. Government officials involved in the planning and implementation of nuclear deterrence strategies, RDM included simulation techniques, sensitivity analysis, and real options analysis. In the 1990s and 2000s, RDM received increasing interest from private companies seeking new project management techniques applicable to a wide range of industries, including construction, software development, and environmental management. Today, RDM is an established approach in project management, recognized for its ability to help project managers make well-informed and timely decisions under pressure and in times of uncertainty.
Literature review
According to former United States Secretary of Defence Donald Rumsfeld, there are different types of knowledge: known knowns, known unknowns, and unknown unknowns. Known knowns refer to things that we know for sure. Known unknowns refer to things that we know we do not know. The most challenging category, however, is the unknown unknowns: things that we do not know we do not know [4] [5]. Knight further elaborates on this concept and proposes a distinction between risk and uncertainty. The former indicates situations in which the unknown can be measured (through probabilities) and, therefore, controlled. The latter indicates situations in which the unknown cannot be quantified and cannot, therefore, be measured [6]. Based on Knight's distinction, academics have categorised the various levels of uncertainty in decision-making, ranging from complete certainty to total ignorance [7] [8] [3]. These levels are defined by the knowledge possessed about various aspects of a problem, including the future state of the world, the model of the relevant system, the outcomes from the system, and the weights that the various stakeholders will put on the outcomes. The four intermediate levels of uncertainty are defined as Level 1, where historical data can be used as predictors of the future [9]; Level 2, where probability and statistics can be used to solve problems; Level 3, where plausible future worlds are specified through scenario analysis; and Level 4, where the decision maker only knows that nothing can be known, due to unpredictable events, lack of knowledge, or unavailability of data [10] [11]. When dealing with issues characterised by the greatest level of uncertainty (Level 4), more sophisticated and in-depth data gathering is often unhelpful. The decision-making process in such situations is defined as decision making under deep uncertainty (DMDU) [3].
Instead of a "predict and act" paradigm, which attempts to anticipate potential future problems and act on that prediction, DMDU approaches are based on a "monitor and adapt" paradigm, which places more emphasis on preparing for unknown occurrences and adjusting accordingly [8]. In order to make decisions for unpredictable occurrences and long-term changes, this "monitor and adapt" paradigm "explicitly identifies the deep uncertainty surrounding decision making and underlines the necessity to take this deep uncertainty into consideration" ([1], p. 11). This article explores RDM under uncertainty, an approach belonging to the realm of DMDU methodologies.
According to the "monitor and adapt" paradigm, RDM refers to a collection of ideas, procedures, and supportive technologies intended to rethink the function of quantitative models and data in guiding choices in situations affected by uncertainty. Models and data become tools for systematically exploring the consequences of assumptions, expanding the range of futures considered, creating innovative new responses to threats and opportunities, and sorting through a variety of scenarios, options, objectives, and problem framings to identify the most crucial trade-offs confronting decision makers. This contrasts with the traditional view of models as tools for prediction and the subsequent prescriptive ranking of decision options. This means that, rather than improving forecasts, models and data are used to help decision makers take robust decisions [12]. As argued by Marchau et al., robustness of decisions is, therefore, achieved by iterating the solution to a problem several times while stress-testing the suggested decisions against a wide variety of potential scenarios. In doing so, RDM strengthens the decision-making process under deep uncertainty [3].
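The robustness criterion described above can be illustrated with a minimal sketch. Instead of optimizing expected performance under one assumed future, a robust choice minimizes the worst-case regret across many plausible futures. The strategies, futures, and payoff numbers below are purely illustrative, not drawn from the RDM literature:

```python
# Illustrative payoffs: performance of each strategy in each plausible future.
# No probabilities are assigned to the futures, in keeping with deep uncertainty.
payoffs = {
    "aggressive":   {"boom": 100, "stable": 60, "crisis": -40},
    "conservative": {"boom": 40,  "stable": 50, "crisis": 30},
    "adaptive":     {"boom": 70,  "stable": 55, "crisis": 20},
}

futures = ["boom", "stable", "crisis"]

# Regret of a strategy in a future: gap to the best strategy in that future.
best_in_future = {f: max(p[f] for p in payoffs.values()) for f in futures}
max_regret = {
    s: max(best_in_future[f] - p[f] for f in futures)
    for s, p in payoffs.items()
}

# A robust choice minimizes the maximum regret over all futures.
robust = min(max_regret, key=max_regret.get)
print(max_regret)          # {'aggressive': 70, 'conservative': 60, 'adaptive': 30}
print("robust choice:", robust)  # robust choice: adaptive
```

Note that the "aggressive" strategy is best in the boom future yet performs worst under this criterion, which is exactly the kind of trade-off RDM aims to surface.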
Although scholars have widely explored the practical applications of RDM in project management, the theoretical support for the application of this framework in project management practices remains largely unexplored. The remainder of the article will, therefore, concentrate on the fundamental principles of RDM, guide the reader through the methodology, give an illustration of how RDM has been successfully used in a large-scale project, and discuss the benefits and limitations of the approach.
Foundations of Robust Decision Making
RDM finds its grounds in four key notions, from which it both takes some legacy, and offers a fresh expression. These are Decision Analysis, Assumption-Based Planning, Scenario Analysis, and Exploratory Modelling.
Decision Analysis (DA) --- The discipline of DA provides a framework for creating and utilizing well-structured decision aids. RDM exploits this framework by focusing specifically on finding trade-offs and describing vulnerabilities to create robust decisions based on stress testing of probable future routes. Both DA and RDM seek to improve the decision-making process by being clear about goals, utilizing the best available information, carefully weighing trade-offs, and adhering to established standards and conventions to ensure legitimacy for all parties involved. However, while DA seeks optimality through utility frameworks and assumptions [13], RDM seeks robustness, treating uncertainty as inescapable and probabilities as imprecise [14], further highlighting trade-offs between plausible options.
Assumption-Based Planning (ABP) --- By expanding awareness of how and why things could fail, RDM uses the ideas of stress testing and red teaming to lessen the harmful impacts of overconfidence in current plans and processes [15]. This approach was first implemented in the so-called Assumption-Based Planning (ABP) framework. Starting with a written version of an organization's plans, ABP finds the explicit and implicit assumptions made during the formulation of that plan that, if inaccurate, would result in failure. These assumptions are identified by decision-makers, who then create backup plans and "hedging" strategies to be used in case of necessity. ABP takes, then, into account "signposts", which refers to monitoring patterns and events to spot any faltering presumptions [1].
Scenario Analysis (SA) --- In order to deal with deep uncertainty, RDM builds upon the idea of SA [16]. Scenarios are defined as collections of potential future occurrences that illustrate various worldviews without explicitly assigning a relative likelihood score [17]. They are frequently envisioned in deliberative processes involving stakeholders and do not include probabilities of occurrence. This is done with the objective of broadening the range of scenarios taken into consideration and of making a wide variety of futures accessible to audiences. In incorporating analytical techniques borrowed from SA, RDM distills knowledge about the future into a selected range of potential situations, a technique that assists decision-makers in envisioning future risks and better navigating strategic environments of deep uncertainty.
Exploratory Modeling (EM) --- According to Bankes, Exploratory Modeling (EM) is one of the most appropriate tools for integrating DA, ABP, and SA in RDM [18]. Without prioritizing one set of assumptions over another, EM factors a wide range of assumptions into a limited number of results. In other words, EM is especially beneficial when a single model cannot be validated because of a lack of evidence, insufficient or conflicting ideas, or unknown futures. Therefore, by lowering the demands for analytic tractability on the models employed in the study, EM offers a quantitative framework for stress testing and scenario analysis and allows the exploration of futures and strategies. As EM favours no base case or single future as an anchor point, it allows for genuinely global, large-N studies in support of the more qualitative methods ingrained in SA approaches.
Application
Theoretical framework --- RDM is a learning process based on what the scholarship defines as "deliberation with analysis". The framework requires that the decision-making parties discuss their goals and alternatives, which are based on assessments provided by analysts, who imagine scenarios and policy options underpinned by their analysis of the available quantitative data. This is especially recommended in settings where there are a variety of decision-makers, who must make decisions in ever-changing environments, and whose objectives may change as a result of their collaboration with others [19]. As illustrated in Figure 1, the RDM methodology follows five major steps, described in the paragraphs below.
[Figure 1: The five steps of the RDM methodology]
Step 1: Decision Framing. The RDM process starts with a decision framing workshop in which stakeholders brainstorm and define the key factors in the analysis. These include decision-makers' goals and criteria, the potential courses of action they may choose to accomplish those goals, the uncertainties that might impact the link between actions and results, and the connections between actions, uncertainties, and goals. Once gathered, this information is put into a framework known as "XLRM" [16] [20], where:
- “X” stands for exogenous variables (factors not under the control of the decision makers)
- “L” stands for policy levers (policies that affect the system to achieve goals)
- “R” stands for relationships (relevant variables needed to correctly evaluate and benchmark policies)
- “M” stands for measures of performance (metrics, not necessarily quantitative, provided by stakeholders to evaluate policies)
The output is a set of potential robust strategies.
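The XLRM framing can be captured in a simple data structure. The sketch below is illustrative only; the field contents are hypothetical examples for a construction project, not taken from the cited sources:

```python
from dataclasses import dataclass

@dataclass
class XLRMFrame:
    """Container for the four XLRM categories agreed in the framing workshop."""
    exogenous: list       # X - factors outside the decision makers' control
    levers: list          # L - policy levers that affect the system
    relationships: list   # R - model relations linking levers to outcomes
    measures: list        # M - performance metrics used to compare policies

# Hypothetical framing for a construction project.
frame = XLRMFrame(
    exogenous=["material costs", "weather delays", "regulatory changes"],
    levers=["contingency budget", "phased delivery", "flexible contracts"],
    relationships=["cost model", "schedule simulation"],
    measures=["total cost", "completion time", "stakeholder satisfaction"],
)
print(frame.levers)
```

Keeping the four categories explicit in one structure makes it easy to hand the workshop output to the simulation step that follows.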
Step 2: Evaluate strategies. Following the ABP approach, RDM exploits simulation models to assess the strategies proposed in Step 1 in each of many plausible paths into the future. This process of generating strategies may use a variety of techniques, spanning from optimization methods to public debate [21]. It is commonly observed, however, that strategy evaluation usually combines them all [22].
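As a minimal sketch of this step, the same simple model can be run for each candidate strategy across many sampled futures instead of one predicted future. The toy cost model, the contingency levels, and the parameter ranges below are all hypothetical:

```python
import random

random.seed(0)

def project_cost(contingency, inflation, delay_months):
    """Toy cost model: a base cost grows with inflation and schedule delays;
    the contingency budget absorbs part of the overrun."""
    base = 100.0
    overrun = base * inflation + 2.0 * delay_months
    return base + max(0.0, overrun - contingency)

# Candidate strategies: how large a contingency budget to hold.
strategies = {"no_buffer": 0.0, "small_buffer": 5.0, "large_buffer": 15.0}

# Sample many plausible futures (inflation rate, months of delay)
# rather than committing to a single forecast.
futures = [(random.uniform(0.0, 0.10), random.uniform(0, 12))
           for _ in range(500)]

for name, contingency in strategies.items():
    costs = [project_cost(contingency, infl, delay) for infl, delay in futures]
    print(f"{name:12s} mean={sum(costs) / len(costs):6.1f}  max={max(costs):6.1f}")
```

The output of this step is, for each strategy, a distribution of outcomes over the sampled futures, which the vulnerability analysis in Step 3 then mines for patterns.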
Step 3: Vulnerability analysis. Data analytics and visualization techniques are then used to search for and describe vulnerabilities of the strategies under consideration. Specifically, statistical methods are used to find the critical variables that best distinguish futures in which these strategies succeed or fail [23]. The output of this step is a multitude of scenarios, which are then clustered based on the identified vulnerabilities.
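A minimal sketch of this idea follows, using a crude comparison of means rather than the scenario-discovery algorithms (such as PRIM) usually employed in practice; the uncertain drivers and the failure rule are hypothetical:

```python
import random

random.seed(1)

# Hypothetical futures: each is a draw of two uncertain drivers.
futures = [
    {"demand_growth": random.uniform(-0.05, 0.10),
     "cost_inflation": random.uniform(0.00, 0.08)}
    for _ in range(1000)
]

def fails(f):
    """Illustrative failure rule for a candidate strategy."""
    return f["cost_inflation"] > 0.05 and f["demand_growth"] < 0.02

failures = [f for f in futures if fails(f)]

# Crude scenario discovery: compare each driver's mean in failing futures
# vs. all futures to see which variables characterize the failure region.
for var in ("demand_growth", "cost_inflation"):
    all_mean = sum(f[var] for f in futures) / len(futures)
    fail_mean = sum(f[var] for f in failures) / len(failures)
    print(f"{var}: overall mean {all_mean:.3f}, in failures {fail_mean:.3f}")
```

Drivers whose distribution shifts markedly in the failing futures are the candidate vulnerabilities around which the scenarios are then clustered.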
Step 4: Trade-off analysis. The scenarios generated underlie the evaluation of trade-offs between strategies. This step gives insights into how the future would look if a given strategy were chosen.
Step 5: New futures and strategies. The results from Step 4 are necessary to identify and appraise alternative solutions, allowing the analysis to focus on the most robust ones. Sometimes, the identification and appraisal process relies on experts' opinions [22].