Data Quality Management


Abstract

Data quality management (DQM) serves the objective of continuously improving the quality of data relevant to an organisation, program or project[1]. It is important to understand that the end goal of DQM is not simply to improve data quality for the sake of having high-quality data, but rather to achieve desired outcomes that rely on high-quality data[2]. DQM is the management of people, processes, technology and data through coordinated activities aimed at directing and controlling a program or project in terms of data quality[3].

Data quality has a significant impact on both the efficiency and effectiveness of organisations[4]. As part of the digital transformation, data has become more readily available and more important than ever before. Organisations perform data analytics to leverage key resources and optimise processes in order to gain a competitive advantage. As such, data is becoming increasingly valuable to program and project managers who drive decision-making based on data insights. However, if the data quality is poor, managers risk making misguided decisions based on unreliable data. It is therefore imperative that a proper DQM system is in place to ensure decisions are driven by high-quality data. This article explores the fundamentals behind DQM with reference to industry best practices and ISO guidelines.

Overview

Data Quality

An important element of DQM is understanding the dimensions and complexity of the term data quality. As per the ISO 8000-2 guidelines, data is defined as the "reinterpretable representation of information in a formalised manner suitable for communication, interpretation, or processing", while data quality is defined as the "degree to which a set of inherent characteristics of data fulfils requirements"[5]. Data quality is a multifaceted concept spanning several dimensions, which makes it difficult to measure[6]. Data quality dimensions discussed in the literature include accuracy, completeness, consistency, integrity, representation, timeliness, uniqueness and validity. The ISO 8000-8 guidelines divide data quality into three categories based on semiotic theory. Semiotic theory concerns the use of symbols, such as letters and numbers, to communicate information[7]. The three semiotic categories relevant to data quality are syntactic quality, semantic quality and pragmatic quality[8]. As illustrated in figure 2.A, these categories provide a basis for measuring data quality and are important terms to recognise before establishing a DQM program.

Syntactic Data Quality

Figure 2.A: Overall data quality, inspired by Moody et al. (n.d.), Evaluating the quality of process models: empirical analysis of a quality framework
Figure 2.B: Semiotic categories for data quality, inspired by Shanks, G. and Darke, P. (1998), Understanding data quality in a data warehouse: a semiotic approach

The goal of syntactic data quality is consistency. Consistency concerns the use of consistent syntax and symbolic representation for particular data. Syntactic quality can be measured as the percentage of inconsistencies in data values. Consistency is typically enforced through a set of rules governing the syntax for data input[9].
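As a minimal illustration of how syntactic quality could be measured in practice, the sketch below checks a set of values against a single hypothetical syntax rule (an ISO 8601 date pattern) and reports the percentage of inconsistent values. The rule, the field name and the sample data are illustrative assumptions rather than anything prescribed by ISO 8000.

```python
import re

# Hypothetical syntax rule: dates must follow the ISO 8601 pattern YYYY-MM-DD.
DATE_RULE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def syntactic_inconsistency_rate(values, rule=DATE_RULE):
    """Return the percentage of values that violate the given syntax rule."""
    if not values:
        return 0.0
    violations = sum(1 for v in values if not rule.match(str(v)))
    return 100.0 * violations / len(values)

# Mixed date representations in an assumed "delivery_date" field.
dates = ["2018-02-18", "18/02/2018", "2018-02-19", "Feb 20, 2018"]
print(f"Syntactic inconsistency: {syntactic_inconsistency_rate(dates):.1f}%")  # 50.0%
```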

Semantic Data Quality

The goal of semantic data quality is accuracy and comprehensiveness. Accuracy can be defined as the degree to which a data element conforms to the truth of the real world. Comprehensiveness can be understood as the extent to which relevant states of the real-world system are represented in a data warehouse[10]. Properties that fall under semantic quality are completeness, unambiguity, meaningfulness and correctness[11].
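As a rough illustration of these two goals, the sketch below compares recorded values against a trusted real-world reference to estimate accuracy, and checks how many real-world entities are represented at all to estimate comprehensiveness. The reference data, identifiers and values are hypothetical.

```python
# Hypothetical "real-world truth" and recorded data, keyed by entity id.
reference = {"P-001": "Copenhagen", "P-002": "Aarhus", "P-003": "Odense", "P-004": "Aalborg"}
recorded  = {"P-001": "Copenhagen", "P-002": "Arhus",  "P-003": "Odense"}

def semantic_accuracy(recorded, reference):
    """Share of recorded values that agree with the real-world reference."""
    matched = [key for key in recorded if key in reference]
    if not matched:
        return 0.0
    correct = sum(1 for key in matched if recorded[key] == reference[key])
    return correct / len(matched)

def completeness(recorded, reference):
    """Share of real-world entities that are represented in the data at all."""
    return len(set(recorded) & set(reference)) / len(reference)

print(f"Accuracy:     {semantic_accuracy(recorded, reference):.0%}")  # 67%
print(f"Completeness: {completeness(recorded, reference):.0%}")       # 75%
```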

Pragmatic Data Quality

The goal of pragmatic data quality concerns usability and usefulness. Usability refers to how easily a stakeholder can access and interact with the data effectively, while usefulness refers to the ability of the data to support a stakeholder in accomplishing tasks and to aid decision-making. Data may be more useful or usable for some stakeholders than others, depending on their ability to interpret the data and the context of their tasks. Pragmatic data quality involves the properties of timeliness, conciseness, accessibility, reputability and understandability[12].
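Pragmatic quality is harder to quantify because it depends on the stakeholder and the task, but individual properties can still be approximated. The sketch below estimates timeliness as the share of records refreshed within a stakeholder-defined freshness window; the 30-day window and the timestamps are illustrative assumptions.

```python
from datetime import datetime, timedelta

def timeliness(last_updated, now, max_age=timedelta(days=30)):
    """Share of records refreshed within the stakeholder's freshness window."""
    if not last_updated:
        return 0.0
    fresh = sum(1 for ts in last_updated if now - ts <= max_age)
    return fresh / len(last_updated)

# Illustrative last-update timestamps for three records.
now = datetime(2018, 2, 18)
updates = [datetime(2018, 2, 10), datetime(2018, 1, 5), datetime(2017, 11, 20)]
print(f"Timeliness: {timeliness(updates, now):.0%}")  # 33%
```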

Figure 2.B summarises the categories addressed above, listing each category's goal, its properties and example methods for measuring it empirically. There are various empirical models that build on semiotic theory for categorising data quality, some of which use different regression and weighting models. These can be studied further in Moody et al.'s paper Evaluating the quality of process models: empirical analysis of a quality framework.

Fundamental Principles of a Data Quality Management Program

Figure 3.A: Functions of a data governance program, inspired by Knowledgent: Building a Successful DQM Program

Before investigating the principles that make up a DQM program, it is important to recognise that DQM often functions as one of many building blocks of a larger data governance program[13]. Figure 3.A highlights the various functions that make up a data governance program: DQM, data architecture, metadata management, master data management, data distribution, data security, and information lifecycle management. DQM does not itself cover these other building blocks of data governance; however, there is often a strong interplay between the different functions. The ISO 8000-61 guidelines define the three fundamental principles of a DQM program as the process approach, continuous improvement, and involvement of people. These three principles act as pillars in building and managing a program for the assurance of data quality.

ISO 8000-61 Principles of a DQM Program

Figure 3.B: The fundamental principles of DQM, inspired by ISO 8000-61
Process Approach

The first fundamental principle is the process approach, which concerns defining and operating the processes that use, create and update relevant data[14]. This principle states that a successful DQM program manages its key activities by defining and operating recurring and reliable processes.

Continuous Improvement

The second fundamental principle is continuous improvement, which establishes the idea that data quality must be constantly improved through effective measurement, remediation and corrective action on data nonconformities. As stated by the ISO 8000-61 guidelines, continuous improvement depends on "analysing, tracing and removing the root causes of poor data quality", which may require adjustments to faulty processes[15].
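To illustrate the idea of tracing root causes, the sketch below groups logged data nonconformities by their suspected cause and ranks them, so that remediation effort can be directed at the processes producing the most defects (a simple Pareto-style view). The nonconformity log, field names and cause labels are hypothetical.

```python
from collections import Counter

# Hypothetical log of data nonconformities, each with a suspected root cause.
nonconformities = [
    {"field": "delivery_date", "cause": "free-text date entry"},
    {"field": "delivery_date", "cause": "free-text date entry"},
    {"field": "customer_city", "cause": "no validation at data entry"},
    {"field": "unit_price",    "cause": "manual spreadsheet import"},
    {"field": "delivery_date", "cause": "free-text date entry"},
]

def rank_root_causes(issues):
    """Count nonconformities per suspected root cause, most frequent first."""
    return Counter(issue["cause"] for issue in issues).most_common()

for cause, count in rank_root_causes(nonconformities):
    print(f"{count:3d}  {cause}")
```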

Involvement of People

The third fundamental principle highlights the importance of people to a DQM program. This principle states that different responsibilities are allocated to individuals at different levels within a program or organisation. Top-level management provides the necessary and sufficient resources and directs the program towards achieving specific data quality goals. Data specialists perform activities such as the implementation of processes, intervention, control and the embedding of future processes for continuous improvement. End users, in turn, perform direct data processing activities such as data input and analysis. End users typically have the greatest direct influence on actual data quality, as they are the individuals in closest contact with the data itself[16].

ISO 8000-61 Framework for the DQM Process

The Basic Structure of the DQM Process

Figure 4.A: Basic DQM structure, inspired by ISO 8000-61

The basic structure of the DQM process is illustrated in figure 4.A. The structure consists of three overarching and interlinked processes: implementation, data-related support, and resource provision.

Implementation Process: this stage is aimed at achieving continual improvement of data quality using a systematic plan-do-check-act cycle (a schematic sketch of this cycle follows the list below).

Data-Related Support: this stage enables the implementation process by providing information and technology related to data management. As figure 4.A highlights, it provides an input to the implementation stage.

Resource Provision: this stage involves training the individuals performing data-related tasks and providing sufficient resources to effectively and efficiently manage the implementation and data-related support processes.
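The plan-do-check-act cycle within the implementation process can be read as a recurring loop: plan data quality targets, do the data processing and measure actual quality, check the results against the targets, and act on any shortfall. The sketch below is a schematic of that loop; the targets, the stubbed measurement and the remediation step are assumptions for illustration and are not prescribed by ISO 8000-61.

```python
# Schematic plan-do-check-act loop for data quality; all values are placeholders.

def plan():
    """Plan: define target levels for selected data quality dimensions."""
    return {"completeness": 0.95, "syntactic_consistency": 0.98}

def do():
    """Do: operate the data processes and measure actual quality (stubbed here)."""
    return {"completeness": 0.91, "syntactic_consistency": 0.99}

def check(targets, measured):
    """Check: identify dimensions where measured quality falls short of target."""
    return {dim: measured[dim] - target
            for dim, target in targets.items() if measured[dim] < target}

def act(shortfalls):
    """Act: trigger remediation and root-cause analysis for failing dimensions."""
    for dim, gap in shortfalls.items():
        print(f"Remediate '{dim}': {gap:+.2f} relative to target")

act(check(plan(), do()))  # Remediate 'completeness': -0.04 relative to target
```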

Detailed Structure of the DQM Process

Implementation

Plan-Do-Check-Act

Data-Related Support

Resource Provision

Glossary

DQM: Data Quality Management
ISO: International Organization for Standardization

Bibliography

Batini, C. and Scannapieco, M. (2006): Data Quality: Concepts, Methodologies and Techniques. Berlin: Springer. This book explores various concepts, methodologies and techniques involving data quality processes. It provides a solid introduction to the topic of data quality.

ISO 8000-61. (2016): Data Quality - Part 61: Data Quality Management: Process Reference Model. International Organisation for Standardisation. Ref: ISO 8000-61:2016(E). This is the international standard for data quality management processes. It provides an excellent and concise overview of the industry best practices regarding DQM processes, explaining the fundamental principles behind DQM and elaborating on process procedures through a framework guide.

Knowledgent (2014): Building a Successful DQM Program. Knowledgent White Paper Series. This paper provides an introduction to DQM within enterprise information management, explaining the basic concepts behind DQM and also explaining the data quality cycle framework.

Shanks, G. and Darke, P. (1998): Understanding Data Quality in a Data Warehouse: A Semiotic Approach. Massachusetts, USA: University of Massachusetts Lowell, pp. 292-309. This paper provides an overview of data quality measures using a semiotic approach, explaining each semiotic level and how it relates to data quality. The semiotic theory discussed is similar to the one later adopted by the ISO 8000-8 standard for data quality.

References

  1. Knowledgent (2014): Building a Successful Data Quality Management Program, p. 3.
  2. Knowledgent (2014): Building a Successful Data Quality Management Program, p. 3.
  3. ISO 8000-2 (2017): Data Quality - Part 2: Vocabulary, ISO.
  4. Batini, C. and Scannapieco, M. (2006): Data Quality: Concepts, Methodologies and Techniques, p. 2.
  5. ISO 8000-2 (2017): Data Quality - Part 2: Vocabulary, ISO.
  6. Batini, C. and Scannapieco, M. (2006): Data Quality: Concepts, Methodologies and Techniques, p. 6.
  7. Shanks, G. and Darke, P. (1998): Understanding Data Quality in a Data Warehouse: A Semiotic Approach, p. 298.
  8. ISO 8000-8 (2015): Data Quality - Part 8: Information and Data Quality: Concepts and Measuring, ISO.
  9. Shanks, G. and Darke, P. (1998): Understanding Data Quality in a Data Warehouse: A Semiotic Approach, p. 303.
  10. Shanks, G. and Darke, P. (1998): Understanding Data Quality in a Data Warehouse: A Semiotic Approach, p. 301.
  11. Shanks, G. and Darke, P. (1998): Understanding Data Quality in a Data Warehouse: A Semiotic Approach, p. 303.
  12. Shanks, G. and Darke, P. (1998): Understanding Data Quality in a Data Warehouse: A Semiotic Approach, p. 302.
  13. Knowledgent (2014): Building a Successful Data Quality Management Program, p. 3.
  14. ISO 8000-61 (2016): Data Quality - Part 61: Data Quality Management: Process Reference Model, p. 2, ISO.
  15. ISO 8000-61 (2016): Data Quality - Part 61: Data Quality Management: Process Reference Model, p. 2, ISO.
  16. ISO 8000-61 (2016): Data Quality - Part 61: Data Quality Management: Process Reference Model, p. 2, ISO.