Targeting of Data Quality Monitoring: Which Subset of Business Master Data is Critical?
Penttinen-Santos Da Silva, Tanja (2014)
HAAGA-HELIA ammattikorkeakoulu
2014
All rights reserved
Permanent address of the publication:
https://urn.fi/URN:NBN:fi:amk-2014102714957
Abstract
This thesis develops a framework, not previously available, for identifying the data elements most critical for data quality monitoring, together with a construct to represent these data. It was commissioned by an information management company to support the setup of its data quality monitoring tool in customer environments.
The thesis hypothesis is that aiming for 100% data quality across all of an organization's data is not viable. Instead, efforts should target the subset of data that is most critical and offers the greatest benefit when its quality is good. The thesis proposes data quality monitoring as the means for data quality improvement. The main objective is to define a generic subset of business master data that is most critical to monitor for most organizations, regardless of industry.
The introduction and theoretical parts of the thesis give the reader an overview of the importance of data and data quality and introduce data quality monitoring. The main link between the theoretical and empirical parts is the chapter explaining the connection between data quality and business processes, which also introduces the logic used for determining in which business/data intersections critical data lie.
The empirical research was conducted in the summer of 2014. A preliminary data quality monitoring targeting construct was built based on the theoretical research and on the commissioning party representatives' years of experience with data quality issues faced by organizations. Thematic interviews were conducted with data quality experts to verify or challenge the preliminary construct, and the interview results were analysed to refine it.
In conclusion, the final data quality monitoring targeting construct is introduced, together with recommendations for possible further development. The construct will serve as a basis for setting up organization-specific data quality monitoring. The conclusions also present additional approaches for identifying critical data.