Abstract. The growing relevance of data quality has revealed the need for adequate measurement. As time aspects are extremely important in data quality management, we propose a novel approach to assess data currency. Our metric, which is founded on probability theory, enables an objective and largely automated assessment of data liable to temporal decline. Its values are easy for business users to interpret. Moreover, the metric makes it possible to analyse the economic impact of data quality measures such as data cleansing and can therefore serve as a basis for the economic management of data quality. The approach can be applied in various domains where the currency of data is important. To illustrate the practical benefit and applicability of the novel metric, we provide an extensive real-world example: in cooperation with a major German mobile services provider, the approach was successfully applied in campaign management and led to improved decision support.

Keywords: data quality; data quality assessment; data quality metrics
Introduction

Both the benefit and the acceptance of information systems depend heavily on the quality of the data provided by these systems [1-3]. Executives and employees need high-quality data in order to perform business, innovation, and decision-making processes properly [4,5]. It is therefore not surprising that bad data quality may lead to wrong decisions and correspondingly high costs. According to an international survey, 75 percent of all respondents have already made wrong decisions due to incorrect or outdated data. In addition, they and their staff spend up to 30 percent of their working time checking the quality of the data provided [6]. Ensuring the completeness, correctness, and currency of data (properties known as data quality dimensions [7]) still remains an important problem for many companies and public institutions [8-13]. But how good is an organization's data quality? To answer this important question, well-founded and applicable metrics are needed [8,14,15]. In addition, assessing the quality of data (e.g. master data) is essential for analysing the economic effects of bad or improved data quality as well as for planning data quality measures in an economic manner. In this paper, we present a novel metric for currency, as empirical investigations reveal that time aspects are extremely