This article investigates digital and non-digital traces reused beyond their context of creation. A central idea of this article is that no (reused) dataset is perfect; data quality assessment therefore becomes essential to determine whether a given dataset is “good enough” to fulfill users’ goals. Biases, a possible source of discrimination, have become a relevant data challenge. Consequently, it is worth analyzing whether quality assessment indicators provide information on potential biases in a dataset. We reflect on the relationship between quality and bias through two examples representing opposing sides regarding data access. First, the European Union open data portal fosters the democratization of data and expects users to manipulate the databases directly to perform their analyses. Second, online behavioral advertising systems offer individualized promotional services but do not share the datasets supporting their design. Quality assessment is socially constructed: there is no universal definition but rather a set of quality dimensions, which may change with each professional context. From the users’ perspective, trust/credibility stands out as a relevant quality dimension in both analyzed cases. Results show that quality indicators (whatever they are) provide limited information on potential biases. We suggest that data literacy is urgently needed among both open data users and clients of behavioral advertising systems. Notably, users must (be able to) understand the limitations of datasets for an optimal and bias-free interpretation of results and decision-making.