Background: Precision medicine (PM) is an emerging approach to individualized care. It aims to help physicians better understand and predict the needs of their patients, and to adopt the most suitable treatment in a timely manner, by promoting the sharing of health data and the implementation of learning healthcare systems. Alongside its promises, PM also entails the risk of exacerbating healthcare inequalities, in particular between ethnoracial groups. One often-neglected underlying reason why this might happen is the impact of structural racism on PM initiatives. Raising awareness of how structural racism can influence PM initiatives is paramount to prevent PM from reproducing pre-existing health inequalities between ethnoracial groups and from contributing to minority groups' loss of trust in healthcare. Main body: We analyse three nodes of a process flow where structural racism can affect the implementation of PM. These are: (i) the collection of biased health data during the initial encounter of minority groups with the healthcare system and researchers, (ii) the integration of biased health data for minority groups into PM initiatives and (iii) the influence of structural racism on the deliverables of PM initiatives for minority groups. We underscore that underappreciation of structural racism by stakeholders involved in the PM ecosystem can be at odds with the ambition of ensuring social and racial justice. Potential specific actions related to the analysed nodes are then formulated to help ensure that PM truly adheres to the goal of leaving no one behind, as endorsed by the member states of the United Nations in the 2030 Agenda for Sustainable Development. Conclusion: Structural racism has been entrenched in our societies for centuries, and it would be naïve to believe that its impacts will not spill over into the era of PM. PM initiatives need to pay special attention to the discriminatory and harmful impacts that structural racism could have on the minority groups involved in their projects. Only by acknowledging and discussing the existence of implicit racial biases and trust issues in healthcare and research can proper interventions to remedy them be implemented.
Introduction: The digitalization of medicine has led to considerable growth in heterogeneous health datasets, which could improve healthcare research if integrated into the clinical life cycle. This process requires, amongst other things, the harmonization of these datasets, which is a prerequisite for improving their quality, re-usability and interoperability. However, a wide range of factors either hinder or favor the harmonized collection, sharing and linkage of health data. Objective: This systematic review aims to identify barriers and facilitators to health data harmonization, including data sharing and linkage, through a comparative analysis of studies from Denmark and Switzerland. Methods: Publications from PubMed, Web of Science, EMBASE and CINAHL involving cross-institutional or cross-border collection, sharing or linkage of health data from Denmark or Switzerland were searched to identify the reported barriers and facilitators to data harmonization. Results: Of the 345 projects included, 240 were single-country and 105 were multinational studies. Among national projects, Swiss studies reported on average more barriers and facilitators than Danish studies. Barriers and facilitators of a technical nature were the most frequently reported. Conclusion: This systematic review gathered evidence from Denmark and Switzerland on barriers and facilitators to data harmonization, sharing and linkage. Barriers and facilitators were closely interrelated with the national context in which projects were carried out. Structural changes, such as legislation implemented at the national level, were mirrored in the projects, underlining the impact of national strategies in the field of health data. Our findings also suggest that greater openness and clarity in the reporting of both barriers and facilitators to data harmonization are key to promoting the successful management of new projects using health data and the implementation of proper policies in this field. Our findings are thus meaningful beyond these two countries.
Background: Digital Pills (DP) are an innovative drug-device technology that combines traditional medications with a monitoring system automatically recording data on medication adherence as well as patients' physiological data. Although DP are a promising innovation in the field of digital medicine, their use has also raised a number of ethical concerns. These concerns, however, have been expressed principally from a theoretical perspective, whereas an ethical analysis with a more empirically oriented approach is lacking. There is also a lack of clarity about the empirical evidence available on the application of this innovative digital medicine technology. Methods: To map the studies in which DP have been tested on patients and to discuss the ethically relevant issues evident therein, we performed a scoping review of the empirical literature concerning DP. Results: Our search identified 18 papers reporting on studies in which DP were tested on patients. These included studies with different designs, involving patients with a variety of conditions. A number of ethically relevant issues were evident in the empirical literature. At the patient level, the ethical issues include users' interaction with DP, the personal sphere, health-related risks and patients' benefits. At the provider level, ethically relevant issues touch upon the doctor-patient relationship and the question of data access. At the societal level, they concern the benefits to society, the quality of evidence and the device-medicine dichotomy. Conclusions: We conclude that the evidence concerning DP is not robust and that more research should be performed, and study results made available, to evaluate this digital medicine technology. Moreover, our analysis of the ethically relevant aspects of the empirical literature underscores that there are concrete and specific open questions to be tackled in the ethical discussion about this new technological solution.
As accessing, collecting, and storing personal information become increasingly easy, the secondary use of data has the potential to make healthcare research more cost- and time-effective. The widespread reuse of data, however, raises important ethical and policy issues, especially because of the sensitive nature of genetic and health-related information. Regulation is thus crucial to determine the conditions under which data can be reused. In this respect, the question emerges whether it is appropriate to endorse genetic exceptionalism and grant genetic data an exceptional status with respect to secondary use requirements. Using Swiss law as a case study, it is argued that genetic exceptionalism in secondary use regulation is not justified, for three reasons. First, although genetic data have particular features, other non-genetic data can also be extremely sensitive. Second, having different regulatory requirements depending on the nature of the data hinders the creation of comprehensible consent forms. Third, empirical evidence about public preferences concerning data reuse suggests that exceptional protection for genetic data does not correspond to how the public actually regards the sensitivity of different types of data.
Advances in information technology are changing public health at an unprecedented rate. Participatory surveillance systems contribute to public health by actively engaging digital (eg, Web-based) communities of volunteer citizens to report symptoms and other pertinent information on public health threats, and by empowering individuals to respond to them promptly. However, this digital model raises ethical issues on top of those inherent in traditional forms of public health surveillance. Research ethics are undergoing significant changes in the digital era, where not only participants' physical and psychological well-being but also the protection of their sensitive data have to be considered. In this paper, the digital platform of Influenzanet is used as a case study to illustrate the ethical challenges posed to participatory surveillance systems that use digital platforms and mobile apps. These challenges include the implementation of electronic consent, the protection of participants' privacy, the promotion of justice, and the need for interdisciplinary capacity building of research ethics committees. On the basis of our analysis, we propose a framework to regulate and strengthen ethical approaches in the field of digital public health surveillance.