Background
The widespread availability of large data sets through warehousing of electronic health records, coupled with increasingly sophisticated information technology and related statistical methods, offers great potential for a variety of applications in health and disease surveillance, predictive modeling, and clinical decision support. However, the use of such ‘big data’ mining and discovery techniques has also raised ethical issues, such as how to balance privacy and autonomy against the wider public benefits of data sharing. More specifically, electronic data are increasingly being used to identify individual characteristics that can be useful for clinical prediction and management but that were not previously disclosed to a clinician. This process, known in informatics parlance as electronic phenotyping, has a number of ethical implications.
Approach
Using the Belmont Report’s principles of respect for persons, beneficence, and justice as a framework, we examined the ethical issues posed by electronic phenotyping.
Findings
Ethical issues identified include the patient’s ability to consent to the use of their information, the ability to suppress pediatric information, the need to ensure that the potential benefits justify the risks of harm to patients, and the recognition that a clinician’s biases or stereotypes, whether conscious or unintended, may become a factor in the therapeutic interaction. We illustrate these issues with two vignettes, using the personal characteristic of gender minority status (i.e., transgender identity) and the health history characteristic of substance abuse.
Conclusion
Big data mining has the potential to uncover previously obscured patient characteristics that can provide clinicians with beneficial clinical information. Ethical guidelines must therefore be updated to ensure that electronic phenotyping supports the principles of respect for persons, beneficence, and justice.