Digital pathology and image analysis have the potential to provide greater accuracy, reproducibility and standardisation of pathology-based trial entry criteria and endpoints, while also extracting new insights from both existing and novel tissue features. Image analysis can identify, extract and quantify features in greater detail than pathologist assessment, potentially yielding improved prediction models or performing tasks beyond manual capability. In this article, we provide an overview of the utility of such technologies in clinical trials and discuss the potential applications, current challenges, limitations and remaining unanswered questions that must be addressed before routine adoption in such studies. We reiterate the value of central pathology review in clinical trials, and discuss the inherent logistical, cost and performance advantages of a digital approach. The current and emerging regulatory landscape is outlined. The role of digital platforms and remote learning in improving the training and performance of clinical trial pathologists is discussed. The impact of image analysis on quantitative tissue morphometrics is described for key areas such as standardisation of immunohistochemical stain interpretation, assessment of tumour cellularity prior to molecular analysis and assessment of novel histological features. Key issues to be addressed before digital pathology can be incorporated more widely into clinical trials include standardisation of digital image production, establishment of criteria for digital pathology use in preclinical and clinical studies, definition of performance criteria for image analysis algorithms, and liaison with regulatory bodies to facilitate incorporation of image analysis applications into clinical practice.
Aims: To evaluate whether a deep learning algorithm can be trained to identify tumour-infiltrating lymphocytes (TILs) in tissue samples of testicular germ cell tumours and to assess whether TIL counts correlate with patient relapse status.
Methods: TILs were manually annotated in 259 tumour regions from 28 whole-slide images (WSIs) of H&E-stained tissue samples. A deep learning algorithm was trained on half of the regions and tested on the other half. The algorithm was then applied to larger areas of tumour WSIs from 89 patients, and the results were correlated with clinicopathological data.
Results: A correlation coefficient of 0.89 was achieved when comparing the algorithm with the manual TIL count in the test-set images in which TILs were present (n=47). In the WSI regions from the 89 patient samples, the median TIL density was 1009/mm². In seminomas, none of the relapsed patients belonged to the highest TIL density tertile (>2011/mm²). TIL quantifications performed visually by three pathologists on the same tumours were not significantly associated with outcome. The average interobserver agreement between the pathologists when assigning a patient to a TIL tertile was 0.32 (kappa), compared with 0.35 between the algorithm and the experts. A higher TIL density was associated with a lower clinical tumour stage, seminoma histology and absence of lymphovascular invasion.
Conclusions: Deep learning-based image analysis can detect TILs in testicular germ cell cancer more objectively than visual assessment and has potential for use as a prognostic marker for disease relapse.
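The quantitative steps described in this abstract, converting per-region TIL counts into densities per mm², grouping cases into density tertiles, and measuring agreement between raters, can be illustrated with a short sketch. This is not the authors' code: the arrays, variable names and the choice of Pearson correlation are illustrative assumptions; only the reported summary figures (r = 0.89, median 1009/mm², kappa 0.32 and 0.35) come from the study.

```python
# Minimal sketch (hypothetical data, not the study's pipeline) of TIL density,
# tertile assignment, and agreement statistics as described in the abstract.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-case results: algorithm-detected TIL counts and analysed area (mm^2)
til_counts = np.array([412, 1530, 87, 2960, 640])
areas_mm2 = np.array([0.45, 1.20, 0.10, 1.05, 0.80])

# TIL density per mm^2 (the paper reports a median of 1009/mm^2 in its cohort)
densities = til_counts / areas_mm2

# Assign each case to a density tertile (0 = lowest, 2 = highest)
tertile_edges = np.quantile(densities, [1 / 3, 2 / 3])
algorithm_tertiles = np.digitize(densities, tertile_edges)

# Agreement between algorithm and manual counts; the paper reports a correlation
# coefficient of 0.89 (correlation type assumed to be Pearson here for illustration)
manual_counts = np.array([400, 1600, 90, 2800, 700])  # hypothetical manual counts
r, _ = pearsonr(til_counts, manual_counts)

# Interobserver agreement on tertile assignment via Cohen's kappa; the paper
# reports ~0.32 between pathologists and ~0.35 between algorithm and experts
pathologist_tertiles = np.array([0, 2, 0, 2, 1])  # hypothetical expert ratings
kappa = cohen_kappa_score(algorithm_tertiles, pathologist_tertiles)

print(f"Densities/mm^2: {np.round(densities)}")
print(f"Tertiles: {algorithm_tertiles}, Pearson r = {r:.2f}, kappa = {kappa:.2f}")
```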