Background
Fast and accurate diagnostics are key for personalised medicine. Particularly in cancer, precise diagnosis is a prerequisite for targeted therapies, which can prolong lives. In this work, we focus on the automatic identification of gastroesophageal adenocarcinoma (GEA) patients who qualify for a personalised therapy targeting human epidermal growth factor receptor 2 (HER2). We present a deep-learning method for scoring microscopy images of GEA for the presence of HER2 overexpression.

Methods
Our method is based on convolutional neural networks (CNNs) trained on a rich dataset of 1602 patient samples and tested on an independent set of 307 patient samples. We additionally verified the CNN's generalisation capabilities on an independent dataset of 653 samples from a separate clinical centre. We incorporated an attention mechanism into the network architecture to identify the tissue regions that are important for the prediction outcome. Our solution allows for direct automated detection of HER2 in immunohistochemistry-stained tissue slides without the need for manual assessment and additional costly in situ hybridisation (ISH) tests.

Results
We show an accuracy of 0.94, a precision of 0.97, and a recall of 0.95. Importantly, our approach offers accurate predictions in cases that pathologists cannot resolve and that require additional ISH testing. We confirmed our findings in an independent dataset collected in a different clinical centre. The attention-based CNN exploits morphological information in the microscopy images and is superior to a predictive model based on staining intensity alone.

Conclusions
We demonstrate that our approach not only automates an important diagnostic process for GEA patients but also paves the way for the discovery of morphological features previously unknown in GEA pathology.
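The abstract does not give implementation details of the attention mechanism. A common way to realise this design is attention-based pooling over CNN tile embeddings, where the learned weights highlight the tissue regions driving the slide-level prediction. The sketch below assumes that setup; the class name `AttentionPool`, the embedding dimension, and the upstream backbone are illustrative assumptions, not the paper's actual architecture.

import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """Hypothetical sketch: attention pooling over tile embeddings.

    Each tissue tile is first embedded by a CNN backbone (not shown);
    the attention weights indicate which tiles contribute most to the
    slide-level HER2 prediction, making the decision inspectable.
    """
    def __init__(self, dim=512, hidden=128):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, 1),
        )
        self.classifier = nn.Linear(dim, 1)  # HER2-positive logit

    def forward(self, tiles):                 # tiles: (n_tiles, dim)
        scores = self.attn(tiles)             # (n_tiles, 1)
        weights = torch.softmax(scores, dim=0)
        slide = (weights * tiles).sum(0)      # weighted slide embedding
        return self.classifier(slide), weights.squeeze(-1)

# Usage: embeddings for one slide's tiles (placeholder values)
emb = torch.randn(200, 512)                   # 200 tiles, 512-d features
logit, attn = AttentionPool()(emb)
print(torch.sigmoid(logit).item(), attn.argmax().item())

The attention weights returned alongside the logit are what enables the region-level interpretation described in the Methods: tiles with high weights can be mapped back onto the slide for pathologist review.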
Background
Oesophagectomy is an operation with a high risk of postoperative complications. The aim of this single-centre retrospective study was to apply machine-learning methods to predict complications (Clavien–Dindo grade IIIa or higher) and specific adverse events.

Methods
Patients with resectable adenocarcinoma or squamous cell carcinoma of the oesophagus and gastro-oesophageal junction who underwent Ivor Lewis oesophagectomy between 2016 and 2021 were included. The tested algorithms were logistic regression after recursive feature elimination, random forest, k-nearest neighbour, support vector machine, and neural network. The algorithms were also compared with a current risk score (the Cologne risk score).

Results
A total of 864 patients were included: 457 (52.9 per cent) had Clavien–Dindo grade IIIa or higher complications and 407 (47.1 per cent) had Clavien–Dindo grade 0, I, or II complications. After 3-fold imputation and 3-fold cross-validation, the overall accuracies were: logistic regression after recursive feature elimination, 0.528; random forest, 0.535; k-nearest neighbour, 0.491; support vector machine, 0.511; neural network, 0.688; and Cologne risk score, 0.510. For medical complications, the results were: logistic regression after recursive feature elimination, 0.688; random forest, 0.664; k-nearest neighbour, 0.673; support vector machine, 0.681; neural network, 0.692; and Cologne risk score, 0.650. For surgical complications, the results were: logistic regression after recursive feature elimination, 0.621; random forest, 0.617; k-nearest neighbour, 0.620; support vector machine, 0.634; neural network, 0.667; and Cologne risk score, 0.624. The calculated area under the curve of the neural network was 0.672 for Clavien–Dindo grade IIIa or higher, 0.695 for medical complications, and 0.653 for surgical complications.

Conclusion
The neural network achieved the highest accuracy of all tested models for the prediction of postoperative complications after oesophagectomy.
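The evaluation pipeline described in the Methods (imputation, then comparing five classifiers under 3-fold cross-validation) maps naturally onto scikit-learn. The sketch below assumes that library and a tabular feature matrix; the placeholder data, hyperparameters, and the use of simple mean imputation in place of the study's 3-fold imputation are illustrative simplifications, not the authors' code.

# Minimal sketch of the model comparison, assuming scikit-learn and a
# tabular feature matrix X with binary outcome labels y. SimpleImputer
# stands in for the study's 3-fold imputation; hyperparameters are
# defaults, not tuned values from the paper.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

models = {
    # RFE selects features, then predicts via the wrapped estimator
    "logreg + RFE": make_pipeline(
        SimpleImputer(), StandardScaler(),
        RFE(LogisticRegression(max_iter=1000), n_features_to_select=10)),
    "random forest": make_pipeline(SimpleImputer(), RandomForestClassifier()),
    "k-nearest neighbour": make_pipeline(
        SimpleImputer(), StandardScaler(), KNeighborsClassifier()),
    "support vector machine": make_pipeline(
        SimpleImputer(), StandardScaler(), SVC()),
    "neural network": make_pipeline(
        SimpleImputer(), StandardScaler(), MLPClassifier(max_iter=2000)),
}

# Placeholder data sized like the cohort (864 patients, 30 features)
rng = np.random.default_rng(0)
X = rng.random((864, 30))
y = rng.integers(0, 2, 864)

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=3, scoring="accuracy")
    print(f"{name}: mean accuracy {acc.mean():.3f}")

Wrapping imputation and scaling inside each pipeline, rather than preprocessing once up front, keeps the cross-validation honest: every fold fits its preprocessing on training data only, avoiding leakage into the held-out fold.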