The information captured by the gist signal, which refers to radiologists' first impression arising from an initial global image processing, is poorly understood. We examined whether the gist signal can provide information complementary to the data captured by radiologists (experiment 1) or computer algorithms (experiment 2) based on detailed mammogram inspection. In the first experiment, 19 radiologists assessed a case set twice, once based on a half-second image presentation (i.e., the gist signal) and once under the usual viewing condition. Their performances in the two viewing conditions were compared using repeated measures correlation (rm-corr). The cancer cases (19 cases × 19 readers) exhibited a non-significant trend, with rm-corr = 0.012 (p = 0.82, CI: −0.09, 0.12). For normal cases (41 cases × 19 readers), a weak correlation of rm-corr = 0.238 (p < 0.001, CI: 0.17, 0.30) was found. In the second experiment, we combined the abnormality score from a state-of-the-art deep learning-based tool (DL) with the radiological gist signal using a support vector machine (SVM). To obtain the gist signal, 53 radiologists assessed images based on a half-second image presentation. The SVM performance for each radiologist, and for an average reader whose gist responses were the mean abnormality scores given by all 53 readers to each image, was assessed using leave-one-out cross-validation. For the average reader, the AUCs for gist, DL, and the SVM were 0.76 (CI: 0.62–0.86), 0.79 (CI: 0.63–0.89), and 0.88 (CI: 0.79–0.94), respectively. For all readers with a gist AUC significantly better than chance level, the SVM outperformed DL. The gist signal provided malignancy evidence with no or weak association with the information captured by humans in normal radiologic reporting, which involves detailed mammogram inspection. Adding the gist signal to a state-of-the-art deep learning-based tool improved its performance for breast cancer detection.
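The fusion step in experiment 2 can be illustrated with a minimal sketch: a two-feature SVM (gist score, DL score) evaluated with leave-one-out cross-validation and compared to each score alone via AUC. The kernel choice, probability calibration, and the toy data below are assumptions for illustration only, not the study's actual configuration or data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

# Toy stand-ins: per-image gist score (reader mean) and DL abnormality score.
# y: 1 = cancer, 0 = normal. Real study data are not reproduced here.
rng = np.random.default_rng(0)
n = 60
y = rng.integers(0, 2, n)
gist = 0.5 * y + rng.normal(0, 0.4, n)   # hypothetical gist scores
dl = 0.6 * y + rng.normal(0, 0.4, n)     # hypothetical DL scores
X = np.column_stack([gist, dl])

# Leave-one-out cross-validation: hold out one case, train on the rest,
# score the held-out case, repeat for every case.
scores = np.empty(n)
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = SVC(kernel="rbf", probability=True)  # kernel is an assumption
    clf.fit(X[train_idx], y[train_idx])
    scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]

print("Gist-only AUC:", roc_auc_score(y, gist))
print("DL-only AUC:  ", roc_auc_score(y, dl))
print("Combined AUC: ", roc_auc_score(y, scores))
```

With real scores, the combined AUC exceeding both single-source AUCs would correspond to the reported gain (0.88 vs 0.76 and 0.79 for the average reader).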
The global radiomic signature extracted from mammograms can indicate the presence of malignancy within an image. This study focuses on a set of 129 screen-detected breast malignancies that were also visible on the prior screening examinations (i.e., missed cancers based on the priors). All cancer signs on the prior examinations were actionable based on the opinion of a panel of three experienced radiologists, who retrospectively interpreted the prior examinations (knowing that a later screening round had revealed a cancer). We investigated whether the global radiomic signature could differentiate between screening rounds: the round in which the cancer was detected ("identified cancers") versus the round immediately before ("missed cancers"). Both "identified cancers" and "missed cancers" were collected using a single vendor technology. A set of "normals", matched based on mammography units, was also retrieved from a screening archive. We extracted a global radiomic signature containing first- and second-order statistics features. Three classification tasks were considered: (1) "identified cancers" vs "missed cancers", (2) "identified cancers" vs "normals", (3) "missed cancers" vs "normals". To train and validate the models, leave-one-case-out cross-validation was used. The classifier resulted in an AUC of 0.66 (95% CI: 0.60–0.73, P < 0.05) for "missed cancers" vs "identified cancers" and an AUC of 0.65 (95% CI: 0.60–0.69, P < 0.05) for "normals" vs "identified cancers". However, the AUC of the classifier for differentiating "normals" from "missed cancers" was at chance level (AUC = 0.53, 95% CI: 0.48–0.58, P = 0.23). Therefore, eliminating some of these "missed" cancers in clinical practice would be very challenging, as the global signals of malignancy that could help with a diagnosis are at best weak.
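A sketch of the analysis pipeline is given below: a global signature of first-order (histogram) and second-order (GLCM texture) statistics per image, a classifier trained under leave-one-case-out cross-validation, and AUC as the endpoint. The specific features, the SVM classifier, and the toy images are assumptions for illustration; the abstract does not specify the exact feature set or classifier.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

def global_signature(img):
    """First-order (histogram) and second-order (GLCM) statistics for a whole image."""
    img = img.astype(np.uint8)
    first_order = [img.mean(), img.std(),
                   np.percentile(img, 10), np.percentile(img, 90)]
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    second_order = [graycoprops(glcm, p).mean()
                    for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.array(first_order + second_order)

# Toy stand-ins: 2-D "mammograms" and labels (1 = missed cancer, 0 = normal).
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, (64, 64)) for _ in range(40)]
labels = rng.integers(0, 2, 40)

X = np.stack([global_signature(im) for im in images])

# Leave-one-case-out cross-validation with a standardized SVM (classifier choice assumed).
clf = make_pipeline(StandardScaler(), SVC(probability=True))
scores = cross_val_predict(clf, X, labels, cv=LeaveOneOut(),
                           method="predict_proba")[:, 1]
print("Leave-one-out AUC:", roc_auc_score(labels, scores))
```

Running the same pipeline on the three task-specific case sets would yield the three reported AUCs, with the "missed cancers" vs "normals" task landing near 0.5.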