The identification of phenotypic changes in breast cancer (BC) histopathology on account of corresponding molecular changes is of significant clinical importance in predicting disease outcome. One such example is the presence of lymphocytic infiltration (LI) in histopathology, which has been correlated with nodal metastasis and distant recurrence in HER2+ BC patients. In this paper, we present a computer-aided diagnosis (CADx) scheme to automatically detect and grade the extent of LI in digitized HER2+ BC histopathology. Lymphocytes are first automatically detected by a combination of region growing and Markov random field algorithms. Using the centers of individual detected lymphocytes as vertices, three graphs (Voronoi diagram, Delaunay triangulation, and minimum spanning tree) are constructed and a total of 50 image-derived features describing the arrangement of the lymphocytes are extracted from each sample. A nonlinear dimensionality reduction scheme, graph embedding (GE), is then used to project the high-dimensional feature vector into a reduced 3-D embedding space. A support vector machine classifier is used to discriminate samples with high and low LI in the reduced dimensional embedding space. A total of 41 HER2+ hematoxylin-and-eosin-stained images obtained from 12 patients were considered in this study. For more than 100 three-fold cross-validation trials, the architectural feature set successfully distinguished samples of high and low LI levels with a classification accuracy greater than 90%. The popular unsupervised Varma-Zisserman texton-based classification scheme was used for comparison and yielded a classification accuracy of only 60%. Additionally, the projection of the 50 image-derived features for all 41 tissue samples into a reduced dimensional space via GE allowed for the visualization of a smooth manifold that revealed a continuum between low, intermediate, and high levels of LI. 
Since the extent of LI in BC biopsy specimens is a known prognostic indicator, our CADx scheme will potentially help clinicians determine disease outcome and allow them to make better therapy recommendations for patients with HER2+ BC.
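The architectural pipeline described above (graphs built over detected lymphocyte centers, then summary statistics on those graphs) can be sketched with SciPy. This is a minimal illustration, not the authors' implementation: it computes only six hypothetical features (Delaunay and minimum-spanning-tree edge-length statistics and Voronoi region side counts) rather than the paper's 50, and the function name and feature choices are assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial import Delaunay, Voronoi

def architectural_features(points):
    """Summarize the spatial arrangement of detected lymphocyte centers.

    Returns a small, illustrative feature vector; the paper extracts
    50 such statistics from the Voronoi diagram, Delaunay
    triangulation, and minimum spanning tree.
    """
    # Delaunay triangulation; collect its unique edges
    tri = Delaunay(points)
    edges = set()
    for simplex in tri.simplices:
        for i in range(3):
            a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
            edges.add((a, b))
    edges = sorted(edges)
    lengths = np.array([np.linalg.norm(points[a] - points[b])
                        for a, b in edges])

    # Minimum spanning tree over the Delaunay graph
    # (weights = Euclidean edge lengths)
    n = len(points)
    w = np.zeros((n, n))
    for (a, b), d in zip(edges, lengths):
        w[a, b] = d
    mst_lengths = minimum_spanning_tree(csr_matrix(w)).data

    # Voronoi diagram; side counts of the bounded regions only
    vor = Voronoi(points)
    sides = [len(r) for r in vor.regions if r and -1 not in r]

    return np.array([lengths.mean(), lengths.std(),
                     mst_lengths.mean(), mst_lengths.std(),
                     float(np.mean(sides)), float(np.std(sides))])

rng = np.random.default_rng(0)
feats = architectural_features(rng.random((60, 2)))
```

In the paper, such per-image feature vectors are then projected to 3-D via graph embedding and classified with a support vector machine.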
Automated detection and segmentation of nuclear and glandular structures is critical for classification and grading of prostate and breast cancer histopathology. In this paper, we present a methodology for automated detection and segmentation of structures of interest in digitized histopathology images. The scheme integrates image information from across three different scales: (1) low-level information based on pixel values, (2) high-level information based on relationships between pixels for object detection, and (3) domain-specific information based on relationships between histological structures. Low-level information is utilized by a Bayesian classifier to generate a likelihood that each pixel belongs to an object of interest. High-level information is extracted in two ways: (i) by a level-set algorithm, where a contour is evolved in the likelihood scenes generated by the Bayesian classifier to identify object boundaries, and (ii) by a template matching algorithm, where shape models are used to identify glands and nuclei from the low-level likelihood scenes. Structural constraints are imposed via domain-specific knowledge in order to verify whether the detected objects do indeed belong to structures of interest. In this paper we demonstrate the utility of our glandular and nuclear segmentation algorithm in accurate extraction of various morphological and nuclear features for automated grading of (a) prostate cancer, (b) breast cancer, and (c) distinguishing between cancerous and benign breast histology specimens. The efficacy of our segmentation algorithm is evaluated by comparing breast and prostate cancer grading and benign vs. cancer discrimination accuracies with corresponding accuracies obtained via manual detection and segmentation of glands and nuclei.
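The low-level stage above, a Bayesian classifier that turns pixel values into a likelihood scene, can be sketched as a per-pixel Gaussian class-conditional model. This is a simplified, hypothetical stand-in (a single foreground Gaussian with an assumed prior; a full posterior would also require a background model), not the paper's trained classifier, and all names and parameter values here are illustrative.

```python
import numpy as np

def likelihood_scene(image, fg_mean, fg_cov, prior=0.5):
    """Per-pixel likelihood that a pixel belongs to the object class
    (e.g. nuclei), producing the 'likelihood scene' that later stages
    (level sets, template matching) would operate on."""
    h, w, c = image.shape
    x = image.reshape(-1, c) - fg_mean
    inv_cov = np.linalg.inv(fg_cov)
    # Squared Mahalanobis distance of each pixel to the foreground model
    mahal = np.einsum("ij,jk,ik->i", x, inv_cov, x)
    density = (np.exp(-0.5 * mahal)
               / np.sqrt((2 * np.pi) ** c * np.linalg.det(fg_cov)))
    return (prior * density).reshape(h, w)

# Toy image: a colored square on a dark background
# (stand-in for stained nuclei)
img = np.zeros((32, 32, 3))
img[8:16, 8:16] = [0.6, 0.2, 0.5]
scene = likelihood_scene(img,
                         fg_mean=np.array([0.6, 0.2, 0.5]),
                         fg_cov=0.05 * np.eye(3))
```

Pixels matching the foreground color model receive high likelihood values, giving the level-set and template-matching stages a smooth scene to evolve contours in.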
Purpose: To determine the feasibility of using a computer-aided diagnosis (CAD) system to differentiate among triple-negative breast cancer, estrogen receptor (ER)-positive cancer, human epidermal growth factor receptor type 2 (HER2)-positive cancer, and benign fibroadenoma lesions on dynamic contrast material-enhanced (DCE) magnetic resonance (MR) images. Materials and Methods: This is a retrospective study of prospectively acquired breast MR imaging data collected from an institutional review board-approved, HIPAA-compliant study between 2002 and 2007. Written informed consent was obtained from all patients. The authors collected DCE MR images from 65 women with 76 breast lesions who had been recruited into a larger study of breast MR imaging. The women had triple-negative (n = 21), ER-positive (n = 25), HER2-positive (n = 18), or fibroadenoma (n = 12) lesions. All lesions were classified as Breast Imaging Reporting and Data System category 4 or higher on the basis of previous imaging. Images were subject to quantitative feature extraction, feed-forward feature selection by means of linear discriminant analysis, and lesion classification by using a support vector machine classifier. The area under the receiver operating characteristic curve (Az) was calculated for each of five lesion classification tasks involving triple-negative breast cancers. Conclusion: Triple-negative cancers possess certain characteristic features on DCE MR images that can be captured and quantified with CAD, enabling good discrimination of triple-negative cancers from non-triple-negative cancers, as well as between triple-negative cancers and benign fibroadenomas. Such CAD algorithms may provide added diagnostic benefit in identifying the highly aggressive triple-negative cancer phenotype with DCE MR imaging in high-risk women. © RSNA, 2014
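The selection-and-classification stage described above (feed-forward feature selection by means of linear discriminant analysis, then support vector machine classification) might look like the following scikit-learn sketch on synthetic data. The greedy criterion used here (cross-validated LDA accuracy) and all function names are assumptions, since the abstract does not specify implementation details, and the features themselves are random stand-ins for the paper's quantitative lesion features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def forward_select(X, y, k=3):
    """Greedy feed-forward selection: at each step, add the feature
    whose inclusion maximizes cross-validated LDA accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        scores = [
            (cross_val_score(LinearDiscriminantAnalysis(),
                             X[:, selected + [j]], y, cv=3).mean(), j)
            for j in remaining
        ]
        _, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Synthetic lesions: feature 2 carries the class signal
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 8))
y = (X[:, 2] + 0.1 * rng.normal(size=60) > 0).astype(int)

sel = forward_select(X, y, k=2)
clf = SVC(kernel="linear").fit(X[:, sel], y)
```

The informative feature is picked up by the selection loop, and the SVM is then trained only on the selected subset, mirroring the two-stage design in the abstract.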
Dynamic contrast-enhanced (DCE)-magnetic resonance imaging (MRI) of the breast has emerged as an adjunct imaging tool to conventional X-ray mammography due to its high detection sensitivity. Despite the increasing use of breast DCE-MRI, specificity in distinguishing malignant from benign breast lesions is low, and interobserver variability in lesion classification is high. The novel contribution of this paper is in the definition of a new DCE-MRI descriptor that we call textural kinetics, which attempts to capture spatiotemporal changes in breast lesion texture in order to distinguish malignant from benign lesions. We qualitatively and quantitatively demonstrated on 41 breast DCE-MRI studies that textural kinetic features outperform signal intensity kinetics and lesion morphology features in distinguishing benign from malignant lesions. A probabilistic boosting tree (PBT) classifier in conjunction with textural kinetic descriptors yielded an accuracy of 90%, sensitivity of 95%, specificity of 82%, and an area under the curve (AUC) of 0.92. Graph embedding, used for qualitative visualization of a low-dimensional representation of the data, showed the best separation between benign and malignant lesions when using textural kinetic features. The PBT classifier results and trends were also corroborated via a support vector machine classifier which showed that textural kinetic features outperformed the morphological, static texture, and signal intensity kinetics descriptors. When textural kinetic attributes were combined with morphologic descriptors, the resulting PBT classifier yielded 89% accuracy, 99% sensitivity, 76% specificity, and an AUC of 0.91.
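The core idea of textural kinetics, tracking how a texture statistic evolves over the DCE time series rather than tracking raw signal intensity, can be illustrated minimally. Here global intensity variance stands in for the paper's richer texture descriptors, and the two-parameter linear kinetic summary is a hypothetical simplification of the spatiotemporal features actually used.

```python
import numpy as np

def textural_kinetics(lesion_frames):
    """Compute a texture statistic per DCE time point and summarize
    its temporal trajectory as a (slope, intercept) pair from a
    linear fit of texture vs. acquisition index."""
    tex = np.array([frame.var() for frame in lesion_frames])
    t = np.arange(len(tex))
    slope, intercept = np.polyfit(t, tex, 1)
    return np.array([slope, intercept])

# Toy lesion time series: texture heterogeneity grows as
# contrast agent washes in
frames = [np.random.default_rng(i).random((16, 16)) * (1 + 0.2 * i)
          for i in range(5)]
feat = textural_kinetics(frames)
```

A rising slope reflects increasing textural heterogeneity during enhancement; such kinetic summaries are the kind of descriptor fed to the PBT and SVM classifiers in the study.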