The early detection of polyps could help prevent colorectal cancer. Automated detection of polyps on the colon walls could reduce the number of false negatives caused by manual examination errors or by polyps hidden behind folds, and could help doctors locate polyps in screening tests such as colonoscopy and wireless capsule endoscopy. Missed polyps may allow lesions to progress to more advanced disease. In this paper, we propose a modified region-based convolutional neural network (R-CNN) that generates masks around polyps detected in still frames. The locations of the polyps in the image are marked, which assists doctors in examining them. Features are extracted from the polyp images using pre-trained ResNet-50 and ResNet-101 models through feature extraction and fine-tuning techniques. Several publicly available polyp datasets are analyzed with different pretrained weights. Notably, fine-tuning with balloon data (polyp-like natural images) improved the polyp detection rate. The best-performing CNN models on the colonoscopy datasets CVC-ColonDB, CVC-PolypHD, and ETIS-Larib produced (F1 score, F2 score) values of (90.73, 91.27), (80.65, 79.11), and (76.43, 78.70), respectively. The best model on the wireless capsule endoscopy dataset achieved (96.67, 96.10). The experimental results indicate better polyp localization than recent traditional and deep learning methods.
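For illustration, the sketch below shows one way such a mask-generating R-CNN could be set up with torchvision: a Mask R-CNN with a pretrained ResNet-50 FPN backbone whose box and mask heads are replaced for a single polyp class. This is a minimal stand-in, not the authors' modified architecture; the class count, input size, and the build_polyp_maskrcnn helper are illustrative assumptions (a ResNet-101 backbone could be swapped in analogously).

# Minimal sketch (not the paper's exact model): fine-tuning torchvision's
# Mask R-CNN with a pretrained ResNet-50 FPN backbone for one polyp class.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

def build_polyp_maskrcnn(num_classes: int = 2):  # background + polyp
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

    # Replace the box classification head for the single polyp class.
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

    # Replace the mask head so it predicts polyp masks.
    in_channels_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels_mask, 256, num_classes)
    return model

if __name__ == "__main__":
    model = build_polyp_maskrcnn()
    model.eval()
    # One dummy 3-channel frame; real use would load still colonoscopy frames.
    dummy = [torch.rand(3, 512, 512)]
    with torch.no_grad():
        out = model(dummy)
    print(out[0]["boxes"].shape, out[0]["masks"].shape)

In real use, the returned masks and boxes would be thresholded on their scores and overlaid on the frame to mark polyp locations, which is how the mask output supports visual inspection.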
Our work facilitates the identification of veterans who may be at risk for abdominal aortic aneurysm (AAA), based on the 2007 mandate to screen all veteran patients who meet the screening criteria. The main research objective is to automatically index three clinical conditions: pertinent negative AAA, pertinent positive AAA, and visually unacceptable image exams. We developed and evaluated a ConText-based algorithm with the GATE (General Architecture for Text Engineering) development system to automatically classify 1402 ultrasound radiology reports for AAA screening. Using the results from JAPE (Java Annotation Pattern Engine) transducer rules, we built a feature vector to classify the radiology reports with a decision table classifier. We found that ConText performed optimally in precision and recall for pertinent negative (0.99 (0.98-0.99), 0.99 (0.99-1.00)) and pertinent positive AAA detection (0.98 (0.95-1.00), 0.97 (0.92-1.00)), and respectably for the determination of non-diagnostic image studies (0.85 (0.77-0.91), 0.96 (0.91-0.99)). In addition, our algorithm can extract AAA size measurements for further characterization of the abnormality. In summary, we developed and evaluated a regular-expression-based algorithm using GATE to determine three contextual conditions, pertinent negative, pertinent positive, and non-diagnostic, from radiology reports obtained to evaluate the presence or absence of abdominal aortic aneurysm. ConText performed very well at identifying these contextual features. Our study also identified contextual trigger terms for detecting sub-standard ultrasound image quality. Limitations included unknown dictionary terms, complex sentences, and vague findings that were difficult to classify and code properly.
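As a rough sketch of the contextual classification idea, the Python snippet below applies ConText-style trigger terms with regular expressions to label a report as pertinent negative, pertinent positive, or non-diagnostic, and to pull out size measurements. The trigger terms, regex patterns, and the classify_report helper are hypothetical; they are not the JAPE transducer rules or decision table classifier the study actually implemented in GATE.

# Illustrative ConText-style classification of AAA ultrasound reports.
# Trigger terms and patterns below are assumptions, not the study's lexicon.
import re

AAA_TERM = re.compile(r"\b(abdominal aortic aneurysm|AAA)\b", re.IGNORECASE)
NEGATION = re.compile(r"\b(no evidence of|without|negative for|no)\b", re.IGNORECASE)
NON_DIAGNOSTIC = re.compile(
    r"\b(limited study|poor visualization|non-?diagnostic|obscured by bowel gas)\b",
    re.IGNORECASE,
)
SIZE = re.compile(r"(\d+(?:\.\d+)?)\s*(cm|mm)", re.IGNORECASE)

def classify_report(text: str) -> dict:
    """Assign one of three contextual labels and extract any aortic size mentions."""
    if NON_DIAGNOSTIC.search(text):
        label = "non-diagnostic"
    elif any(
        NEGATION.search(sentence)
        for sentence in re.split(r"[.;\n]", text)
        if AAA_TERM.search(sentence)
    ):
        label = "pertinent negative"
    elif AAA_TERM.search(text):
        label = "pertinent positive"
    else:
        label = "unclassified"
    sizes = [f"{value} {unit.lower()}" for value, unit in SIZE.findall(text)]
    return {"label": label, "sizes": sizes}

print(classify_report("Infrarenal AAA measuring 3.4 cm in maximal diameter."))
print(classify_report("No evidence of abdominal aortic aneurysm."))

The sentence-level negation check mirrors the idea behind ConText: a negation trigger only flips the AAA mention it shares a sentence with, while size captures support the additional measurement extraction described above.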
The Veterans Affairs Precision Oncology Data Repository (VA-PODR) is a large, nationwide repository of de-identified data on patients diagnosed with cancer at the Department of Veterans Affairs (VA). Data include longitudinal clinical data from the VA's nationwide electronic health record system and the VA Central Cancer Registry, targeted tumor sequencing data, and medical imaging data including computed tomography (CT) scans and pathology slides. A subset of the repository is available at the Genomic Data Commons (GDC) and The Cancer Imaging Archive (TCIA), and the full repository is available through the Veterans Precision Oncology Data Commons (VPODC). By releasing this de-identified dataset, we aim to advance Veterans' health care by enabling a wide variety of researchers to conduct translational research on the Veteran population.