Objective: We developed a system that automatically classifies cases of scoliosis secondary to neurofibromatosis type 1 (NF1-S) using deep learning algorithms (DLAs), improving the accuracy and effectiveness of classification and thereby assisting surgeons with diagnosis.
Methods: Comprehensive experiments in NF1 classification were performed on a dataset consisting of 211 NF1-S patients (131 dystrophic and 80 nondystrophic). Additionally, 100 congenital scoliosis (CS) patients, 100 adolescent idiopathic scoliosis (AIS) patients, and 114 normal controls were used for experiments in primary classification. To distinguish NF1-S with dystrophic curves from NF1-S with nondystrophic curves, we devised a novel network (Bilateral convolutional neural network [CNN]) that uses a bilinear-like operation to discover shared features of interest between whole-spine anteroposterior (AP) and lateral x-ray images. The performance of Bilateral CNN was compared with that of spine surgeons, a conventional DLA (Bilinear CNN [BCNN]), recently proposed DLAs (ShuffleNet, MobileNet, and EfficientNet), and Two-path BCNN, an extension of BCNN that takes AP and lateral x-ray images as inputs.
Results: In NF1 classification, our proposed Bilateral CNN achieved 80.36% accuracy with fivefold cross-validation, outperforming the other seven DLAs, whose accuracies ranged from 61.90% to 76.19%. It also outperformed the spine surgeons (average accuracy of 77.5% for senior surgeons and 65.0% for junior surgeons). Our method is highly generalizable owing to the proposed methodology and data augmentation. Furthermore, the heatmaps extracted by Bilateral CNN showed that the curve pattern and the morphology of the ribs and vertebrae contributed most to the classification results. In primary classification, our proposed method achieved an accuracy of 87.92% with fivefold cross-validation, again outperforming all the other methods, whose accuracies ranged from 52.58% to 83.35%.
Conclusions: The proposed Bilateral CNN can automatically capture representative features for classifying NF1-S from AP and lateral x-ray images, leading to relatively good performance. Moreover, the proposed method can identify other spine deformities, supporting auxiliary diagnosis.
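The abstract does not include code, but the bilinear-like operation it describes can be illustrated with a minimal sketch. The fragment below shows classic bilinear pooling of two feature vectors (one per radiographic view) via an outer product followed by signed square-root and L2 normalization; the function name and the normalization details are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def bilinear_fusion(feat_ap, feat_lat, eps=1e-8):
    """Fuse AP-view and lateral-view feature vectors (illustrative sketch)."""
    # The outer product captures pairwise interactions between
    # every AP-view channel and every lateral-view channel.
    outer = np.outer(feat_ap, feat_lat).ravel()
    # Signed square-root followed by L2 normalization, a common
    # post-processing step after bilinear pooling that stabilizes
    # the scale of the fused feature.
    signed_sqrt = np.sign(outer) * np.sqrt(np.abs(outer))
    return signed_sqrt / (np.linalg.norm(signed_sqrt) + eps)

# Toy features standing in for pooled CNN activations of the two views.
ap = np.array([0.5, -1.0, 2.0])
lat = np.array([1.5, 0.25])
fused = bilinear_fusion(ap, lat)
print(fused.shape)  # (6,)
```

In a full model, `feat_ap` and `feat_lat` would come from two CNN branches processing the AP and lateral radiographs, and the fused vector would feed a classifier head.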
With the increasing amount of multimedia data, cross-modality hashing has made great progress, as it achieves sub-linear search time and low memory cost. However, owing to the large discrepancy between modalities, most existing cross-modality hashing methods cannot learn unified hash codes and unified hash functions for all modalities at the same time, and the gap between separately learned hash codes and functions degrades search performance. In this paper, to address these issues, we propose a novel end-to-end Deep Unified Cross-Modality Hashing method named DUCMH, which jointly learns unified hash codes and unified hash functions through alternate learning and data alignment. Specifically, to reduce the discrepancy between the image and text modalities, DUCMH uses data alignment to learn an auxiliary image-to-text mapping under the supervision of image-text pairs. For text data, hash codes are obtained directly from the unified hash functions; for image data, DUCMH first maps images to texts via the auxiliary mapping and then applies the unified hash functions to the mapped texts. DUCMH uses alternate learning to update the unified hash codes and hash functions. Extensive experiments on three representative image-text datasets demonstrate the superiority of DUCMH over several state-of-the-art cross-modality hashing methods.
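The retrieval step shared by deep hashing methods can be sketched briefly: real-valued embeddings (here, what a unified hash function would produce for texts, or for images mapped into the text space) are binarized into ±1 codes, and candidates are ranked by Hamming distance. The function names and toy embeddings below are hypothetical illustrations, not DUCMH's actual API.

```python
import numpy as np

def to_hash_code(embedding):
    # Binarize a real-valued embedding into a +/-1 hash code.
    # In a unified-hashing setup, the same function would serve both
    # text embeddings and image embeddings mapped into the text space.
    return np.where(embedding >= 0, 1, -1)

def hamming_distance(a, b):
    # For +/-1 codes of length k, Hamming distance = (k - <a, b>) / 2.
    return int((len(a) - int(a @ b)) // 2)

# Toy embeddings standing in for outputs of a unified hash function.
text_code = to_hash_code(np.array([0.3, -1.2, 0.0, 2.1]))
image_code = to_hash_code(np.array([-0.5, -0.9, 0.4, 1.8]))
print(hamming_distance(text_code, image_code))  # 1
```

Because Hamming distance on ±1 codes reduces to an inner product, ranking a database of codes is a single matrix multiplication, which is what gives hashing its sub-linear search time and low memory cost in practice.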