Fetal distress is a symptom of fetal intrauterine hypoxia and is seriously harmful to both the fetus and the pregnant woman. The primary clinical tool for assessing fetal distress is Cardiotocography (CTG). Because of subjective variability, physicians often interpret CTG results inconsistently, hence the need for an auxiliary diagnostic system for fetal distress. Although deep learning-based fetal distress diagnosis models achieve high classification accuracy, they have large numbers of parameters and require substantial computational resources, making them difficult to deploy in practical end-use scenarios. This paper therefore proposes LW-FHRNet, a lightweight fetal distress-assisted diagnosis network based on a cross-channel interactive attention mechanism. Wavelet packet decomposition converts the one-dimensional fetal heart rate (FHR) signal into a two-dimensional wavelet packet coefficient matrix that serves as the network input layer, so that the feature information of the FHR signal is fully captured. With ShuffleNet-v2 as the core, a local cross-channel interactive attention mechanism is introduced to enhance the model's feature extraction and achieve effective fusion of multichannel features without dimensionality reduction. The publicly available CTU-UHB database is used for performance evaluation. LW-FHRNet achieves 95.24% accuracy, matching or exceeding the classification results of deep learning-based models, while its parameter count is reduced severalfold compared with conventional deep learning models, with a parameter size of only 0.33 M. The results show that the proposed lightweight model can effectively aid fetal distress diagnosis.
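The wavelet-packet front end described above can be illustrated with a minimal NumPy sketch: a full wavelet packet decomposition turns a 1-D signal of length N into 2^L subbands of length N/2^L, which are stacked into a 2-D coefficient matrix suitable as a CNN input. This is a simplified illustration using a Haar filter and a synthetic FHR-like trace; the paper's actual wavelet basis, decomposition depth, and preprocessing are not specified here, so those choices are assumptions.

```python
import numpy as np

def haar_step(x):
    """One Haar analysis step: return (approximation, detail), each half length."""
    x = x[: len(x) // 2 * 2]                      # truncate to even length
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)        # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)        # high-pass (detail)
    return a, d

def wavelet_packet_matrix(signal, levels):
    """Full wavelet packet decomposition: stack the 2**levels subbands as rows."""
    nodes = [np.asarray(signal, dtype=float)]
    for _ in range(levels):
        # Split every node (not just the approximation) -> full packet tree
        nodes = [half for node in nodes for half in haar_step(node)]
    return np.vstack(nodes)   # shape: (2**levels, len(signal) // 2**levels)

# Toy FHR-like trace: 128 samples oscillating around a 140 bpm baseline
# (hypothetical values, for shape illustration only)
fhr = 140.0 + 5.0 * np.sin(np.linspace(0.0, 8.0 * np.pi, 128))
coeffs = wavelet_packet_matrix(fhr, levels=3)
print(coeffs.shape)   # (8, 16): a 2-D map usable as a network input layer
```

Because the Haar transform is orthonormal, the matrix preserves the signal's energy, so no information is lost in the reshaping from 1-D to 2-D.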
Cardiotocography (CTG) monitoring is an important medical diagnostic tool for evaluating fetal well-being in late pregnancy. Intelligent CTG classification based on Fetal Heart Rate (FHR) signals is therefore a challenging research area that can assist obstetricians in making clinical decisions, improving the efficiency and accuracy of pregnancy management. Most existing methods focus on a single modality and inevitably suffer from limitations such as incomplete or redundant source-domain feature extraction and poor repeatability. This study focuses on modeling multimodal learning for Fetal Distress Diagnosis (FDD); however, three major challenges exist: unaligned multimodalities; failure to learn and fuse the causal and inclusion relationships between multimodal biomedical data; and modality sensitivity, that is, difficulty in performing a task when some modalities are absent. To address these three issues, we propose a Multimodal Medical Information Fusion framework named MMIF, in which the Category Constrained-Parallel ViT model (CCPViT) is first proposed to explore multimodal learning tasks and address the misalignment between modalities. Building on CCPViT, a cross-attention-based image-text joint component is introduced to establish a Multimodal Representation Alignment Network model (MRAN), explore deep interactive representations between cross-modal data, and assist multimodal learning. Furthermore, we design a simple-structured FDD test model based on the highly aligned MMIF, realizing task delegation from multimodal model training (image and text) to unimodal pathological diagnosis (image). Extensive experiments, including model parameter sensitivity analysis, cross-modal alignment assessment, and pathological diagnostic accuracy evaluation, demonstrate our models' superior performance and effectiveness.
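The cross-attention-based image-text joint component can be sketched in its generic single-head form: image tokens act as queries and attend over text tokens, so each image representation is enriched with text-conditioned information. This is a minimal NumPy illustration of standard cross-attention, not the MRAN architecture itself; the token counts, dimensions, and projection matrices below are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_tokens, context_tokens, wq, wk, wv):
    """Single-head cross-attention: queries from one modality, keys/values from another."""
    q = query_tokens @ wq                 # (n_q, d)
    k = context_tokens @ wk               # (n_c, d)
    v = context_tokens @ wv               # (n_c, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])   # scaled dot-product similarities
    weights = softmax(scores, axis=-1)        # each query distributes over context
    return weights @ v                        # text-conditioned image representations

rng = np.random.default_rng(0)
d = 16
img = rng.standard_normal((4, d))   # 4 image patch embeddings (hypothetical)
txt = rng.standard_normal((6, d))   # 6 text token embeddings (hypothetical)
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
fused = cross_attention(img, txt, wq, wk, wv)
print(fused.shape)   # (4, 16): one fused vector per image token
```

Because the output has the same shape as the query sequence, such a component can be dropped between unimodal encoders, which is what makes delegating from multimodal training to unimodal inference tractable.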