Ultrafast fluorescent confocal microscopy is a promising approach for breast cancer detection because of its potential to produce near-instantaneous, high-resolution images of cellular-level tissue features. Traditional approaches such as mammography and biopsy are laborious, invasive, and inefficient; confocal microscopy offers several advantages over them. In particular, it enables the precise differentiation of malignant cells, the rapid examination of extensive tissue sections, and the optical sectioning of tissue samples into thin slices. While preventing cancer altogether remains the primary goal, early detection is a critical step toward that objective. This research presents a novel Breast Histopathology Convolutional Neural Network (BHCNN) for feature extraction, together with a recursive feature elimination method for selecting the most significant features. The proposed approach uses whole-slide images to identify tissue regions affected by invasive ductal carcinoma (IDC). In addition, a transfer learning approach is employed to enhance the performance and accuracy of the models in detecting breast cancer, while also reducing computation time by modifying the final layer of the proposed model. The results show that the BHCNN model outperformed other models in accuracy, achieving a testing accuracy of 98.42% and a training accuracy of 99.94%. The confusion matrix shows that the IDC-positive (+) class was classified with 97.44% accuracy (2.56% misclassified), while the IDC-negative (−) class was classified with 98.73% accuracy (1.27% misclassified). Furthermore, the model achieved a validation loss below 0.05.

Research Highlights
The objective is to develop an innovative framework based on ultrafast fluorescence confocal microscopy for the challenging problem of breast cancer diagnosis. The framework extracts essential features from microscopy images and employs a gated recurrent unit (GRU) for detection.
The proposed research offers significant potential to enhance medical imaging by providing a reliable and resilient system for the precise diagnosis of breast cancer, thereby advancing state-of-the-art medical technology.
The most significant features are selected with the BHRFE optimization technique after the proposed model extracts them. The selected features are then classified using a GRU-based deep model.
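As a minimal, hypothetical sketch of the pipeline described above (extracted features, recursive feature elimination, then GRU classification), the following NumPy-only toy example may help fix ideas. The correlation-based ranking criterion, the array sizes, and the untrained single-cell GRU are illustrative assumptions, not the paper's actual BHCNN/BHRFE implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for features extracted by a CNN backbone (hypothetical sizes):
# 200 image patches, 16 features each; the label depends on features 0 and 3.
X = rng.normal(size=(200, 16))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(float)

def recursive_feature_elimination(X, y, keep):
    """Drop the lowest-ranked feature one at a time until `keep` remain.
    Ranking here is absolute correlation with the label -- a simple
    stand-in for the paper's BHRFE criterion."""
    idx = list(range(X.shape[1]))
    while len(idx) > keep:
        scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in idx]
        idx.pop(int(np.argmin(scores)))
    return idx

selected = recursive_feature_elimination(X, y, keep=4)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h, p):
    """One gated recurrent unit step: update gate z, reset gate r,
    candidate state h_tilde (the standard GRU equations)."""
    z = sigmoid(x @ p["Wz"] + h @ p["Uz"])
    r = sigmoid(x @ p["Wr"] + h @ p["Ur"])
    h_tilde = np.tanh(x @ p["Wh"] + (r * h) @ p["Uh"])
    return (1.0 - z) * h + z * h_tilde

# Feed the selected features of one patch as a length-4 sequence of scalars
# through an untrained 8-unit GRU (random weights; training is omitted).
hidden = 8
p = {k: rng.normal(scale=0.1, size=(1, hidden)) for k in ("Wz", "Wr", "Wh")}
p.update({k: rng.normal(scale=0.1, size=(hidden, hidden))
          for k in ("Uz", "Ur", "Uh")})

h = np.zeros((1, hidden))
for j in selected:
    h = gru_step(X[:1, j:j + 1], h, p)  # one scalar feature per time step

print(sorted(selected), h.shape)
```

In a real system the final hidden state `h` would feed a trained classification head; here the point is only the data flow from feature selection into the recurrent classifier.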