Purpose
Prostate cancer classification has a significant impact on patient prognosis and treatment planning. Currently, this classification is based on Gleason score analysis of biopsied tissue, a procedure that is neither fully accurate nor risk-free. This study aims to learn discriminative features from prostate images and thereby assist physicians in classifying prostate cancer automatically.
Methods
We develop a novel multiparametric magnetic resonance transfer learning (MPTL) method to automatically stage prostate cancer. We first establish a deep convolutional neural network with a three-branch architecture, in which each branch transfers a pretrained model to compute features from one multiparametric MRI (mp-MRI) sequence: T2-weighted (T2w) transaxial, T2w sagittal, and apparent diffusion coefficient (ADC). The learned features are concatenated to represent the information of all mp-MRI sequences. A new image similarity constraint is then proposed to restrict features of the same category to a narrow angular region. With the joint constraints of the softmax loss and the image similarity loss during fine-tuning, MPTL learns descriptive features with intraclass compactness and interclass separability.
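The joint objective described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' exact formulation: the angular constraint is approximated here as one minus the mean pairwise cosine similarity between same-class feature vectors, and the trade-off weight `lam` is an assumed hyperparameter.

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    # Standard softmax (cross-entropy) loss over a batch of class logits.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def image_similarity_loss(features, labels):
    # Illustrative angular constraint: pull same-class feature vectors into
    # a narrow angular region by maximizing their pairwise cosine similarity
    # (loss = 1 - mean same-class cosine similarity; 0 when vectors align).
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    cos = f @ f.T
    same = (labels[:, None] == labels[None, :]) & ~np.eye(len(labels), dtype=bool)
    if not same.any():
        return 0.0
    return 1.0 - cos[same].mean()

def joint_loss(logits, features, labels, lam=0.5):
    # Joint fine-tuning objective: softmax loss plus a weighted similarity
    # term. `features` would be the concatenated three-branch features and
    # `lam` an assumed balancing hyperparameter.
    return softmax_cross_entropy(logits, labels) + lam * image_similarity_loss(features, labels)
```

When same-class features point in the same direction, the similarity term vanishes and only the softmax loss remains, which matches the intended intraclass compactness.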
Results
Two cohorts were used to evaluate the robustness and effectiveness of the proposed MPTL model: 132 cases from our institutional review board-approved patient database and 112 cases from the PROSTATEx-2 Challenge. Our model achieved high prostate cancer classification accuracy (86.92%). Moreover, the comparison results demonstrate that our method outperforms both hand-crafted feature-based methods and existing deep learning models in prostate cancer classification.
Conclusion
The experimental results show that the proposed method can learn discriminative features from prostate images and classify the cancer accurately. Our MPTL model could be further applied in clinical practice to provide valuable information for cancer treatment and precision medicine.