2021
DOI: 10.1016/j.euf.2019.04.009
Deep Learning for Real-time, Automatic, and Scanner-adapted Prostate (Zone) Segmentation of Transrectal Ultrasound, for Example, Magnetic Resonance Imaging–transrectal Ultrasound Fusion Prostate Biopsy



Cited by 46 publications (25 citation statements)
References 35 publications
“…This might be due to the anatomic or physiological differences between zones; however, it could also be a result of less robust parameter estimation farther away from the probe (i.e., in the TZ), due to the increasing impact of attenuation and shadowing. This stresses the need for adequate zonal segmentation in the proposed framework, here obtained through deep learning [28,29]. The multiparametric score is shown to scale with tumor Gleason grade, with significant differences between benign, insignificant, and significant disease.…”
Section: Discussion
confidence: 91%
“…Firstly, the prostate is located and delineated in each modality. To this end, we employed an automated deep learning-based TRUS segmentation algorithm on the side-view fundamental B-mode images of both the SWE and DCE-US acquisitions [28,29]. For DCE-US, the prostate position during wash-in (i.e., at 30 s) was used as a reference.…”
Section: Prostate Segmentation
confidence: 99%
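The pipeline quoted above (segment the prostate on B-mode frames, with the wash-in frame at 30 s serving as the positional reference for DCE-US) can be sketched minimally as follows. This is a hypothetical illustration, not the cited authors' implementation: `segment_prostate` is a stand-in for their deep-learning TRUS segmentation network [28,29] (here replaced by a trivial intensity threshold), and the frame rate, frame size, and function names are assumptions.

```python
import numpy as np


def reference_frame_index(t_ref_s: float, frame_rate_hz: float) -> int:
    """Index of the frame closest to the reference time (30 s wash-in in the cited work)."""
    return int(round(t_ref_s * frame_rate_hz))


def segment_prostate(bmode: np.ndarray) -> np.ndarray:
    """Placeholder for the deep-learning TRUS segmentation model:
    a simple intensity threshold stands in for the network's binary mask output."""
    return (bmode > bmode.mean()).astype(np.uint8)


# Synthetic DCE-US sequence: 60 s at 1 frame/s, 64x64 B-mode frames (illustrative only)
rng = np.random.default_rng(0)
frames = rng.random((60, 64, 64))

idx = reference_frame_index(30.0, frame_rate_hz=1.0)  # wash-in reference frame
mask = segment_prostate(frames[idx])                   # binary prostate mask for that frame
```

The key design point carried over from the excerpt is that a single reference frame fixes the prostate position for the dynamic acquisition, so one segmentation can be reused across the time series rather than segmenting every frame.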
“…In our example of the prostate, quick estimation of elastic properties would eventually not only support the assessment of potential disease, but also registration technology that takes into account mechanical properties [28] and the (automatic) identification of anatomical zones [29]. Moreover, sSWE features can potentially play an important role in the design of ultrasound-based computer-aided detection approaches for prostate cancer [30].…”
Section: Discussion
confidence: 98%
“…Manual segmentation of the prostate on TRUS imaging is time-consuming and often not reproducible. For these reasons, several studies have applied deep learning to automatically segment the prostate using TRUS imaging [25][26][27][28][29][30][31].…”
Section: Challenges Applying Deep Learning to Abdominal US Imaging
confidence: 99%