2022
DOI: 10.1007/s11548-022-02728-7
Enhancement of instrumented ultrasonic tracking images using deep learning

Abstract: Purpose: Instrumented ultrasonic tracking provides needle localisation during ultrasound-guided minimally invasive percutaneous procedures. Here, a post-processing framework based on a convolutional neural network (CNN) is proposed to improve the spatial resolution of ultrasonic tracking images. Methods: The custom ultrasonic tracking system comprised a needle with an integrated fibre-optic ultrasound (US) transmitter and a clinical US probe for receiving …
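The abstract is truncated, so the paper's actual network architecture and training details are not available here. Purely as an illustrative sketch of the general idea — CNN-style convolutional post-processing that maps a blurred tracking image toward a sharper one — the toy example below uses fixed hand-chosen kernels where the paper would use learned ones; every kernel and image below is an assumption for illustration, not the authors' method:

```python
import numpy as np

def conv2d(img, kernel):
    """'Same'-padded 2D cross-correlation, implemented directly in NumPy."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

# Simulated tracking image: a point source blurred by the imaging system.
img = np.zeros((32, 32))
img[16, 16] = 1.0
blur = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
lowres = conv2d(img, blur)

# Toy "network": one sharpening convolution followed by a ReLU, which
# concentrates energy back toward the peak. In the paper the kernels
# would be learned from data; this fixed kernel is illustrative only.
sharpen = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)
enhanced = relu(conv2d(lowres, sharpen))
```

After this pass, the point response is taller and its half-maximum support narrower than in the blurred input, which is the qualitative effect the paper's CNN framework targets.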

Cited by 6 publications (1 citation statement)
References 17 publications
“…For these reasons, further work is required to improve the out-of-plane tracking provided by the UNT system. It is likely that the relative amplitudes of the signals received by the FOH from each aperture of the US probe encode its elevational position, as has been shown in similar work [ 54 ], and some preliminary work has started looking at using machine learning techniques to decode this, as has been demonstrated in the literature [ 55 , 56 , 57 ]. In [ 54 ], a photoacoustic beacon, rather than FOH, is embedded in the surgical instrument, and transmissions from it are received by the imaging probe and used to determine its 3D position.…”
Section: Discussion
confidence: 99%
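The citing passage speculates that the relative signal amplitudes received across the probe's apertures encode the transmitter's elevational position, and that machine learning could decode this. As a minimal sketch of that idea only — the aperture geometry, Gaussian beam model, and least-squares decoder below are all invented for illustration and are not taken from [54]–[57]:

```python
import numpy as np

# Hypothetical forward model: the amplitude received at each of three
# apertures falls off with the transmitter's elevational offset z.
centres = np.array([-2.0, 0.0, 2.0])  # mm, assumed aperture centres
width = 3.0                           # mm, assumed beam width

def amplitudes(z):
    """Amplitude pattern across the apertures for elevational offset z (mm)."""
    return np.exp(-((z - centres) ** 2) / (2 * width ** 2))

# Synthetic training set: known offsets and their amplitude patterns.
z_train = np.linspace(-3.0, 3.0, 61)
A = np.stack([amplitudes(z) for z in z_train])

# Minimal "learned" decoder: linear least squares from amplitude pattern
# (plus a bias term) to z, a crude stand-in for the machine-learning
# techniques the passage refers to.
X = np.column_stack([A, np.ones(len(z_train))])
w, *_ = np.linalg.lstsq(X, z_train, rcond=None)

# Decode a held-out position from its amplitude pattern alone.
z_true = 1.5
z_est = float(np.concatenate([amplitudes(z_true), [1.0]]) @ w)
```

Even this linear decoder recovers the offset coarsely, which is consistent with the passage's suggestion that the amplitude pattern carries usable elevational information.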