2017
DOI: 10.1007/978-3-319-54057-3_8

Real-Time Segmentation of Non-rigid Surgical Tools Based on Deep Learning and Tracking

Abstract: Real-time tool segmentation is an essential component in computer-assisted surgical systems. We propose a novel real-time automatic method based on Fully Convolutional Networks (FCN) and optical flow tracking. Our method exploits the ability of deep neural networks to produce accurate segmentations of highly deformable parts along with the high speed of optical flow. Furthermore, the pre-trained FCN can be fine-tuned on a small amount of medical images without the need to hand-craft features. We vali…

Cited by 82 publications (73 citation statements)
References 20 publications
“…As specified in the guidelines, we performed a cross-validation by leaving one surgery out of the training data. The segmentation result was compared to generic methods as well as algorithms that were explicitly published for this task [14,7,18]. García-Peraza-Herrera et al. [7] proposed a fully convolutional network for segmentation in minimally invasive surgery.…”
Section: Experimental Evaluation
confidence: 99%
“…The segmentation result was compared to generic methods as well as algorithms that were explicitly published for this task [14,7,18]. García-Peraza-Herrera et al. [7] proposed a fully convolutional network for segmentation in minimally invasive surgery. To achieve real-time performance, they applied the network only on every couple of frames and propagated the information with optical flow (FCN+OF).…”
Section: Experimental Evaluation
confidence: 99%
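The FCN+OF scheme described in the snippet above — running the expensive network only on keyframes and propagating the mask with dense optical flow in between — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `run_fcn` and `flow_between` are hypothetical callables standing in for the segmentation network and the optical-flow estimator, and the flow field is assumed to be a NumPy array of per-pixel displacements.

```python
import numpy as np

def warp_mask(mask, flow):
    """Propagate a segmentation mask one frame forward using a dense
    optical-flow field of shape (H, W, 2), given in pixels (dx, dy)."""
    h, w = mask.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Backward warping: the new mask at pixel p samples the old mask
    # at p - flow(p), with nearest-neighbour rounding and edge clamping.
    src_x = np.clip(np.round(xs - flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - flow[..., 1]).astype(int), 0, h - 1)
    return mask[src_y, src_x]

def segment_stream(frames, run_fcn, flow_between, keyframe_interval=5):
    """Run the (slow, accurate) FCN only on every keyframe_interval-th
    frame; in between, propagate the latest mask with optical flow."""
    masks = []
    for i, frame in enumerate(frames):
        if i % keyframe_interval == 0:
            mask = run_fcn(frame)            # accurate but expensive
        else:
            flow = flow_between(frames[i - 1], frame)
            mask = warp_mask(mask, flow)     # cheap propagation
        masks.append(mask)
    return masks
```

The keyframe interval trades accuracy for speed: a larger interval amortises the network cost over more frames but lets flow-propagation drift accumulate until the next FCN refresh.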
“…Name: Zenodo [10,9]. Other examples include real-time stereo reconstruction [48] and probe tracking [49] in robotic surgery, surveillance endoscopies [50], real-time panorama image synthesis [51], vehicle surveillance [52] and content-based video identification [53].…”
Section: Software Location Archive
confidence: 99%
“…GIFT-Surg involves major innovations in science and engineering combined with clinical translation for improving fetal therapy and diagnosis by providing advanced image processing and visualisation capabilities. We have already obtained promising results with a number of novel methods including real-time mosaicing of the placenta [9] and sensor-free real-time instrument tracking [10]. These methods rely on real-time video streams from medical devices such as endoscopes and ultrasound probes.…”
Section: Introduction
confidence: 99%
“…Recently, a series of methods have been proposed to segment surgical instruments. García-Peraza-Herrera et al. [3] presented a network based on Fully Convolutional Networks (FCN) and optical flow to solve problems such as occlusion and deformation of surgical instruments. RASNet [4] adopted an attention module to emphasize the target region and improve the feature representation.…”
Section: Introduction
confidence: 99%