2010
DOI: 10.1007/978-3-642-13711-2_4

Automatic Phases Recognition in Pituitary Surgeries by Microscope Images Classification

Abstract: The segmentation of the surgical workflow may be helpful for providing context-sensitive user interfaces or for generating automatic reports. Our approach focuses on the automatic recognition of surgical phases by microscope image classification. Our workflow, including image feature extraction, image database labeling, Principal Component Analysis (PCA) transformation and 10-fold cross-validation studies, was performed on a specific type of neurosurgical intervention, pituitary surgery. Six p…
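The abstract outlines a pipeline of per-frame feature extraction, PCA projection and 10-fold cross-validation. Below is a minimal sketch of that pipeline, assuming a scikit-learn stack, a synthetic feature matrix and a placeholder SVM classifier; the feature dimensionality, number of frames and classifier choice are illustrative assumptions, not the authors' setup.

```python
# Minimal sketch of the pipeline described in the abstract: labeled frame
# features -> PCA projection -> 10-fold cross-validated phase classifier.
# The feature matrix, phase labels and the SVM choice are placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 128))        # hypothetical per-frame feature vectors
y = rng.integers(0, 6, size=600)       # six surgical phases, as in the abstract

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=20),              # dimensionality reduction step
    SVC(kernel="rbf"),                 # placeholder classifier
)
scores = cross_val_score(model, X, y, cv=StratifiedKFold(n_splits=10))
print(f"10-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```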

Cited by 29 publications (28 citation statements)
References 21 publications
“…Most of the prior work focuses on the detection of the instruments used during surgery or in the operating room [13,14,15,16,17,18,19,20] using techniques such as dynamic time warping, support vector machines and HMMs. However, these techniques use only frame-level features, such as color, texture and shape-based cues.…”
Section: Introduction
confidence: 99%
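A minimal sketch of the frame-level idea this excerpt describes, assuming a single colour-histogram feature per frame and an SVM; the cited systems [13-20] combine colour, texture and shape cues and also use dynamic time warping and HMMs, so this is only an illustration of the frame-level feature step, with made-up frames and labels.

```python
# Hedged sketch: simple per-frame colour features fed to an SVM.
import numpy as np
from sklearn.svm import SVC

def color_histogram(frame, bins=8):
    """Per-channel colour histogram of an RGB frame, concatenated and normalised."""
    hist = [np.histogram(frame[..., c], bins=bins, range=(0, 256))[0]
            for c in range(3)]
    h = np.concatenate(hist).astype(float)
    return h / h.sum()

# Hypothetical data: random RGB frames and binary instrument-presence labels.
rng = np.random.default_rng(1)
frames = rng.integers(0, 256, size=(200, 64, 64, 3))
labels = rng.integers(0, 2, size=200)

X = np.stack([color_histogram(f) for f in frames])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```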
“…To solve this problem, an analysis of endoscopic and microscopic video has been proposed [31-33]. Markov models and dynamic time warping (DTW) were used to identify single worksteps based on the presence of surgical instruments [34-37], and radiofrequency identification (RFID), visual approaches, and weight analysis methods [38] have been employed to recognize instrument use [37,39,40].…”
Section: Data Acquisition By Sensor Systems
confidence: 99%
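To make the DTW idea concrete, the following self-contained sketch aligns two hypothetical sequences of binary instrument-presence vectors with a classic dynamic-programming DTW; the sequences and the Euclidean local cost are assumptions for illustration, not the configuration used in the cited work.

```python
# Illustrative dynamic time warping (DTW) over binary instrument-presence vectors.
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic O(n*m) DTW with Euclidean local cost."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Two hypothetical recordings of instrument usage (frames x instruments).
reference = np.array([[1, 0, 0], [1, 1, 0], [0, 1, 0], [0, 0, 1]], float)
observed  = np.array([[1, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 1], [0, 0, 1]], float)
print(dtw_distance(reference, observed))
```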
“…All of these approaches address different levels of granularity, ranging from gesture recognition (surgemes) [42,44,45] to low-level tasks [33], high-level tasks [46,47], and intervention phases [35,48]. James et al. [49] tried to recognize the current surgical situation indirectly by estimating the positions and movements of the members of the surgical team within the OR or by deriving information from other indirect features.…”
Section: Data Acquisition By Sensor Systems
confidence: 99%
“…We already presented this approach in a previous paper [12]. Each frame was represented by a signature composed of low-level spatial features (RGB space, co-occurrence matrix with Haralick descriptors [13], spatial moments [14], and Discrete Cosine Transform (DCT) [15] coefficients).…”
Section: Other Visual Cues
confidence: 99%
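A hedged sketch of such a per-frame signature, assembling colour statistics, co-occurrence (Haralick-style) texture properties, spatial moments and low-frequency DCT coefficients with NumPy, scikit-image and SciPy; the specific properties, block sizes and truncation are assumptions rather than the exact signature used in [12].

```python
# Sketch of a per-frame signature: colour, texture, moments and DCT cues.
import numpy as np
from scipy.fft import dctn
from skimage.feature import graycomatrix, graycoprops  # named 'greycomatrix' in older skimage
from skimage.measure import moments

def frame_signature(rgb_frame, n_dct=16):
    gray = rgb_frame.mean(axis=2).astype(np.uint8)

    # Colour cue: per-channel mean and standard deviation in RGB space.
    color = np.concatenate([rgb_frame.mean(axis=(0, 1)),
                            rgb_frame.std(axis=(0, 1))])

    # Texture cue: co-occurrence matrix with a few Haralick-style properties.
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    texture = np.array([graycoprops(glcm, p)[0, 0]
                        for p in ("contrast", "homogeneity", "energy", "correlation")])

    # Layout cue: low-order spatial moments of the grey image.
    shape = moments(gray, order=2).ravel()

    # Frequency cue: top-left block of the 2-D DCT (low frequencies).
    freq = dctn(gray.astype(float), norm="ortho")[:4, :4].ravel()[:n_dct]

    return np.concatenate([color, texture, shape, freq])

frame = np.random.default_rng(2).integers(0, 256, size=(64, 64, 3)).astype(np.uint8)
print(frame_signature(frame).shape)
```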