2012
DOI: 10.1007/s11701-011-0334-z

An effective visualisation and registration system for image-guided robotic partial nephrectomy

Abstract: Robotic partial nephrectomy is presently the fastest-growing robotic surgical procedure. Compared with traditional techniques it reduces tissue trauma and the likelihood of post-operative infection, while hastening recovery and improving cosmesis. It is also an ideal candidate for image guidance technology, since soft tissue deformation, while still present, is localised and less problematic than in other surgical procedures. This work describes the implementation and ongoing development of a…

Cited by 57 publications (31 citation statements)
References 22 publications

“…Some excellent reviews on augmented reality in laparoscopic surgery and image-guided interventions have been provided by [2], [20], [21]. Among methods that utilized preoperative information, some works focused on manually registering 3D preoperative data on 3D surfaces reconstructed from stereo endoscopic video [25], [31] or on reconstructed transparent 3D intraoperative cone-beam image [34]. Other works focused on feature tracking in which corresponding points on endoscopic video and preoperative data are assumed to be known [28].…”
Section: A. Related Work
confidence: 99%
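The feature-tracking approach mentioned in the excerpt above, where corresponding points on the endoscopic video and the pre-operative data are assumed to be known, admits a closed-form rigid solution. The sketch below is a generic illustration of that idea rather than code from any of the cited works; the helper name `rigid_align` and its inputs are hypothetical.

```python
import numpy as np

def rigid_align(src, dst):
    """Closed-form (Kabsch/Procrustes) rigid alignment.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. features
    identified in the pre-operative model and on the endoscopic
    reconstruction. Returns (R, t) such that dst ~= src @ R.T + t.
    """
    src_c = src - src.mean(axis=0)            # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```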
“…e.g. (Marescaux et al., 2004; Mutter et al., 2010; Nozaki et al., 2012; Pratt et al., 2012)) proposed manual alignment of pre-operatively and intra-operatively acquired images. The majority of (semi-)automatic approaches for registering the endoscopic image data with 3D anatomical data acquired pre- or intra-operatively are either marker-based (Baumhauer et al., 2008; Falk et al., 2005; Ieiri et al., 2011; Marvik et al., 2004; Megali et al., 2008; Mourgues et al., 2003; Simpfendorfer et al., 2011; Suzuki et al., 2008) or use external tracking devices that are initially calibrated with respect to the imaging modality (Ukimura and Gill, 2008; Konishi et al., 2007; Shekhar et al., 2010; Feuerstein et al., 2008; Konishi et al., 2007; Feuerstein et al., 2007; Leven et al., 2005; Blackall et al., 2000).…”
Section: Intra-operative Registration for Augmented Reality Guidance
confidence: 99%
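For the tracker-based approaches described in the excerpt above, registration amounts to composing calibrated transforms between coordinate frames. The following sketch is purely illustrative and uses hypothetical frame names (model, CT, tracker, camera); it is not taken from any of the cited systems, which differ in how the calibration is obtained.

```python
import numpy as np

def homogeneous(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def model_to_camera(T_ct_model, T_tracker_ct, T_cam_tracker):
    """Compose a hypothetical calibration chain: model -> CT -> tracker -> camera.

    T_ct_model:    maps model coordinates into the CT frame (pre-operative).
    T_tracker_ct:  one-off calibration of the tracker to the CT frame.
    T_cam_tracker: live pose of the tracked endoscope in tracker coordinates.
    Returns the 4x4 transform taking pre-operative geometry into camera space.
    """
    return T_cam_tracker @ T_tracker_ct @ T_ct_model
```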
“…However, transferring such plans from the pre-operative frame-of-reference to the dynamic intra-operative scene remains a necessary yet largely unsolved problem. To address this problem, many state-of-the-art methods rely on manual rigid alignment of pre-operative segmentation to intraoperative stereo data (after stereo surface reconstruction) followed by motion tracking [3], [4], [5]. Other works…”
Section: Introduction
confidence: 99%
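As a rough illustration of the pipeline named above, manual rigid alignment followed by refinement against stereo-reconstructed surface data, the sketch below applies point-to-point ICP starting from a manually supplied pose. The input names (`model_pts`, `scene_pts`, `T_init`) are hypothetical and the cited methods differ in their details; this only shows the general technique.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_refine(model_pts, scene_pts, T_init, iters=30):
    """Refine a manual rigid alignment with point-to-point ICP.

    model_pts: (N, 3) vertices sampled from the pre-operative segmentation.
    scene_pts: (M, 3) stereo-reconstructed surface points.
    T_init:    4x4 initial guess, e.g. from manual alignment.
    """
    T = T_init.copy()
    tree = cKDTree(scene_pts)
    for _ in range(iters):
        # Transform the model with the current estimate.
        moved = model_pts @ T[:3, :3].T + T[:3, 3]
        # Match every model point to its nearest scene point.
        _, idx = tree.query(moved)
        matched = scene_pts[idx]
        # Closed-form rigid update (Kabsch) for the matched pairs.
        mc, sc = moved.mean(axis=0), matched.mean(axis=0)
        H = (moved - mc).T @ (matched - sc)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = sc - R @ mc
        dT = np.eye(4)
        dT[:3, :3], dT[:3, 3] = R, t
        T = dT @ T
    return T
```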
“…In addition, the use of biomechanical models can assist the registration of organs' internal structures such as vessels and tumours [8]. Augmenting the surgeons' view with these registered images is non-trivial, where some methods propose using a detailed mesh of the pre-operative model [3], while more recent works focus on selective visualization methods aimed at minimizing information overload [5].…”
Section: Introduction
confidence: 99%
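Whether the overlay is a detailed mesh or a more selective rendering, augmenting the surgeon's view ultimately involves projecting the registered model into the endoscopic image. The minimal sketch below assumes an ideal pinhole camera without lens distortion and hypothetical variable names; it illustrates only the projection step, not any particular system's rendering pipeline.

```python
import numpy as np

def project_overlay(vertices, T_cam_model, K):
    """Project registered model vertices into the endoscopic image.

    vertices:    (N, 3) vertices of the registered pre-operative mesh.
    T_cam_model: 4x4 model-to-camera transform (e.g. from registration).
    K:           3x3 pinhole intrinsics of the calibrated endoscope.
    Returns (N, 2) pixel coordinates at which to draw the overlay.
    """
    cam = vertices @ T_cam_model[:3, :3].T + T_cam_model[:3, 3]
    px = cam @ K.T                     # homogeneous image coordinates
    return px[:, :2] / px[:, 2:3]      # perspective divide
```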