2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society
DOI: 10.1109/iembs.2009.5332720
Ultrasound guided robotic biopsy using augmented reality and human-robot cooperative control

Abstract: Ultrasound-guided biopsy is a proficient minimally invasive approach for tumor staging, but it requires very long training and particular manual and 3D spatial-perception abilities of the physician for planning the needle trajectory and executing the procedure. In order to simplify this difficult task, we have developed an integrated system that provides the clinician with two types of assistance: an augmented reality visualization allows accurate and easy planning of the needle trajectory and target-reaching verifi…

Cited by 12 publications (5 citation statements)
References 18 publications
“…In the proposed paper, the robot provides only assistance, without performing any operation autonomously or in teleoperation. Augmented reality and robotic systems have been combined in different works for performing precise needle insertion in radiofrequency ablation [18], [19] and in ultrasound-guided biopsy [20]. However, in these papers the robotic system is used to perform some parts of the operation autonomously (i.e.…”
Section: Introduction (mentioning, confidence: 99%)
“…As of today, such mixed reality environments are mostly based on high-resolution anatomical CT images that allow construction of a 3-D virtual space (or simulator) through which the surgeon navigates with robotic assistance for training purposes [14][15][16][17][18][19] or for actually performing the surgical procedure [20,21]. In addition to CT images, anatomical information utilized as an input for novel simulation/planning surgical approaches may include ultrasound imaging [22]. Efforts should therefore be undertaken to include also tomographic radionuclide images (as typically obtained with either PET/CT and/or SPECT/CT) for stereoscopic visualization of a complete medical data set that would include both structural/anatomical information and metabolic/functional information.…”
(mentioning, confidence: 99%)
“…Near-infrared binocular vision can reconstruct the arm in three dimensions, but it cannot measure the depth information of veins [26][27][28]. Doppler ultrasound can locate vessels by sensing blood flow and observing the venipuncture in real time [29][30][31][32], but the complexity of image processing in ultrasound instruments restricts their application [33]. The…”
Section: Phantom Venipuncture Experiments (mentioning, confidence: 99%)