2017
DOI: 10.1016/j.jbi.2017.07.009

On the utility of 3D hand cursors to explore medical volume datasets with a touchless interface

Abstract: Analyzing medical volume datasets requires interactive visualization so that users can extract anatomo-physiological information in real time. Conventional volume rendering systems rely on 2D input devices, such as mice and keyboards, which are known to hamper 3D analysis as users often struggle to obtain the desired orientation that is only achieved after several attempts. In this paper, we address which 3D analysis tools are better performed with 3D hand cursors operating on a touchless interface comparative…

Cited by 21 publications (7 citation statements)
References 15 publications

“…Interaction with augmented reality included a variety of technologies that enable superimposed 3D representation of content and interaction with it. • Manipulating objects in VR/3D (Zimmerman et al, 1987, O'Hagan et al, 2002, New et al, 2003, Deller et al, 2006, Moustakas et al, 2009, Wright et al, 2011, Djukic et al, 2013, Jacob and Wachs, 2014, Kim and Park, 2014, Al-Sayegh and Makatsoris, 2015, Covarrubias et al, 2015, Lopes et al, 2017, Nicola et al, 2017, Park and Lee, 2018, Togootogtokh et al, 2018, Vosinakis and Koutsabasis, 2018). These were systems that primarily focus on manipulation of objects in either 3D or VR specifically, rather than unique AR solutions, or applications specifically aimed at CAD.…”
Section: Manipulation/navigation (mentioning)
Confidence: 99%
“…Comparing this accuracy rate to the prior art, to our knowledge most of the existing research on advanced image manipulation interfaces only reports qualitative metrics or time of task completion (e.g., (9-11)) and lacks quantitative analyses on spatial image control accuracy. However, compared to a study that implemented the closest counterpart to our 3D spatial accuracy metric (16), and despite the substantial differences in implementation of the metric and tasks, we found that our image control accuracy was better than the 3D target accuracy reported in that study (on average from 32.0 mm to 90.3 mm), despite the fact that our interface did not rely on any gesture recognition algorithms.…”
Section: Discussion (mentioning)
Confidence: 97%
“…This was followed by several other publications that utilized image-based gesture recognition for medical image manipulation (10). More recently, conceptually similar approaches have been introduced that provide the possibility of remote, touchless interaction with medical imagery based on gesture recognition using depth (i.e., RGB-D) sensors (e.g., (11-16)). Performing such gestures requires certain movements of either one or both hands, rendering such technologies limited for interventions where both of the surgeons' hands are occupied.…”
Section: Related Work (mentioning)
Confidence: 99%
“…For example, hand gesture based interaction is used to reduce the need for user training prior to application use (Buchmann et al, 2004, Kim et al, 2005a, Beyer and Meier, 2011). Sensor-based hand detection enables touchless operation, guaranteeing sterility or safer interaction in the medical field (Lopes et al, 2017). Hand gesture or full body gesture interaction supports immersion using Virtual Reality (VR) or Augmented Reality (AR) (Deller et al, 2006).…”
Section: Effect Of Recent Advancements In Technology On Gesture Research (mentioning)
Confidence: 99%
“…Reviews show that they have been extensively researched since the 80s, as a natural and intuitive means of interaction between humans and computerised systems (Pisharady and Saerbeck, 2015, Rautaray and Agrawal, 2015, Santos et al, 2015, Milani et al, 2017, Al-Shamayleh et al, 2018). Hand gestures are explored as a means that allows for touchless interaction, providing specific benefits such as hygiene (Lopes et al, 2017), undivided attention focused on the main task (Riener et al, 2013), a more natural fit for interaction with household items (Dinh et al, 2014), large displays (Foehrenbach et al, 2009), or interaction with three-dimensional (3D) objects in Virtual Reality (VR)/Augmented Reality (AR) supported spaces (Lee, 2016, Memo and Zanuttigh, 2018).…”
Section: Introduction (mentioning)
Confidence: 99%