Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI 2019)
DOI: 10.1145/3290605.3300445
Stroke-Gesture Input for People with Motor Impairments

Cited by 31 publications (7 citation statements)
References 76 publications
“…An extensive literature exists on assistive technology for users with motor impairments and a variety of computing devices, from desktop PCs [30,31] to tabletops [61], mobile devices [49,60,63,105], and wearables [55][56][57]. This literature has reported user performance with a wide range of input modalities, from touch input [31,38,61] to gesture [13,85,91], voice [18,40,41], eye gaze [16,46,70,102,103], and brain-computer input [28,62]. To mention a few examples, Smart Touch [61] is an accurate template matching technique designed to improve the performance of users with upper body motor impairments when selecting targets on touchscreens; Programming by Voice [40] is an interface that enables users with motor impairments to operate programming environments by speaking the code instead of using the mouse and keyboard; and EyeWrite [46] is a technique designed for eye-based text entry using letter-like gestures [100].…”
Section: Assistive Input Technology for Users with Motor Impairments
confidence: 99%
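For readers unfamiliar with the template-matching approach mentioned above (e.g., Smart Touch, or the $-family recognizers commonly used for stroke gestures), the following is a minimal sketch of nearest-neighbor matching over resampled and normalized stroke points. It is an illustrative assumption, not the implementation of Smart Touch or of the surveyed paper's method; the function names, the 64-point resampling length, and the template format are hypothetical.

```python
import math

N_POINTS = 64  # resampling length; an illustrative choice, not taken from the cited systems

def path_length(points):
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=N_POINTS):
    """Resample a stroke to n roughly equidistant points along its path."""
    interval = path_length(points) / (n - 1)
    if interval == 0:
        return [points[0]] * n
    pts = list(points)
    resampled = [pts[0]]
    accumulated = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if accumulated + d >= interval and d > 0:
            t = (interval - accumulated) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)      # q becomes the start of the next segment
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    while len(resampled) < n:     # guard against floating-point shortfall
        resampled.append(points[-1])
    return resampled[:n]

def normalize(points):
    """Translate to the centroid and scale to a unit bounding box."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    translated = [(x - cx, y - cy) for x, y in points]
    xs, ys = [p[0] for p in translated], [p[1] for p in translated]
    size = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [(x / size, y / size) for x, y in translated]

def recognize(candidate, templates):
    """Return the label of the stored template closest to the candidate stroke."""
    processed = normalize(resample(candidate))
    best_label, best_dist = None, float("inf")
    for label, template in templates:
        ref = normalize(resample(template))
        d = sum(math.dist(p, q) for p, q in zip(processed, ref)) / len(ref)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label, best_dist
```

Here, templates would be a list such as [("circle", circle_points), ("arrow", arrow_points)], where each entry is a sequence of (x, y) samples recorded from the touchscreen; the recognizer simply reports the nearest stored example.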
“…Such prior developments have been possible by means of careful analysis and understanding of the accessibility problems encountered by people with motor impairments in the physical world; see Anthony et al [5], Kane et al [49], Naftali and Findlater [63], and Mott et al [60] for examples of studies unveiling accessibility issues, interaction challenges, coping strategies, and adaptations adopted by people with motor impairments to use input devices and user interfaces. Regarding input on mobile devices with touchscreens, Vatavu and Ungurean [91] released the largest dataset of stroke gestures collected from users with motor impairments, with which they reported results regarding user performance (e.g., production time) and system performance (e.g., gesture recognition accuracy).…”
Section: Assistive Input Technology for Users with Motor Impairments
confidence: 99%
“…Although such contributions remain rare overall, datasets have been released in other communities of HCI, such as data on gestures articulated by people with upper-body motor impairments on touchscreen mobile devices [39], or the VizWiz datasets (https://vizwiz.org), meant to foster the development of computer vision algorithms for assistive technologies and people who are blind.…”
confidence: 99%