2019
DOI: 10.1109/tnsre.2019.2894517

Using the Motion of the Head-Neck as a Joystick for Orientation Control

Cited by 9 publications (5 citation statements)
References 22 publications
“…During the Baseline conditions, participants used the exoskeletons with the motors unpowered. We have previously shown that the Columbia Exo is very back-drivable when unpowered [23]. The EMG observed during Baseline sessions was similar for the two devices, indicating that the structural improvements in the Utah Exo did not increase the effort required to back-drive the exoskeleton when the motors were unpowered.…”
Section: Columbia Exo Mean Rating (Median) Interquartile Range
confidence: 80%
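The statement above compares muscle effort across Baseline sessions via EMG. As a rough, hedged illustration of how such a comparison might be quantified, the sketch below computes the RMS amplitude of two EMG recordings; the synthetic signals, sampling details, and the `rms_emg` helper are assumptions for illustration, not the cited study's analysis pipeline.

```python
import numpy as np

def rms_emg(signal):
    """Root-mean-square amplitude of an EMG recording."""
    signal = np.asarray(signal, dtype=float)
    return np.sqrt(np.mean(signal ** 2))

# Hypothetical baseline recordings (arbitrary synthetic data, ~1 kHz, in volts).
rng = np.random.default_rng(0)
emg_columbia = 0.05 * rng.standard_normal(5000)  # unpowered Columbia Exo session
emg_utah = 0.05 * rng.standard_normal(5000)      # unpowered Utah Exo session

# A ratio near 1.0 would suggest comparable effort to back-drive either device.
ratio = rms_emg(emg_utah) / rms_emg(emg_columbia)
print(f"RMS ratio (Utah / Columbia): {ratio:.2f}")
```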
“…During the Baseline and Visual conditions, the visual feedback was provided through an avatar interface [22], [23]. On this interface, two avatars were overlaid: one with solid colors was used to provide motion instructions (“target avatar”) and the other with translucent colors was used to reflect the actual head-neck motions of the participant, measured by the IMU sensors.…”
Section: Protocol
confidence: 99%
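The avatar interface described above pairs a solid target pose with a translucent avatar driven by IMU-measured head-neck orientation. A minimal sketch of how the mismatch between the two poses might be quantified is below, assuming unit quaternions in [w, x, y, z] order; the quaternion helpers and the `tracking_error_deg` function are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def quat_conjugate(q):
    """Conjugate of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_multiply(q1, q2):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def tracking_error_deg(q_target, q_actual):
    """Angle (degrees) of the rotation taking the actual
    head orientation onto the target orientation."""
    q_err = quat_multiply(quat_conjugate(q_actual), q_target)
    w = np.clip(abs(q_err[0]), 0.0, 1.0)  # guard against numerical drift
    return np.degrees(2.0 * np.arccos(w))

# Example: the actual pose lags a 10-degree yaw target.
theta = np.radians(10.0)
q_target = np.array([np.cos(theta/2), 0.0, 0.0, np.sin(theta/2)])
q_actual = np.array([1.0, 0.0, 0.0, 0.0])  # identity: looking straight ahead
print(f"tracking error: {tracking_error_deg(q_target, q_actual):.1f} deg")
```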
“…Head-neck interfaces have the potential to command and control orientation tasks when the hand-wrist is not available for use as a joystick. In [109], the authors reported the use of a human-like avatar to analyze head trajectories by comparing actual and target positions in a visual interface (a computer screen). The robotic neck brace was validated both for head motion measurements and for providing physical assistance through servomotors.…”
Section: Sensors 2020, 20, x (For Peer Review, 15 of 28)
confidence: 99%
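As a rough illustration of the head-as-joystick idea this statement refers to, the sketch below maps IMU-measured head pitch and yaw onto a bounded two-axis command with a deadzone and saturation. The angle thresholds and the axis-to-command mapping are arbitrary assumptions for the sketch, not parameters from the cited paper.

```python
def head_to_joystick(pitch_deg, yaw_deg,
                     deadzone_deg=5.0, max_deg=30.0):
    """Map head pitch/yaw angles onto joystick-style commands in [-1, 1].

    Angles inside the deadzone produce no command; beyond it the
    command grows linearly and saturates at max_deg. All thresholds
    here are illustrative, not taken from the cited study.
    """
    def axis(angle):
        sign = 1.0 if angle >= 0 else -1.0
        mag = abs(angle)
        if mag <= deadzone_deg:
            return 0.0
        mag = min(mag, max_deg)
        return sign * (mag - deadzone_deg) / (max_deg - deadzone_deg)

    # Forward/back command from pitch, left/right command from yaw.
    return axis(pitch_deg), axis(yaw_deg)

# Example: head tilted 20 deg down, turned 40 deg right (yaw saturates).
print(head_to_joystick(-20.0, 40.0))  # -> (-0.6, 1.0)
```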
“…Main Application | Sensors | References
Movement coding for control keyboards and displays for patients with ALS and people with upper- and lower-limb palsy | EEG / EOG / facial EMG / inertial | [103,113,114]
Control and implementation of tasks and human-machine interfaces such as wheelchairs, smart shoes, and robots | Inertial / flex sensor / camera / ultrasonic / EOG / EEG / Kinect / force / torque / FSR / infrared | [107-112,120,128]
Emotion recognition for patients with palsy or autism spectrum disorder | Camera / movement / sound / infrared | [115,117-119]
Gesture recognition to aid communication between deaf people and hearing listeners | Flex sensor / inertial / EMG | [122-127]…”
Section: Main Application / Sensors / References
confidence: 99%
“…disorders [5]. In addition, hands-free interaction with HCIs can help people with impairments integrate into the workforce, for example through orientation controls that allow them to participate in daily activities [6]. A head-mounted inertial interface was employed in [7] for patients with cerebral palsy.…”
Section: Introduction
confidence: 99%