2020 23rd International Symposium on Measurement and Control in Robotics (ISMCR)
DOI: 10.1109/ismcr51255.2020.9263775
3D Head Pointer: A manipulation method that enables the spatial localization for a wearable robot arm by head bobbing

Cited by 10 publications (3 citation statements) | References 8 publications
“…In particular, understanding to which body part users attribute their perception of the wearable robotic limbs (perceptual attribution) under synchronization with multiple body parts is crucial for learning manipulation and designing sensory feedback. Previous studies have suggested using head motion (Iwasaki and Iwata, 2018;Oh et al, 2020;Sakurada et al, 2022) and other entities' motion (Hagiwara et al, 2021) as reference body parts for the weighted average method. However, the perceptual attribution of the weighted average of the hand and foot has not been thoroughly investigated.…”
Section: Users' Perceptual Attribution of a Wearable Robotic Limb As ... | Citation type: mentioning | Confidence: 99%
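The "weighted average method" referenced above combines the motions of several reference body parts (e.g. head, hand, foot) into a single command for the wearable robotic limb. A minimal sketch of that idea, assuming simple 3D position tracking — the function name, body-part names, and weights here are illustrative, not taken from the cited papers:

```python
import numpy as np

def weighted_average_target(part_positions, weights):
    """Blend tracked body-part positions into one target position
    for the wearable robotic limb (hypothetical helper).

    part_positions: dict mapping body-part name -> 3D position
    weights: dict mapping body-part name -> scalar contribution
    """
    total = sum(weights.values())  # normalize so weights need not sum to 1
    target = np.zeros(3)
    for name, pos in part_positions.items():
        target += (weights[name] / total) * np.asarray(pos, dtype=float)
    return target

# Example: blend hand and foot positions, weighting the hand more heavily
positions = {"hand": np.array([0.4, 0.1, 0.9]),
             "foot": np.array([0.2, -0.3, 0.0])}
weights = {"hand": 0.7, "foot": 0.3}
print(weighted_average_target(positions, weights))  # → [ 0.34 -0.02  0.63]
```

Questions of perceptual attribution arise precisely because the resulting limb motion is correlated with more than one of the user's own body parts at once.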
“…The objective is to enable users to manipulate these additional robotic limbs with the same naturalness and freedom as their own limbs. Noteworthy contributions have emerged in this field, including the coordination of WRL movements with human limbs using wearable sensors for overhead tasks [12], the control of external limbs through foot and shoulder sensors for everyday tasks [13,14], the use of eye tracking sensors for precise control of WRLs [15], the utilization of chest and abdominal muscles to control WRLs through electromyography (EMG)-based sensors [16], the classification of human intentions towards external limbs through electroencephalography (EEG) [17,18], the integration of human movement with foot EMG for task assistance [19], and shared control strategies that align human intent with robot autonomy [9,17,20]. These contributions have advanced the control of WRLs, yet challenges still remain, such as limited adaptability to different applications, potential constraints on human movement, and stability-related issues.…”
Section: Introduction | Citation type: mentioning | Confidence: 99%
“…At present, there have been studies involving prototype structure design [5][6][7][8][9], joint control technology [4,6,8,10], human-machine collaborative motion planning [11], SRL operation methods [12][13][14][15][16][17], human-machine dynamics modeling [18][19][20], etc. Current operation methods for SRL include EMG mapping [12], EEG mapping [13], foot mapping [14,15], head mapping [16], and eyeball mapping [17]. These operation methods share a common problem of visual area switching.…”
Section: Introduction | Citation type: mentioning | Confidence: 99%