2019 International Conference on Control, Automation and Diagnosis (ICCAD)
DOI: 10.1109/iccad46983.2019.9037927
Human machine interface based on virtual reality for programming industrial robots

Cited by 5 publications (1 citation statement)
References 9 publications
“…Conventionally, control terminals in the interactive human-machine interface (HMI), such as touchpads, push buttons, and keyboards, have been constructed in increasingly diversified and creative ways. Usually, the HMI involves sensors that can detect external stimuli such as pressure [1][2][3][4], temperature [5][6][7], and strain [8][9][10][11][12], and they can provide feedback to the users through virtual displays [12][13][14][15] and robot interfaces [12,[16][17][18][19]. Among these stimuli, pressure sensing is of high interest due to its diverse HMI applications, including game control [20,21], soft robotics [22,23], gesture recognition [24][25][26], self-powered accelerometers [27], wearable electronics and machine vibration monitoring [28], and motion-balanced sensors [29].…”
Section: Introduction
Confidence: 99%