2020 IEEE International Conference on Human-Machine Systems (ICHMS)
DOI: 10.1109/ichms49158.2020.9209521
Emulated haptic shared control for brain-computer interfaces improves human-robot cooperation

Abstract: Today, technology provides many ways for humans to exchange their points of view about pretty much everything. Visual, audio and tactile media are most commonly used by humans, and they support communication in such a natural way that we don't even actively think about using them. But what about people who have lost motor or sensory capabilities for whom it is difficult or impossible to control or perceive the output of such technologies? In this case, perhaps the only way to communicate might be to use brain …

Cited by 6 publications (2 citation statements)
References 18 publications
“…Gestures not only allow front end interaction with others with no language barrier but also remote interactions with a smart screen, virtual-reality and augmented-reality objects. Digital computing is made possible thanks to gesture-based human-computer interfaces [1] from everyday life physical objects like lights, mirrors, doorknobs, notebooks to the specific and focused tasks like driving a car or a medical surgery. Gesture study helps to further understand semantic meaning with contextual information about a person's behaviour and reactions.…”
Section: Introduction (mentioning, confidence: 99%)
“…• The robot's information can be received from the environment through the use of sensors (e.g., ultrasonic, gyroscope, contact) as well as high-performance sensors (temperature, acceleration, energy consumption), with intelligent, real-time data collection in order to determine the state of the machine (Fujishima et al, 2017; M. P. Pacaux-Lemoine et al, 2020; Zhou et al, 2019).…”
Section: Connection Layer (unclassified)