2019 IST-Africa Week Conference (IST-Africa)
DOI: 10.23919/istafrica.2019.8764816
Glove Based Sign Interpreter for Medical Emergencies

Cited by 4 publications (2 citation statements)
References: 5 publications
“…On the other hand, sign language can be translated to and from by mobile systems that use machine learning and electromyography to recognize the signs [29], by instrumented gloves (smart gloves), which will be thoroughly explained in the upcoming section, and by systems that leverage depth cameras, including animated avatars that help show the signs. This technology makes communication between deaf and hearing people more accessible, especially during emergencies [30], when certified interpreters can be hard to find and paramedics and doctors are unable to sign.…”
Section: Figure 3 The Deaf Model of the Communication Process (mentioning, confidence: 99%)
“…This method captures the variation of hand movement, but it cannot capture changes of hand shape. In [21][22][23], the authors combined both curvature and velocity sensors, but hand gesture categories were classified with conventional comparison structures using the simple instruction sets of microcontrollers (e.g. AVR, 8051, with compare-and-check condition instructions), without evaluating or surveying changes in sensor parameters or machine learning algorithms.…”
Section: Introduction (mentioning, confidence: 99%)
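
The "compare and check" classification style mentioned in the excerpt above can be illustrated with a minimal sketch, not taken from the cited papers: each finger's flex (curvature) sensor reading is compared against fixed per-gesture thresholds, the kind of logic that fits the simple instruction sets of an AVR or 8051. All gesture names, template values, and the tolerance below are hypothetical placeholders.

    /* Minimal sketch (not from the cited papers) of threshold-based
     * "compare and check" gesture classification on flex-sensor readings.
     * Template values, tolerance, and gesture labels are hypothetical. */
    #include <stdint.h>
    #include <stdio.h>

    #define NUM_FINGERS  5
    #define NUM_GESTURES 3
    #define TOLERANCE    40   /* allowed deviation per sensor, in ADC counts */

    /* Hypothetical 10-bit ADC readings expected per finger for each gesture. */
    static const uint16_t gesture_template[NUM_GESTURES][NUM_FINGERS] = {
        { 120, 130, 125, 118, 122 },   /* gesture 0: open hand   */
        { 820, 830, 815, 825, 810 },   /* gesture 1: closed fist */
        { 120, 830, 825, 815, 820 },   /* gesture 2: pointing    */
    };

    /* Returns the first gesture whose template every finger matches within
     * TOLERANCE, or -1 if no template matches. */
    int classify_gesture(const uint16_t reading[NUM_FINGERS])
    {
        for (int g = 0; g < NUM_GESTURES; g++) {
            int match = 1;
            for (int f = 0; f < NUM_FINGERS; f++) {
                int diff = (int)reading[f] - (int)gesture_template[g][f];
                if (diff < -TOLERANCE || diff > TOLERANCE) {
                    match = 0;
                    break;
                }
            }
            if (match)
                return g;
        }
        return -1;
    }

    int main(void)
    {
        /* Example readings, as if sampled from five flex sensors on a glove. */
        const uint16_t sample[NUM_FINGERS] = { 815, 828, 812, 820, 808 };
        printf("recognized gesture: %d\n", classify_gesture(sample));  /* prints 1 */
        return 0;
    }

A template-and-tolerance comparison like this needs no training data, which is why it suits small microcontrollers, but it does not adapt to different users or signing styles the way the machine-learning approaches discussed in the citing papers can.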