2016 IEEE SENSORS
DOI: 10.1109/icsens.2016.7808566

Live demonstration: BioSleeve, a wearable hands-free gesture control interface

Cited by 4 publications (5 citation statements)
References 3 publications
“…Many researchers have integrated haptic or auditory technology for navigational control, navigation assistance, or feedback in assistive mobility devices. Wearable devices such as the Jet Propulsion Laboratory (JPL) BioSleeve [44, 45], the wireless tongue drive system to smartphone (iPhone) to electric powered wheelchair interface (TDS-iPhone-PWC) [46], and the MyoSuit [47] were designed using haptic technology for navigational control and aided mobility. The JPL BioSleeve is a wearable, hands-free gesture recognition interface that decodes as many as 20 discrete hand and finger gestures and can estimate the continuous pose of the arm.…”
Section: Results
confidence: 99%
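The quoted description notes that, beyond discrete gestures, the BioSleeve can estimate the continuous pose of the arm. As a rough illustration only, not the BioSleeve's published algorithm, the Python sketch below fuses accelerometer and gyroscope readings from an arm-worn IMU into pitch and roll estimates with a complementary filter; the sampling interval, blend factor, and axis conventions are assumptions.

```python
# Minimal sketch (not the BioSleeve's published method): continuous arm
# orientation (pitch/roll) from an arm-worn IMU via a complementary filter.
# Sampling interval dt, blend factor alpha, and axis conventions are
# illustrative assumptions.
import math

def complementary_filter(accel, gyro, dt=0.01, alpha=0.98):
    """Fuse accelerometer (g) and gyroscope (rad/s) streams into pitch/roll."""
    pitch, roll = 0.0, 0.0
    for (ax, ay, az), (gx, gy, _gz) in zip(accel, gyro):
        # Gravity-referenced angles from the accelerometer: noisy, drift-free.
        acc_pitch = math.atan2(-ax, math.hypot(ay, az))
        acc_roll = math.atan2(ay, az)
        # Integrated gyro rates: smooth but drifting; blend the two sources.
        pitch = alpha * (pitch + gy * dt) + (1 - alpha) * acc_pitch
        roll = alpha * (roll + gx * dt) + (1 - alpha) * acc_roll
        yield pitch, roll

# Example with a static arm: gravity along +z, no rotation.
accel = [(0.0, 0.0, 1.0)] * 100
gyro = [(0.0, 0.0, 0.0)] * 100
for pitch, roll in complementary_filter(accel, gyro):
    pass
print(f"pitch={pitch:.3f} rad, roll={roll:.3f} rad")
```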
“…This information is obtained from the environment and the user via a means of communication (usually an interface). USIs can send and receive information between the user (an individual with some form of disability) and the mobility aid via a communication channel that may be wired or wireless: the JPL BioSleeve [44, 45] and the TDS-iPhone-PWC interface (Table 3) [46] control a mobility aid wirelessly, the P300-based BCI (Table 2) [74] controls a wheelchair via a wired USB channel, and the autonomous scooter navigation mobility aid [13] connects its computing module to its hardware unit via a wired USB medium or a wireless Bluetooth medium. With the help of IoMT, interconnectivity between mobile devices and their environment, along with the storage or retrieval of relevant information for control, better autonomy, and monitoring, becomes possible.…”
Section: Results
confidence: 99%
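The passage above contrasts wired (USB) and wireless (Bluetooth) channels between a user-side interface and the mobility aid it controls. The sketch below, with entirely hypothetical class names, command vocabulary, address, and device path, shows how a decoded gesture command could be forwarded over either kind of channel behind a common send() method; it is not drawn from any of the cited systems.

```python
# Hypothetical sketch of a user-side interface forwarding commands to a
# mobility aid over a wired or wireless channel. All names, the command
# format, the address, and the device path are illustrative assumptions.
import json
import socket

class WirelessChannel:
    """Send commands over a TCP socket (stand-in for a wireless link)."""
    def __init__(self, host: str, port: int):
        self.addr = (host, port)

    def send(self, command: dict) -> None:
        payload = json.dumps(command).encode("utf-8") + b"\n"
        with socket.create_connection(self.addr, timeout=2.0) as sock:
            sock.sendall(payload)

class WiredChannel:
    """Write commands to a serial-style device file (stand-in for wired USB)."""
    def __init__(self, device_path: str):
        self.device_path = device_path

    def send(self, command: dict) -> None:
        with open(self.device_path, "wb") as dev:
            dev.write(json.dumps(command).encode("utf-8") + b"\n")

def forward_gesture(channel, gesture: str) -> None:
    """Map a decoded gesture to a drive command and push it to the aid."""
    drive_commands = {"fist": "stop", "point": "forward", "wave": "turn_left"}
    channel.send({"cmd": drive_commands.get(gesture, "stop")})

# Usage (requires a listening aid or device file, hence commented out):
# forward_gesture(WirelessChannel("192.168.0.10", 9000), "point")
# forward_gesture(WiredChannel("/dev/ttyUSB0"), "fist")
```

Either transport satisfies the same one-method interface, which is the point the quoted text makes about interchangeable wired and wireless channels.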
“…Electronic mobility cane (CVI, haptic, and auditory): liquid detection, 6 ultrasonic sensors, a metal detector, a microvibration motor, and a mono earphone [51]; it has not yet been integrated and tested with assistive mobility aids to determine its applicability. Jet Propulsion Laboratory BioSleeve (haptic): electromyography and IMU sensors [44, 45]; it provides intuitive control of robotic platforms by decoding as many as 20 discrete hand and finger gestures with a multiclass support vector machine classifier, but fall and near-fall detection was not considered in its design and implementation.…”
Section: An Alternative USI for People With Spinal Cord Injury or U…
confidence: 99%
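The table excerpt above attributes the BioSleeve's decoding of up to 20 discrete gestures to a multiclass support vector machine over electromyography and IMU signals. The sketch below is an illustrative reconstruction of that kind of classifier using scikit-learn on synthetic data; the features (mean absolute value, waveform length, mean inertial reading), window length, and channel counts are assumptions rather than the authors' reported pipeline.

```python
# Illustrative multiclass SVM gesture classifier over EMG + IMU window
# features (not the authors' pipeline; features and shapes are assumptions).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def window_features(emg, imu):
    """Summarize one analysis window (EMG: samples x channels, IMU: samples x 6)."""
    mav = np.mean(np.abs(emg), axis=0)                  # mean absolute value per EMG channel
    wl = np.sum(np.abs(np.diff(emg, axis=0)), axis=0)   # waveform length per EMG channel
    imu_mean = np.mean(imu, axis=0)                     # mean accel/gyro reading per axis
    return np.concatenate([mav, wl, imu_mean])

# Synthetic stand-in data: 200 labelled windows, 16 EMG channels, 20 gesture classes.
rng = np.random.default_rng(0)
X = np.stack([window_features(rng.standard_normal((100, 16)),
                              rng.standard_normal((100, 6)))
              for _ in range(200)])
y = rng.integers(0, 20, size=200)

# SVC handles the multiclass case internally (one-vs-one by default).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X, y)
print(clf.predict(X[:5]))
```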