2019
DOI: 10.1145/3297277

Developing a Hand Gesture Recognition System for Mapping Symbolic Hand Gestures to Analogous Emojis in Computer-Mediated Communication

Abstract: Recent trends in computer-mediated communication (CMC) have not only expanded instant messaging through the use of images and videos but have also enriched traditional text messaging with visual communication markers (VCMs) such as emoticons, emojis, and stickers. VCMs can prevent the loss of subtle emotional content in CMC that is otherwise delivered by nonverbal cues conveying affective and emotional information. However, as the number of VCMs grows in the select…
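The core idea in the title, mapping a recognized symbolic hand gesture to an analogous emoji, can be illustrated with a minimal sketch. The gesture labels and emoji pairings below are illustrative assumptions, not the vocabulary or method used in the paper: a recognizer is assumed to emit a symbolic label, and a lookup resolves it to an emoji.

```python
# Illustrative sketch only: the gesture label set and emoji pairings are
# assumptions for demonstration, not the paper's actual gesture vocabulary.
GESTURE_TO_EMOJI = {
    "thumbs_up": "\U0001F44D",   # 👍
    "ok_sign": "\U0001F44C",     # 👌
    "victory": "\u270C\uFE0F",   # ✌️ (with emoji presentation selector)
    "wave": "\U0001F44B",        # 👋
}

def gesture_to_emoji(label: str) -> str:
    """Resolve a recognized gesture label to an analogous emoji.

    Falls back to a bracketed plain-text tag when no mapping exists,
    so the message still carries the intended meaning.
    """
    return GESTURE_TO_EMOJI.get(label, f"[{label}]")
```

For example, `gesture_to_emoji("thumbs_up")` returns 👍, while an unrecognized label such as `"shrug"` degrades gracefully to `"[shrug]"` rather than being dropped.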

Cited by 20 publications (5 citation statements)
References 58 publications
“…Besides, conventional means of accessing visual communication markers (VCMs) rely on input entry methods that are not directly and intimately tied to expressive nonverbal cues. Koh et al. [14] addressed this issue by facilitating an alternative form of VCM entry: hand gestures. Moreover, to fill this gap, Dai et al. [7] presented a training system, called CAPG-MYO, for user-defined hand gesture interaction.…”
Section: User Interfaces For User-Defined Hand Gesture
confidence: 99%
“…An earlier survey regarding smartglasses input defines four categories for classifying input techniques on smartglasses [1]. These categories are: on-device interaction [64], on-body interaction [75] [95], hands-free interaction (e.g., gaze [88] and whole-body gestures [93] [96] for emoji inputs [140]), and freehand interaction with wearable cameras [86] or sensors inside the closed environment of a vehicle [120], showing clear boundaries among the four categories. Nevertheless, as reflected in the most recent works, these boundaries are becoming more blurred.…”
Section: Emerging Hybrid Interfaces
confidence: 99%
“…[column layout inferred: Ref | Category | Sensing | Characteristic | Application]
(…) | The coexistence of tactile feedback and on-skin devices | Designing tactile feedback with epidermal devices
[140] | On-body | Computer vision (headset) | Similarity between body gestures and emojis | Emoji input
[72] | On-body | Tactile | Alternative user perceptions other than visual and audio loads | Feel-through feedback on sensitive skin surface
[98] | On-body | Tactile | Alternative user perceptions other than visual and audio loads | Feel-through feedback on skin hair
[84] | On-body | Radar | Micro-gestures between fingers of one hand | Gesture-to-command
[78] | On-body | Infrared | Thumb-to-fingertip micro-gestures (e.g., circle, triangle, rub) | Gesture-to-command
[86] | On-body | Computer vision (wrist-worn) | Micro-gestures between fingers of one hand | Gesture-to-command
[120] | On-body | Computer vision | Intuitive hand gestures for controlling in-vehicle interfaces | Gesture-to-command
[171] | On-body | Computer vision | Learning new hand gestures | Gesture-to-command
[71] | On-body | Electrodes and circuits | Touch on sensitive and spacious skin surface, accommodating various gestures | Gesture-to-command
[81] | On-body | Near-field communication (NFC) | Enabling the spacious skin surface of the human body as a control device | Gesture-to-command
[99] | On-body | RFID | Micro-gestures driven by thumb-to-fingertip interaction | Gesture-to-command
[91,92] | On-body | Electrodes and circuits | Touch events between two hands of two people | In-city social events by interpersonal interaction
[69] | On-body | Fluidic material | Alternative user perceptions other than visual and audio loads | Information display and notifications
[85] | On-body | LED lights | The spacious skin surface becomes a swift and easy-to-reach channel for notifications | Information display and notifications
[80] | On-body | Touch-sensitive surface | Miniature-size interface on a nail and fingertip-to-nail interaction…”
Section: Readability
confidence: 99%
“…In recent years, research on emoji has attracted more and more attention. Emojis are studied in some computer-mediated communications [29], such as Twitter [30,31] and posts [32][33][34].…”
Section: Emoji In Sentiment Analysis
confidence: 99%