2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017)
DOI: 10.1109/fg.2017.93
What Makes a Gesture a Gesture? Neural Signatures Involved in Gesture Recognition

Abstract: Previous work in the area of gesture production has made the assumption that machines can replicate "humanlike" gestures by connecting a bounded set of salient points in the motion trajectory. Those inflection points were hypothesized to also display cognitive saliency. The purpose of this paper is to validate that claim using electroencephalography (EEG). That is, this paper attempts to find neural signatures of gestures (also referred to as placeholders) in human cognition, which facilitate the understanding, …

Cited by 9 publications (8 citation statements) | References 20 publications
“…The Occipital Cluster showed a decrease in EEG power at 10 Hz […] gesture classes (75%) showed a mu peak even before any inflection points were registered in the gesture class, with an average response of 300 ms. This is consistent with previous results found in a preliminary study reported by Cabrera et al. (2017). Additionally, a previous EEG study also showed mu suppression within the first second of gesture viewing, regardless of whether the gesture was communicative or meaningless in nature; this is a general response within the motor cortex, distinct from the visual cortical response apparent at occipital electrodes (Streltsova et al., 2010).…”
Section: Descriptive Analysis (supporting)
confidence: 92%
“…The purpose of this work is to analyze neural signatures from electroencephalographic (EEG) data recorded while observing gestures, to determine the relationship between key motion components of each gesture and oscillations in the mu frequency band, associated with the motor cortex. The present work is an extension of the analysis presented in Cabrera et al. (2017), in which a preliminary analysis was conducted on a reduced dataset to determine the existence of a relationship between salient kinematic points in observed gestures and oscillations in EEG potentials associated with activation of both the motor and visual cortices. The current work extends that analysis in order to corroborate this relationship.…”
Section: Introduction (mentioning)
confidence: 99%
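The excerpt above describes aligning salient kinematic points of observed gestures with power changes in the mu frequency band (roughly 8–13 Hz) of the EEG. As a minimal sketch of that kind of measure — not the authors' actual pipeline, and with the sampling rate, filter order, and window length chosen purely for illustration — one could estimate mu-band power in sliding windows and then compare it against inflection-point timestamps:

```python
# Illustrative sketch (assumed parameters, not the paper's method):
# estimate mu-band (8-13 Hz) power of one EEG channel in
# non-overlapping windows, the kind of time series one could align
# with gesture inflection-point times.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed EEG sampling rate in Hz

def mu_band_power(signal, fs=FS, band=(8.0, 13.0), win_s=0.5):
    """Band-pass the signal to the mu band, then average the squared
    amplitude within non-overlapping windows of win_s seconds."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    mu = filtfilt(b, a, signal)          # zero-phase band-pass filter
    win = int(win_s * fs)
    n_win = len(mu) // win
    return np.array([np.mean(mu[i * win:(i + 1) * win] ** 2)
                     for i in range(n_win)])

# Synthetic sanity check: a pure 10 Hz oscillation carries far more
# mu-band power than broadband noise of comparable amplitude.
t = np.arange(0, 4, 1 / FS)
rng = np.random.default_rng(0)
alpha = np.sin(2 * np.pi * 10 * t)           # strong in-band component
noise = rng.normal(scale=1.0, size=t.size)   # broadband noise only
p_alpha = mu_band_power(alpha).mean()
p_noise = mu_band_power(noise).mean()
```

A suppression (mu desynchronization) around an inflection point would then show up as a drop in this windowed power relative to a pre-gesture baseline.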
“…This lag is consistent with the notion that IPs may be utilized as placeholders involved in conscious gesture categorization. This is the first evidence (from cognitive, objective, and empirical studies) that these specific landmarks represent stronger footprints in people's memory than other points (Cabrera et al., 2017a).…”
Section: Methods (mentioning)
confidence: 66%
“…The variability among the examples did not come from the robotic performance, but from the method used to generate the gesture instances. In a broader research perspective, the use of a robotic platform to recognize the artificially generated gestures opens the possibility of studying the coherence in recognition between humans and machines, alternating the roles of executing and recognizing the gestures (Cabrera et al., 2017b).…”
Section: Robotic Implementation (mentioning)
confidence: 99%
“…Verbal and non-verbal communication processes are fundamental in individuals’ lives, characterizing their daily interactions and conveying information with different purposes [1]. Specifically, while words connote verbal interactions, non-verbal interactions are mediated by the use of gestures, which can be considered facilitators of social interactions [2, 3, 4, 5] via their influence on communicative meaning and interpersonal exchange dynamics [6].…”
Section: Introduction (mentioning)
confidence: 99%