2018
DOI: 10.46867/ijcp.2018.31.02.03
Hearing Parents’ Use of Auditory, Visual, and Tactile Cues as a Function of Child Hearing Status

Abstract: Parent-child dyads in which the child is deaf but the parent is hearing present a unique opportunity to examine parents’ use of non-auditory cues, particularly vision and touch, to establish communicative intent. This study examines the multimodal communication patterns of hearing parents during a free play task with their hearing (N=9) or deaf (N=9) children. Specifically, we coded parents’ use of multimodal cues in the service of establishing joint attention with their children. Dyad types were compared for …

Cited by 8 publications (7 citation statements) · References 26 publications
“…We have added the multimodality dimension of our coding in light of findings from both social (Baron-Cohen, 1991) and associative (Moore and Corkum, 1994) accounts. More recently, our approach is proving to be consistent with results obtained using eye-tracking (Yu & Smith, 2017a, 2017b), as well as those from observational coding (de Barbaro et al., 2016d), whose finding that infants are particularly responsive to their dyadic partners’ hands is something we documented early on in our own coding (Depowski et al., 2015; Gabouer, Oghalai, & Bortfeld, 2018). Starting with our initial observation of the importance of the caregiver’s hands, we have iteratively fine-tuned our protocol with each data set to better characterize this phenomenon.…”
Section: Development of Coding Protocol (supporting)
confidence: 70%
“…Multimodal cues are a powerful source of information for newborns and young infants in that auditory cues commonly result in visual attention (Kaplan & Werner, 1991; Mendelson et al., 1976). More recent studies support the general idea that multimodal information supports vocabulary development (Trueswell et al., 2016), establishment of category labels (Clark & Estigarribia, 2011), sustained attention (Suarez-Rivera et al., 2019), and joint attention (Gabouer et al., 2018, 2020). By interrogating whether bids that consist of one sensory modality or various combinations of sensory modalities result in more or less joint attention, we can expand our understanding of infant development in general, and the influence of different interaction styles in particular.…”
Section: Tracking Multimodal Cues (mentioning)
confidence: 79%