The literature reports benefits of multimodal interaction with the maternal voice for preterm dyads in kangaroo care, but little is known about multimodal interaction and vocal modulation between mothers and their preterm twins. This study aims to deepen knowledge of multimodal interaction (maternal touch, the mother's and infants' vocalizations, and the infants' gaze) between a mother and her preterm twin infants (twin 1 [female] and twin 2 [male]) during speech and humming in kangaroo care. A microanalytical case study was carried out using ELAN, PRAAT, and MAXQDA software (Version R20.4.0). Descriptive and comparative analyses were performed using SPSS software (Version 27). We observed: (1) significantly longer humming phrases to twin 2 than to twin 1 (p = 0.002), (2) significantly longer instances of maternal touch during humming than during speech to twin 1 (p < 0.001), (3) a significant increase in the pitch of maternal speech after twin 2 gazed (p = 0.002), and (4) a significant increase in the pitch of humming after twin 1 vocalized (p = 0.026). This exploratory study raises questions about the role of maternal touch during humming in kangaroo care, as well as about the mediating role of the infant's gender and of visual and vocal behavior in the tonal change of humming or speech.
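As a minimal, hypothetical illustration (not the authors' actual pipeline), the mean fundamental frequency (F0) of a vocalization segment can be extracted with the praat-parselmouth Python interface to PRAAT and compared before and after an infant gaze or vocalization event; the file name, event time, and 2-second windows below are assumptions for illustration only.

```python
import numpy as np
import parselmouth  # praat-parselmouth: Python interface to PRAAT (assumed toolchain)

def mean_f0(wav_path, t_start, t_end):
    """Mean fundamental frequency (Hz) of one vocalization segment,
    e.g. a humming or speech phrase before or after an infant gaze event."""
    segment = parselmouth.Sound(wav_path).extract_part(from_time=t_start, to_time=t_end)
    pitch = segment.to_pitch()
    f0 = pitch.selected_array['frequency']
    return float(np.mean(f0[f0 > 0]))  # ignore unvoiced frames (F0 == 0)

# Hypothetical before/after comparison around a gaze event at t = 12.4 s,
# using 2-second windows (placeholder values, not study data).
before = mean_f0("session.wav", 10.4, 12.4)
after = mean_f0("session.wav", 12.4, 14.4)
print(f"Mean F0 before: {before:.1f} Hz, after: {after:.1f} Hz")
```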
This document describes acoustic reflectometry, the underlying theory, and its implementation as a technique for measuring cylindrical pipes. It describes the construction of the system, the materials used, and how the system captures signals (reflections) from the pipe under evaluation, and it presents the results obtained from measuring pipes of different lengths. The work also proposes a simple software application for analyzing measurements made with acoustic reflectometry, together with conclusions and recommendations for future work.
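As a rough sketch of the underlying principle (not the system described here), the pipe length follows from the round-trip travel time of the reflected pulse, L = c·Δt/2, with c the speed of sound in air; the peak-picking strategy and constants below are simplifying assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C (assumed; correct for temperature in practice)

def estimate_pipe_length(recording, sample_rate, pulse_onset_sample):
    """Estimate pipe length (m) from one recorded reflection.

    `recording` is the microphone signal captured at the open end of the pipe;
    `pulse_onset_sample` is the sample index at which the excitation pulse was
    emitted (assumed known from the playback trigger).
    """
    # Skip roughly 1 ms so the direct pulse is not mistaken for the echo,
    # then take the strongest remaining peak as the reflection.
    search_start = pulse_onset_sample + int(0.001 * sample_rate)
    echo_index = search_start + int(np.argmax(np.abs(recording[search_start:])))
    round_trip_time = (echo_index - pulse_onset_sample) / sample_rate
    return SPEED_OF_SOUND * round_trip_time / 2.0  # one-way length: L = c * dt / 2
```

For a 1 m pipe, the echo arrives roughly 5.8 ms after the pulse (2 m / 343 m/s), which indicates the time resolution and sampling rate the capture system needs.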
Most studies of emotional responses have used unimodal stimuli (e.g., pictures or sounds) or congruent bimodal stimuli (e.g., video clips with sound), but little is known about the emotional response to incongruent bimodal stimuli. The aim of the present study was to evaluate the effect of congruence between the auditory and visual components of bimodal stimuli on heart rate and on self-reported measures of the emotional dimensions of valence and arousal. Subjects listened to pleasant, neutral, and unpleasant sounds accompanied by videos with or without content congruence while heart rate was recorded; the valence and arousal of each bimodal stimulus were then self-reported. The results showed that heart rate depended on the valence of the sounds but not on the congruence of the bimodal stimuli, whereas valence and arousal scores changed depending on congruence. These results suggest that the congruence of bimodal stimuli affects the subjective perception of emotion.