Recent evidence suggests that preverbal infants' gaze following is triggered only if an actor's head turn is preceded by the expression of communicative intent [1]. Such connectedness between ostensive and referential signals may be uniquely human, enabling infants to respond effectively to referential communication directed at them. In light of increasing evidence of dogs' social communicative skills [2], an intriguing question is whether dogs' responsiveness to human directional gestures [3] depends on the situational context in an infant-like manner. Using a method borrowed from infant studies [1], we showed dogs video presentations of a human actor turning toward one of two objects and recorded their eye-gaze patterns with an eye tracker. Dogs showed a higher tendency to follow the actor's gaze when the head turn was preceded by expressions of communicative intent (direct gaze, addressing). This is the first evidence that (1) eye-tracking techniques can be used to study dogs' social skills and (2) dogs' exploitation of human gaze cues depends on a communicatively relevant pattern of ostensive and referential signals. Our findings lend further support to the existence of a functionally infant-analogous social competence in this species.
There is growing evidence that dog-directed and infant-directed speech share acoustic characteristics such as high overall pitch, wide pitch range, and attention-getting devices. However, it remains unclear whether dog- and infant-directed speech have gender- or context-dependent acoustic features. In the present study, we collected comparable infant-, dog-, and adult-directed speech samples (IDS, DDS, and ADS) in four speech situations (Storytelling, Task solving, Teaching, and Fixed sentences); the samples were obtained from parents whose infants were younger than 30 months of age and who also had a pet dog at home. We found that ADS differed from IDS and DDS independently of the speakers' gender and the given situation. We also found a higher overall pitch in DDS than in IDS during free situations. Our results show that both parents hyperarticulate their vowels when talking to children but not when addressing dogs; this finding is consistent with the goal of hyperspeech in language tutoring. Mothers, however, exaggerate their vowels for infants under 18 months more than fathers do. Our findings suggest that IDS and DDS have context-dependent features and support the notion that people adapt their prosodic features to the acoustic preferences and emotional needs of their audience.
Robots offer new possibilities for investigating animal social behaviour. The method enhances the controllability and reproducibility of experimental techniques, and it also allows the experimental separation of the effects of bodily appearance (embodiment) and behaviour. In the present study we examined dogs' interactive behaviour in a problem-solving task (in which the dog had no access to the food) with three different social partners, two of which were robots and the third a human behaving in a robot-like manner. The Mechanical UMO (Unidentified Moving Object) and the Mechanical Human differed only in their embodiment but showed similar behaviour toward the dog. In contrast, the Social UMO was interactive, showed contingent responsiveness and goal-directed behaviour, and moved along varied routes. The dogs showed shorter looking and touching durations, but more gaze alternation, toward the Mechanical Human than toward the Mechanical UMO. This suggests that dogs' interactive behaviour may have been affected by previous experience with typical humans. We also found that dogs looked longer at the Social UMO and showed more gaze alternations between the food and the Social UMO than with the Mechanical UMO. These results suggest that dogs form expectations about an unfamiliar moving object within a short period of time and that they recognise some social aspects of a UMO's behaviour. This is the first evidence that the interactive behaviour of a robot is important for evoking dogs' social responsiveness.
The effects of emotionally valenced events on sleep physiology are well studied in humans and laboratory rodents. However, little is known about these effects in other species, despite the fact that several sleep characteristics differ across species and thus limit the generalizability of such findings. Here we studied the effect of positive and negative social experiences on sleep macrostructure in dogs, a species proven to be a good model of human social cognition. A non-invasive polysomnography method was used to collect data from pet dogs (n = 16) participating in 3-hour-long sleep occasions. Before sleep, dogs were exposed to emotionally positive or negative social interactions (PSI or NSI) in a within-subject design. PSI consisted of petting and ball play, while NSI was a mixture of separation, threatening approach, and a still-face test. Sleep macrostructure differed markedly between pre-treatment conditions, with a shorter sleep latency after NSI and a redistribution of the time spent in the different sleep stages. Dogs' behaviour during pre-treatment was related to the macrostructural difference between the two occasions and was further modulated by individual variability in personality. This result provides the first direct evidence that emotional stimuli affect subsequent sleep physiology in dogs.