2004
DOI: 10.1080/13506280344000248
Eye movements during speech planning: Talking about present and remembered objects

Cited by 15 publications (14 citation statements). References 26 publications.
“…Participants have completed a CLU, and presumably they look around the picture for the next situation to describe and simultaneously plan the next utterance (Meyer et al., 2004). This is supported by a number of studies concerning behavioural correlates of the precuneus (see Cavanna and Trimble, 2006, for a recent review).…”
Section: Conceptual Planning
Confidence: 98%
“…Monitoring eye movements has become an invaluable method for psychologists who are studying many aspects of cognitive processing, including reading, language processing, language production, memory, and visual attention (Cherubini, Nüssli, & Dillenbourg, 2008; Duchowski, 2003; Griffin, 2004; Griffin & Oppenheimer, 2006; Meyer & Dobel, 2003; Meyer, van der Meulen, & Brooks, 2004; Rayner, 1998; Spivey & Geng, 2001; Trueswell & Tanenhaus, 2005; van Gompel, Fischer, Murray, & Hill, 2007). Although recent technological advances have made eyetracking hardware increasingly robust and suitable for more active scenarios (Land, 2006, 2007), current software can register gaze only in terms of predefined, static regions of the screen.…”
Section: P S
Confidence: 99%
“…It is not entirely clear how far looking must precede speech to influence its contents. Griffin (2004), Bock, Irwin, and Davidson (2004), and Meyer, van der Meulen, and Brooks (2004) cite latencies ranging from 300 to 1400 ms between initial gaze at a stimulus and naming or describing it. Latencies between a verbal contribution to dialogue and the next verbal turn averaged 418 ms in a co-present version of the map task (Bull, 1998).…”
Section: Discussion
Confidence: 97%