2014
DOI: 10.1068/p7593

Timing of Speech and Display Affects the Linguistic Mediation of Visual Search

Abstract: Recent studies have shown that, instead of a dichotomy between parallel and serial search strategies, in many instances a combination of both search strategies is used. Consequently, computational models and theoretical accounts of visual search processing have evolved from traditional serial-parallel descriptions to a continuum from 'efficient' to 'inefficient' search. One of the findings consistent with this blurring of the serial-parallel distinction is that concurrent spoken linguistic input inf…

Cited by 4 publications (3 citation statements) · References 36 publications
“…Language and vision are highly interactive—language input influences visual processing (e.g., Chiu & Spivey, 2014; Spivey & Marian, 1999; Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995), and visual input can influence language activation (e.g., Bles & Jansma, 2008; Görges, Oppermann, Jescheniak, & Schriefers, 2013; Meyer, Belke, Telling, & Humphreys, 2007; Morsella & Miozzo, 2002). Here, we show that linguistic information is automatically activated during picture processing, even when no language input is present, which carries implications for how speakers of different languages process the same visual scene.…”
mentioning
confidence: 99%
“…As predicted, the search process was significantly more efficient when the spoken target query was delivered concurrently with the display being visible than when the query was delivered entirely before the display was made visible. (Not surprisingly, this effect is reduced with fast speech; B. S. Gibson, Eberhard, & Bryant, 2005.) An interactive network simulation that had already demonstrated a continuum of efficient to inefficient visual search effects (Spivey & Dale, 2004) was enhanced to include this linguistic query delivery and accurately simulated the results for triple conjunctions, fast speech, halted speech, and delayed speech (Chiu & Spivey, 2014; Reali, Spivey, Tyler, & Terranova, 2006).…”
Section: Language Comprehension Influences Visual Perception
mentioning
confidence: 99%
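The simulations cited above (Spivey & Dale, 2004; Chiu & Spivey, 2014) are only summarized in the statement, so the following is a minimal, hypothetical Python sketch of the general mechanism it describes: a normalized salience map over display items that is biased word by word as a spoken conjunction query ("red ... vertical") arrives. The item layout, parameter values, and the `integrate` helper are illustrative assumptions, not the published model.

```python
# Minimal illustrative sketch (assumed parameters; not the published model) of
# linguistically mediated visual search: a salience map over display items is
# incrementally biased as each spoken query word is heard.
import numpy as np

n_items = 20
red = np.zeros(n_items)
vertical = np.zeros(n_items)
red[:10] = 1.0            # items 0-9 are red
vertical[10:] = 1.0       # items 10-19 are vertical
vertical[3] = 1.0         # item 3 is red AND vertical: the conjunction target
features = {"red": red, "vertical": vertical}

def integrate(act, cue, steps=5, rate=0.3):
    """Settle the salience map toward items that match the current spoken cue."""
    for _ in range(steps):
        support = cue / (cue.sum() + 1e-9)          # normalized feature support per item
        act = (1 - rate) * act + rate * act * (1 + support)
        act /= act.sum()                            # keep the map normalized after each step
    return act

activation = np.full(n_items, 1.0 / n_items)        # uniform salience before any speech

# Concurrent delivery: each word biases the visual competition as soon as it arrives,
# so partial linguistic input already guides search before the query is complete.
for word in ["red", "vertical"]:
    activation = integrate(activation, features[word])

print("winning item:", int(activation.argmax()))    # expected: 3, the conjunction target
```

In this sketch, the concurrent-speech advantage reported in the citing statement would correspond to letting selection act on the partially settled map after the first word, rather than only after the full query has been delivered; this is a loose analogue of how the cited simulations capture timing effects, not a reproduction of them.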
“…Stroop color and word test scores may be affected by impaired speech motor function and declining cognitive flexibility (Ktaiche et al., 2022). Multiple regression analysis of Stroop word and Stroop color data suggested that Stroop word scores were predicted by the speed of visual search, and that Stroop color scores were predicted by the speed of visual search and working memory (Periáñez et al., 2021); both scores thus depend on the speed of visual search, and visual search is closely related to language comprehension (Huettig et al., 2011; Chiu and Spivey, 2014). According to our study, GMV decreases in the right MTG cluster in the PHC and CHTN-PE groups may lead to declines in language and visual comprehension, which could increase Stroop color and Stroop word scores.…”
Section: Figure
mentioning
confidence: 99%