2010
DOI: 10.3758/app.72.8.2236
Redundant spoken labels facilitate perception of multiple items

Cited by 35 publications
(55 citation statements)
References 39 publications
“…This is due to our higher order conceptual representations (in this case color category labels) shaping the output of color perception. The categorical perception of color is just one example of this phenomenon, as categorical perception has been observed for various other natural stimuli, including faces and trained novel objects (Goldstone, Steyvers, & Rogosky, 2003; Goldstone, 1994; Goldstone, Lippa, & Shiffrin, 2001; Levin & Beale, 2000; Livingston, Andrews, & Harnad, 1998; Lupyan, Thompson-Schill, & Swingley, 2010; Newell & Bulthoff, 2002; Sigala, Gabbiani, & Logothetis, 2002). …”
Section: Theoretical Foundations: Categorical Perception
confidence: 90%
“…Indeed, recent findings demonstrate that even non-informative, redundant labels can influence visual processing in striking ways (Lupyan & Spivey, 2010; Lupyan & Thompson-Schill, 2012). Lupyan & Thompson-Schill (2012) found that performance on an orientation discrimination task was facilitated when the image was preceded by the auditory presentation of a verbal label, but not by a sound that was equally associated with the object.…”
Section: Theoretical Foundations: Categorical Perception
confidence: 99%
“…The uncertainty inherent in language promotes the formation-in both developmental time and in the moment-of representations that represent category diagnostic information and abstract over idiosyncratic information. The consequences of these more categorical representations are substantial, spanning basic perceptual tasks (Lupyan, 2008; Lupyan & Spivey, 2010a, 2010b; Lupyan & Ward, 2013; Lupyan & Spivey, 2008) and higher level reasoning (Lupyan, 2015).…”
Section: Knowledge Through Language Versus Knowledge Through Perception
confidence: 99%
“…Although many cognitive scientists naturally assume that language and vision are independent, modular systems each with their own representational primitives and operations, most also acknowledge that the systems interact, supporting the uniquely human capacity to talk about what we see (Jackendoff, 1987; MacNamara, 1978). The broader question of how language interacts with vision and other cognitive systems has, for many decades, motivated scientists to ask about the role of language in influencing attention (Egeth & Smith, 1967; Gleitman et al, 2007; Lupyan & Spivey, 2010; Lupyan, 2008; Papafragou, Hulbert, & Trueswell, 2008; Smith, Jones, & Landau, 1996; Spivey, Tyler, Eberhard, & Tanenhaus, 2001), in sustaining conceptual categories and creating new ones (Roberson & Davidoff, 2000; Roberson, Davies, & Davidoff, 2000; Yoshida & Smith, 2003), and most radically, in creating new systems of representation (Carey, 2009; Hermer-Vazquez, Spelke, & Katsnelson, 1999). Implicit in all of these studies is the drive to understand the mechanisms by which language interacts with non-linguistic representations, and what the end result of the interactions is.…”
confidence: 99%