According to a theoretical tradition dating back to Aristotle, verbs can be classified into two broad categories. Telic verbs (e.g., “decide,” “sell,” “die”) encode a logical endpoint, whereas atelic verbs (e.g., “think,” “negotiate,” “run”) do not, and the denoted event could therefore logically continue indefinitely. Here we show that sign languages encode telicity in a seemingly universal way and moreover that even nonsigners lacking any prior experience with sign language understand these encodings. In experiments 1–5, nonsigning English speakers accurately distinguished between telic (e.g., “decide”) and atelic (e.g., “think”) signs from (the historically unrelated) Italian Sign Language, Sign Language of the Netherlands, and Turkish Sign Language. These results were not due to participants' inferring that the sign merely imitated the action in question. In experiment 6, we used pseudosigns to show that the presence of a salient visual boundary at the end of a gesture was sufficient to elicit telic interpretations, whereas repeated movement without salient boundaries elicited atelic interpretations. Experiments 7–10 confirmed that these visual cues were used by all of the sign languages studied here. Together, these results suggest that signers and nonsigners share universally accessible notions of telicity as well as universally accessible “mapping biases” between telicity and visual form.
The occurrence of WH-items at the right edge of the sentence, while extremely rare in spoken languages, is quite common in sign languages. In particular, in sign languages like LIS (Italian Sign Language), WH-items cannot be positioned at the left edge. We argue that existing accounts of right-peripheral occurrences of WH-items are empirically inadequate and provide no clue as to why sign languages and spoken languages differ in this respect. We suggest that the occurrence of WH-items at the right edge of the sentence in sign languages be taken at face value: in these languages, WH-phrases undergo rightward movement. Based on data from LIS, we argue that this is due to the fact that WH-NONMANUAL MARKING (NMM) marks the dependency between an interrogative complementizer and the position that the WH-phrase occupies before it moves. The hypothesis that NMM can play this role also accounts for the spreading of negative NMM with LIS negative quantifiers. We discuss how our analysis can be extended to ASL (American Sign Language) and IPSL (Indo-Pakistani Sign Language). Our account is spelled out in the principles-and-parameters framework. In the last part of the article, we relate our proposal to recent work on prosody in spoken languages showing that WH-dependencies can be prosodically marked in spoken languages. Overt movement and prosodic marking of the WH-dependency do not normally co-occur in spoken languages, while they are possible in sign languages. We propose that this is due to the fact that sign languages, unlike spoken languages, are multidimensional.
Confronted with the loss of one type of sensory input, we compensate using information conveyed by other senses. However, losing one type of sensory information at specific developmental times may lead to deficits across all sensory modalities. We addressed the effect of auditory deprivation on the development of tactile abilities, taking into account changes occurring at the behavioral and cortical level. Congenitally deaf and hearing individuals performed two tactile tasks, the first requiring the discrimination of the temporal duration of touches and the second requiring the discrimination of their spatial length. Compared with hearing individuals, deaf individuals were impaired only in tactile temporal processing. To explore the neural substrate of this difference, we ran a TMS experiment. In deaf individuals, the auditory association cortex was involved in both temporal and spatial tactile processing, with the same chronometry as the primary somatosensory cortex. In hearing participants, the involvement of the auditory association cortex occurred at a later stage and selectively for temporal discrimination. The different chronometry in the recruitment of the auditory cortex in deaf individuals correlated with the tactile temporal impairment. Thus, early hearing experience seems to be crucial for developing efficient temporal processing across modalities, suggesting that plasticity does not necessarily result in behavioral compensation.