2007
DOI: 10.1075/sll.10.2.06fen
Seeing sentence boundaries

Abstract: Linguists have suggested that non-manual and manual markers are used in sign languages to indicate prosodic and syntactic boundaries. However, little is known about how native signers interpret non-manual and manual cues with respect to sentence boundaries. Six native signers of British Sign Language (BSL) were asked to mark sentence boundaries in two narratives: one presented in BSL and one in Swedish Sign Language (SSL). For comparative analysis, non-signers undertook the same tasks. Results indicated that b…


Cited by 40 publications (37 citation statements)
References 22 publications
“…That study also aims to identify the visual information signers rely on to determine utterance boundaries online, on the basis of linguistic annotations of the visible cues in this additional data set. On a par with previous work on spoken languages, we hypothesize that in addition to lexical content and syntax, phonetic and prosodic markers such as signing speed or height (Wilbur, 2009; Russell et al., 2011), as well as visual intonation on the face, may play a role (Reilly et al., 1990; Nespor and Sandler, 1999; Fenlon et al., 2007; Dachkovsky and Sandler, 2009; Dachkovsky et al., 2013) in the online prediction of stroke-to-stroke turn boundaries. This paper has centered on question-answer sequences within a relatively limited data set.…”
Section: Discussion (mentioning)
confidence: 58%
“…For example, researchers are exploring whether speakers use gesture in spoken language prosody, called audio-visual prosody (Krahmer & Swerts 2005, 2007); these patterns can then be compared with the patterns found in sign language prosody to explore similarities and differences. As another example, researchers are exploring the ability of nonsigners (who have only their skills as gesturers to draw on) to understand patterns in sign language: sign language prosody (Fenlon et al. 2007, Brentari et al. 2011) and sign language structures that encode notions of telicity (Strickland et al. 2015). Researchers are also exploring whether hearing people who have experience using cospeech gesture will have a different “accent” when they sign, not because they learned to sign as a second language but simply because they have had extensive cospeech gesture experience.…”
Section: Surprising Results and Further Implications Of Work On La… (mentioning)
confidence: 99%
“…In fact, Fenlon et al. (2007) found that native signers and non-signers were able to identify sentence boundaries in a sign language with which they were unfamiliar. Using video clips of narratives in an unfamiliar sign language, Fenlon and colleagues asked participants to push a button when they thought they saw a sentence boundary.…”
Section: Discussion (mentioning)
confidence: 99%
“…We coded three manual features – holds, repetitions, and emphatic movements – each of which has been shown to play a role in sign language prosody (Nespor & Sandler 1999; Fenlon et al. 2007; Wilbur & Martínez 2002). As described earlier, dropping the hand and/or pausing (i.e., a relaxing of handshape without subsequently dropping the hands) were used to identify the ends of utterances in our coding system (Goldin-Meadow & Mylander 1984).…”
Section: 4 Coding Prosody In Homesign (mentioning)
confidence: 99%