Proceedings of the 11th Conference on Computational Linguistics - 1986
DOI: 10.3115/991365.991450
Machine learning of morphological rules by generalization and analogy

Cited by 6 publications (4 citation statements)
References 2 publications
“…The properties (2), (3), and (4) are the same as in Kaplan et al [5], but XMAS has in addition the properties (5)-(9). XMAS shares the properties (3) and (7) with Wothke [15], but is…”
Section: Related Work and Concluding Remarks
confidence: 92%
“…For example, the input to a past tense learner could be word pairs such as (in orthography) (walk, walked), (beat, beat), (eat, ate), etc. "Past tense" models of learning and processing have been a topic of strong interest in cognitive science (Rumelhart and McClelland 1986; Pinker and Prince 1988; Pinker 1999; Pinker and Ullmann 2002; McClelland and Patterson 2002), and many learning systems have been developed for this problem (Golding and Thompson 1985; Rumelhart and McClelland 1986; Wothke 1986; Ling 1994; Daelemans et al 1996; Mooney and Califf 1996; Molnar 2001; Clark 2001, 2002; Albright and Hayes 2002).…”
Section: Supervised Inflection Learning Models
confidence: 99%
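The supervised setting described in the citation above — learning to map lemmas to past-tense forms from word pairs — can be illustrated with a minimal sketch. This is not the algorithm of Wothke (1986) or of any system cited here; it is a toy suffix-rewrite learner of my own construction (all function names are hypothetical), shown only to make the input/output format of such learners concrete.

```python
def learn_rule(lemma, past):
    """Derive a suffix-rewrite rule from one (lemma, past) training pair.

    The rule keeps one character of left context from the lemma, so
    (walk, walked) yields ("k", "ked") and (bake, baked) yields ("e", "ed").
    Irregular pairs like (eat, ate) fall back to whole-word rules.
    """
    i = 0
    while i < min(len(lemma), len(past)) and lemma[i] == past[i]:
        i += 1
    start = max(i - 1, 0)
    return lemma[start:], past[start:]

def learn_rules(pairs):
    """Collect rules from all training pairs; first rule per context wins."""
    rules = {}
    for lemma, past in pairs:
        old, new = learn_rule(lemma, past)
        rules.setdefault(old, new)
    return rules

def apply_rules(word, rules):
    """Inflect a (possibly novel) word by the most specific matching rule,
    i.e. the longest stored suffix that the word ends with."""
    for old in sorted(rules, key=len, reverse=True):
        if word.endswith(old):
            return word[: len(word) - len(old)] + rules[old]
    return word  # no rule applies: leave the word unchanged

# Usage: train on a few pairs, then generalize to an unseen verb.
rules = learn_rules([("walk", "walked"), ("bake", "baked"), ("eat", "ate")])
print(apply_rules("talk", rules))  # -> talked (by analogy with walk)
print(apply_rules("eat", rules))   # -> ate (memorized whole-word rule)
```

Real systems in this tradition differ mainly in how much context a rule keeps and how conflicts between rules are resolved; this sketch simply prefers the longest matching suffix.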
“…A very different, more practically oriented, motivation for ULM came in the 1980s, beginning with the supervised morphology learning ideas by Wothke (1985, 1986) and Klenk (1985a, 1985b), which later led to partly unsupervised methods (see the following). Because full natural language lexica, at the time, were too big to fit in working memory, these authors were looking for a way to analyze or stem running words in a "nichtlexikalisches" (non-lexical) manner, that is, without the storage and use of a large lexicon.…”
Section: History and Motivation of ULM
confidence: 99%