2003
DOI: 10.1613/jair.1063
Acquiring Word-Meaning Mappings for Natural Language Interfaces

Abstract: This paper focuses on a system, Wolfie (WOrd Learning From Interpreted Examples), that acquires a semantic lexicon from a corpus of sentences paired with semantic representations. The lexicon learned consists of phrases paired with meaning representations. Wolfie is part of an integrated system that learns to transform sentences into representations such as logical database queries. Experimental results are presented demonstrating Wolfie's ability to learn useful lexicons for a database interface in four differ…
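As a rough illustration of the kind of task the abstract describes — pairing words with meaning tokens given sentence/representation pairs — here is a minimal cross-situational sketch. The toy corpus, the Jaccard scoring, and the greedy best-first claiming are all illustrative assumptions of this sketch, not Wolfie's actual candidate-generation or evaluation heuristics.

```python
from collections import Counter

# Toy corpus of sentences paired with sets of meaning tokens -- an
# assumed, drastically simplified stand-in for Wolfie's training pairs.
CORPUS = [
    ("what is the capital of texas".split(), {"capital", "texas"}),
    ("what is the capital of ohio".split(), {"capital", "ohio"}),
    ("how large is texas".split(), {"size", "texas"}),
    ("what is the population of ohio".split(), {"population", "ohio"}),
]

def learn_lexicon(corpus):
    """Greedy cross-situational learner: score each (word, meaning)
    pair by Jaccard overlap of the examples they occur in, then claim
    pairs best-first, one meaning per word and one word per meaning."""
    pair_n, word_n, mean_n = Counter(), Counter(), Counter()
    for words, meanings in corpus:
        for w in set(words):
            word_n[w] += 1
            for m in meanings:
                pair_n[(w, m)] += 1
        for m in meanings:
            mean_n[m] += 1
    # Sort by descending Jaccard score; break ties alphabetically so
    # the result is deterministic.
    scored = sorted(
        pair_n.items(),
        key=lambda it: (
            -it[1] / (word_n[it[0][0]] + mean_n[it[0][1]] - it[1]),
            it[0],
        ),
    )
    lexicon, taken = {}, set()
    for (w, m), _ in scored:
        if w not in lexicon and m not in taken:
            lexicon[w] = m
            taken.add(m)
    return lexicon

lex = learn_lexicon(CORPUS)
```

On this corpus the content words come out right (capital, texas, ohio, population each claim their own token), while the ambiguous how/large tie for "size" is broken arbitrarily (alphabetically here) and function words like "is" and "the" are left unmapped — a small demonstration of the cross-situational ambiguity a real lexicon learner must handle.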

Cited by 35 publications (31 citation statements). References 57 publications.
“…Reinforcement learning can be used to learn to read instructions and perform actions in an external world (Branavan et al., 2009; Branavan et al., 2010; Vogel and Jurafsky, 2010). Other approaches have relied on access to more costly annotated logical forms (Zelle and Mooney, 1996; Thompson and Mooney, 2003; Wong and Mooney, 2006; Zettlemoyer and Collins, 2005; Kwiatkowski et al., 2010). These techniques have been generalized more recently to learn from sentences paired with indirect feedback from a controlled application.…”
Section: Situated Semantic Interpretation
confidence: 99%
“…There are a number of existing models of word learning (Cottrell & Plunkett, 1994; Elman et al., 1996; Farkas & Li, 2001; Gasser & Smith, 1998; Gupta & MacWhinney, 1997; Li & Farkas, 2002; MacWhinney, 1987; Merriman, 1999; Miikkulainen, 1997; Niyogi, 2002; Plaut, 1999; Plunkett et al., 1992; Regier, 1996; Roy & Pentland, 2002; Schafer & Mareschal, 2001; Siskind, 1992, 1996; Tenenbaum & Xu, 2000; Thompson & Mooney, 2003; Yu, Ballard, & Aslin, 2003). Several of these models account for one or more of the phenomena under consideration, for example, fast mapping (Gupta & MacWhinney, 1997; Niyogi, 2002; Siskind, 1996; Tenenbaum & Xu, 2000; Thompson & Mooney, 2003), changes in sensitivity to phonological cues (Schafer & Mareschal, 2001), the difficulty of learning second labels (Cottrell & Plunkett, 1994; MacWhinney, 1987; Merriman, 1999; Siskind, 1996; Thompson & Mooney, 2003), and the shape bias (Merriman, 1999).…”
Section: Overview
confidence: 99%
“…Several of these models account for one or more of the phenomena under consideration, for example, fast mapping (Gupta & MacWhinney, 1997; Niyogi, 2002; Siskind, 1996; Tenenbaum & Xu, 2000; Thompson & Mooney, 2003), changes in sensitivity to phonological cues (Schafer & Mareschal, 2001), the difficulty of learning second labels (Cottrell & Plunkett, 1994; MacWhinney, 1987; Merriman, 1999; Siskind, 1996; Thompson & Mooney, 2003), and the shape bias (Merriman, 1999). However, the LEX model, which builds on this earlier work in word learning, and related work in categorization, is to my knowledge the first computational model that provides a unified account of all four of the developmental phenomena outlined previously.…”
Section: Overview
confidence: hi
“…Many supervised learning frameworks have been applied to the task of learning a semantic parser, including inductive logic programming (Zelle and Mooney, 1996; Thompson and Mooney, 1999; Thompson and Mooney, 2003), support vector machine-based kernel approaches (Kate et al., 2005; Kate and Mooney, 2006; Kate and Mooney, 2007), machine translation-style synchronous grammars (Wong and Mooney, 2007), and context-free grammar-based approaches like probabilistic Combinatory Categorial Grammar (Zettlemoyer and Collins, 2005; Zettlemoyer and Collins, 2007; Zettlemoyer and Collins, 2009; Kwiatkowski et al., 2010; Kwiatkowski et al., 2011; Lu et al., 2008) and discriminative reranking (Ge and Mooney, 2006; Ge and Mooney, 2009). These approaches have yielded steady improvements on standard test sets like GeoQuery.…”
Section: Previous Work
confidence: 99%
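For context on the task these frameworks tackle: a semantic parser maps a natural-language question to an executable meaning representation, e.g. a logical query over a geography database in the GeoQuery setting. A deliberately tiny pattern-matching sketch — the patterns and the Prolog-style query syntax are illustrative assumptions of this sketch, far simpler than any of the learned parsers cited above:

```python
import re

# Toy "semantic parser": map a question to a Prolog-style logical
# query. Both the patterns and the query syntax are illustrative.
PATTERNS = [
    (re.compile(r"what is the capital of (\w+)"),
     lambda m: f"answer(C, capital({m.group(1)}, C))"),
    (re.compile(r"how large is (\w+)"),
     lambda m: f"answer(S, size({m.group(1)}, S))"),
]

def parse(sentence):
    """Return a logical query for the sentence, or None if no
    pattern covers it (the failure mode learned parsers avoid)."""
    s = sentence.lower().strip("?")
    for pat, build in PATTERNS:
        m = pat.fullmatch(s)
        if m:
            return build(m)
    return None

query = parse("What is the capital of Texas?")
# query == "answer(C, capital(texas, C))"
```

The learned systems above replace such hand-written patterns with grammars or classifiers induced from (sentence, query) pairs, which is why they generalize where a fixed pattern list cannot.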