2007 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '07)
DOI: 10.1109/icassp.2007.367183

Structures for Spoken Language Understanding: A Two-Step Approach

Abstract: Spoken language understanding (SLU) aims to map a user's speech into a semantic frame. Since most previous work uses semantic structures for SLU, we verify that such structure is valuable even for noisy input. We apply a structured prediction method to the SLU problem and compare it with an unstructured one. In addition, we present a combined method that embeds long-distance dependencies between entities in a cascaded manner. On air travel data, we show that our approach improves performance over baseline models.
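To make the mapping from an utterance to a semantic frame concrete, here is a minimal Python sketch (illustrative only; the slot names and the two-step decomposition are assumptions in the spirit of the cascaded approach described above, not the paper's exact schema): an air-travel utterance is first labelled with BIO slot tags, and the tagged entities are then assembled into a frame.

```python
# Hypothetical illustration of the SLU task: mapping an air-travel
# utterance onto a semantic frame in two steps. Slot names and the
# two-step split are illustrative assumptions, not the paper's schema.

utterance = "show me flights from denver to boston on tuesday"

# Step 1: slot filling as sequence labelling (BIO tags over word tokens).
tokens = utterance.split()
bio_tags = ["O", "O", "O", "O", "B-fromloc", "O", "B-toloc", "O", "B-depart_date"]

# Step 2: assemble the frame from the tagged spans, so that relations
# between the extracted entities are resolved over the whole sequence.
def assemble_frame(tokens, tags):
    frame = {"intent": "flight"}
    current_slot, current_words = None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current_slot:
                frame[current_slot] = " ".join(current_words)
            current_slot, current_words = tag[2:], [tok]
        elif tag.startswith("I-") and current_slot:
            current_words.append(tok)
        else:
            if current_slot:
                frame[current_slot] = " ".join(current_words)
            current_slot, current_words = None, []
    if current_slot:
        frame[current_slot] = " ".join(current_words)
    return frame

print(assemble_frame(tokens, bio_tags))
# {'intent': 'flight', 'fromloc': 'denver', 'toloc': 'boston', 'depart_date': 'tuesday'}
```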

Cited by 5 publications (2 citation statements)
References 6 publications

“…Standard approaches to solving the slot filling problem include generative models, such as HMM/CFG composite models [31], [5], [53], hidden vector state (HVS) model [33], and discriminative or conditional models such as conditional random fields (CRFs) [6], [7], [32], [34], [40], [51], [54] and support vector machines (SVMs) [52]. Despite many years of research, the slot filling task in SLU is still a challenging problem, and this has motivated the recent application of a number of very successful continuous-space, neural net, and deep learning approaches, e.g.…”
mentioning
confidence: 99%
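As a concrete aside, the CRF-based slot filling this statement refers to can be sketched with the third-party sklearn-crfsuite package and toy air-travel data (both are assumptions for illustration; the cited works use their own CRF implementations and much richer feature sets):

```python
# Minimal CRF slot-filling sketch (illustrative, not the cited systems).
import sklearn_crfsuite

def word_features(tokens, i):
    # Simple token-level features; real systems add n-grams, gazetteers, etc.
    return {
        "word": tokens[i].lower(),
        "prev_word": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next_word": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
    }

def featurize(tokens):
    return [word_features(tokens, i) for i in range(len(tokens))]

# Toy training data in BIO format (illustrative air-travel examples).
train_sents = [
    ("flights from denver to boston".split(),
     ["O", "O", "B-fromloc", "O", "B-toloc"]),
    ("show flights to dallas from atlanta".split(),
     ["O", "O", "O", "B-toloc", "O", "B-fromloc"]),
]
X_train = [featurize(toks) for toks, _ in train_sents]
y_train = [tags for _, tags in train_sents]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X_train, y_train)

test = "flights from seattle to chicago".split()
print(crf.predict([featurize(test)])[0])  # predicted BIO tags for the test utterance
```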
“…Generative approaches designed for the slot-filling task include those based on hidden Markov model and context-free grammar composite models [13,23,34]. Conditional models for slot filling based on conditional random fields (CRFs) include [6,7,12,27,36–38]. More recently, recurrent neural networks (RNNs) and convolutional neural networks (CNNs) have been applied to the slot-filling task; examples of such methods include [11,20,21,33,38,40,41,43].…”
Section: Related Work
mentioning
confidence: 99%
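For comparison, a minimal sketch of the RNN-style slot tagging mentioned in this statement, written with PyTorch (an assumption; the cited RNN/CNN systems differ in architecture, features, and training details):

```python
# Minimal bidirectional-LSTM slot tagger (illustrative, not a cited system).
import torch
import torch.nn as nn

class LSTMSlotTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):              # (batch, seq_len)
        h, _ = self.lstm(self.emb(token_ids))  # (batch, seq_len, 2 * hidden_dim)
        return self.out(h)                     # per-token tag scores

# Toy usage: score BIO tags for a single 5-token utterance of random word ids.
model = LSTMSlotTagger(vocab_size=1000, num_tags=5)
scores = model(torch.randint(0, 1000, (1, 5)))
pred_tags = scores.argmax(dim=-1)              # (1, 5) predicted tag indices
print(pred_tags.shape)
```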