2018
DOI: 10.1016/j.procs.2018.10.341
Deep Bi-Directional LSTM Network for Query Intent Detection

Cited by 25 publications (6 citation statements)
References 7 publications
“…− Tracking of the volume of demand: With such a solution in place, analysts could see which keyword categories cause issues and then zoom in on more specific areas. In general, analysts would have granular control over the volume and demand of searches across the organization's domains [16]. − Keyword-level information: With query categorization in place, keyword-level information could be managed efficiently [17].…”
Section: Need For Query Categorisationmentioning
confidence: 99%
“…Intent detection methods based on DL techniques can be classified into methods using convolutional neural networks [10], recurrent neural networks [11] and their variants (LSTMs and GRUs) [4,7], the Bidirectional Long Short-Term Memory (BLSTM) self-attention model [12], the capsule network model [13], the method of joint recognition [14], the use of distances to measure text similarities (such as TF-IDF) [15], or methods combining several DL models [16,17]. Nonetheless, the appearance of BERT (Bidirectional Encoder Representations from Transformers) models greatly contributed to enhancing natural language processing [18].…”
Section: Literature Reviewmentioning
confidence: 99%
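The TF-IDF distance-based approach mentioned in the statement above can be sketched with the Python standard library alone. The tokenisation, corpus, and function names below are illustrative assumptions, not taken from the cited work:

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Compute sparse TF-IDF weight vectors for a list of tokenised documents."""
    n = len(docs)
    # Document frequency: number of documents containing each term.
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        # Term frequency scaled by inverse document frequency.
        vec = {term: (count / len(doc)) * math.log(n / df[term])
               for term, count in tf.items()}
        vectors.append(vec)
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

queries = ["book a flight to paris", "reserve a flight ticket", "play some music"]
vecs = tf_idf_vectors([q.split() for q in queries])
# Queries sharing intent-bearing terms ("flight") score higher than unrelated ones.
```

A query would then be assigned the intent of its nearest labelled neighbour under this similarity, which is the basic idea behind distance-based intent detection.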
“…This is one of the main tasks for the NLU module in order to detect the dialogue acts in the user utterance and provide this information to the dialogue manager. A range of statistical techniques have been proposed to complete this task, from traditional techniques, such as Naive Bayes (McCallum & Nigam, 1998) or Support Vector Machines (Haffner, Tur, & Wright, 2003), to mainstream methods, such as Convolutional Neural Networks (Zhang, Song, Liu, Du, & Zhao, 2017), word embeddings (Liu & Lane, 2016), recurrent neural networks (Firdaus, Kumar, Ekbal, & Bhattacharyya, 2019), long short-term memory networks (Sreelakshmi, Rafeeque, Sreetha, & Gayathri, 2018) or gated recurrent units (Mauajama Firdaus, B, & Golchha, 2020).…”
Section: State Of the Art: User‐adapted Conversational Interfacesmentioning
confidence: 99%
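As a minimal sketch of the traditional statistical techniques listed in the statement above, here is a multinomial Naive Bayes intent classifier with Laplace smoothing. The class name, toy utterances, and intent labels are hypothetical, not from the cited papers:

```python
import math
from collections import Counter, defaultdict

class NaiveBayesIntentClassifier:
    """Multinomial Naive Bayes with Laplace smoothing over bag-of-words features."""

    def fit(self, utterances, intents):
        self.class_counts = Counter(intents)
        self.word_counts = defaultdict(Counter)  # intent -> term counts
        for text, intent in zip(utterances, intents):
            self.word_counts[intent].update(text.lower().split())
        self.vocab = {w for counts in self.word_counts.values() for w in counts}
        return self

    def predict(self, text):
        tokens = text.lower().split()
        n = sum(self.class_counts.values())
        best, best_lp = None, -math.inf
        for intent, count in self.class_counts.items():
            lp = math.log(count / n)  # log prior P(intent)
            total = sum(self.word_counts[intent].values())
            for tok in tokens:
                # Laplace-smoothed log likelihood P(token | intent).
                lp += math.log((self.word_counts[intent][tok] + 1)
                               / (total + len(self.vocab)))
            if lp > best_lp:
                best, best_lp = intent, lp
        return best

clf = NaiveBayesIntentClassifier().fit(
    ["book a flight", "reserve a flight ticket", "play some jazz", "play music"],
    ["travel", "travel", "music", "music"])
```

Such a classifier is the kind of lightweight baseline that the neural approaches surveyed above (CNNs, LSTMs, GRUs) are typically compared against.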