2017
DOI: 10.1109/tkde.2017.2700392
Hierarchical Contextual Attention Recurrent Neural Network for Map Query Suggestion

Cited by 23 publications (8 citation statements)
References 26 publications
“…Attention mechanisms [41], [42] are inspired by human visual attention: people tend to be drawn to the more important parts of a target object. Attention is widely used in many fields, including object detection [43], [44], prediction [45], query suggestion [46], and recommendation [4]. In brief, attention can increase the interpretability and adaptivity of complex models such as neural networks by automatically computing weights for different data/information.…”
Section: Attention Mechanism (mentioning)
confidence: 99%
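The automatic weighting the excerpt describes can be illustrated with a minimal NumPy sketch of dot-product attention. The function names, the query/state dimensions, and the random inputs are all illustrative assumptions, not taken from the cited papers: each hidden state is scored against a query vector, the scores are normalized with a softmax into weights, and the states are combined by those weights.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dot_product_attention(query, states):
    """Illustrative attention: weight hidden states by relevance to a query.

    query:  (d,)   query/context vector
    states: (T, d) hidden states, one per time step
    Returns the attention weights (T,) and the weighted summary (d,).
    """
    scores = states @ query    # one relevance score per time step
    weights = softmax(scores)  # normalized importance weights (sum to 1)
    summary = weights @ states # convex combination of the states
    return weights, summary

rng = np.random.default_rng(0)
states = rng.normal(size=(5, 8))  # 5 time steps, hidden size 8
query = rng.normal(size=8)
weights, summary = dot_product_attention(query, states)
```

Because the weights form a probability distribution over the time steps, they can be inspected directly, which is the interpretability benefit the excerpt mentions.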
“…We use an attentive Bidirectional Gated Recurrent Unit (Bi-GRU) [22] to learn the hidden representation. The Gated Recurrent Unit (GRU) [22] is a Recurrent Neural Network-based model that uses two gates, z_t and r_t, to control how information is written to the state, where t denotes time or, more specifically, the position of a word in our context. A Bi-GRU reads a sentence in both directions for better word context.…”
Section: B Feature Representation Learning (mentioning)
confidence: 99%
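The gating described in the excerpt can be sketched in NumPy. This is a generic GRU formulation, not the cited paper's exact model; the class names, small random weights, and toy sequence are assumptions for illustration. The update gate z_t interpolates between the old state and a candidate state, the reset gate r_t controls how much of the previous state feeds the candidate, and the Bi-GRU concatenates a left-to-right and a right-to-left pass so each word's state sees context from both sides.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """One GRU step: an update gate z_t and a reset gate r_t gate the state."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        def mat(r, c):
            return rng.normal(scale=0.1, size=(r, c))
        # Input weights (W), recurrent weights (U), biases (b) for the
        # update gate (z), reset gate (r), and candidate state (h).
        self.Wz, self.Uz, self.bz = mat(input_dim, hidden_dim), mat(hidden_dim, hidden_dim), np.zeros(hidden_dim)
        self.Wr, self.Ur, self.br = mat(input_dim, hidden_dim), mat(hidden_dim, hidden_dim), np.zeros(hidden_dim)
        self.Wh, self.Uh, self.bh = mat(input_dim, hidden_dim), mat(hidden_dim, hidden_dim), np.zeros(hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x_t, h_prev):
        z = sigmoid(x_t @ self.Wz + h_prev @ self.Uz + self.bz)  # update gate
        r = sigmoid(x_t @ self.Wr + h_prev @ self.Ur + self.br)  # reset gate
        h_tilde = np.tanh(x_t @ self.Wh + (r * h_prev) @ self.Uh + self.bh)
        # Interpolate between the previous state and the candidate state.
        return (1.0 - z) * h_prev + z * h_tilde

def bi_gru(xs, fwd, bwd):
    """Run one GRU left-to-right and another right-to-left over the word
    sequence, then concatenate the per-step states from both directions."""
    h_f, h_b = np.zeros(fwd.hidden_dim), np.zeros(bwd.hidden_dim)
    forward, backward = [], []
    for x in xs:            # left-to-right pass
        h_f = fwd.step(x, h_f)
        forward.append(h_f)
    for x in reversed(xs):  # right-to-left pass
        h_b = bwd.step(x, h_b)
        backward.append(h_b)
    backward.reverse()
    return [np.concatenate([f, b]) for f, b in zip(forward, backward)]

rng = np.random.default_rng(3)
xs = [rng.normal(size=4) for _ in range(6)]  # toy sentence: 6 words, dim 4
states = bi_gru(xs, GRUCell(4, 8, seed=1), GRUCell(4, 8, seed=2))
```

Note that sign conventions vary: some papers write h_t = z_t ⊙ h_{t-1} + (1 - z_t) ⊙ h̃_t; the two forms differ only in which direction z_t is read.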
“…We utilize a recurrent neural network (RNN) for sequential user modeling, as illustrated in Figure 3. RNNs for sequential modeling and time-series prediction have been widely applied in information retrieval systems [19, 26, 33]. Note that our methodology targets the final conversion estimate rather than a sequential click prediction at each touch point.…”
Section: Sequential Modeling (mentioning)
confidence: 99%
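The distinction the excerpt draws (one final conversion estimate rather than per-step click predictions) can be sketched with a plain RNN. This is a hypothetical simplification, not the cited paper's model: the touch-point features, weight shapes, and random initialization are all assumptions. The sequence is folded into a single hidden state, and only that final state is mapped to a conversion probability.

```python
import numpy as np

def conversion_probability(touch_points, Wx, Wh, w_out, b_out):
    """Encode a user's touch-point sequence with a vanilla RNN and map the
    FINAL hidden state to one conversion probability (no per-click outputs)."""
    h = np.zeros(Wh.shape[0])
    for x in touch_points:
        h = np.tanh(x @ Wx + h @ Wh)     # sequential state update per touch point
    logit = h @ w_out + b_out            # read out only the last state
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid -> probability in (0, 1)

rng = np.random.default_rng(0)
d, hdim = 5, 8
seq = [rng.normal(size=d) for _ in range(4)]  # 4 touch points, feature dim 5
p = conversion_probability(
    seq,
    rng.normal(scale=0.3, size=(d, hdim)),
    rng.normal(scale=0.3, size=(hdim, hdim)),
    rng.normal(scale=0.3, size=hdim),
    0.0,
)
```

A per-click model would instead emit one sigmoid output at every loop iteration; collapsing to the final state is what makes this a conversion estimator.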
“…These methods all rest on the assumption that a user conversion is driven by an individual advertising touch point of positive influence [24, 37], which may not be realistic for the user journey (conversion funnel). In fact, sequential patterns within user browsing behavior are of great value for response prediction and decision making in many fields, such as recommender systems [23], information retrieval [26], and search advertising [36].…”
Section: Introduction (mentioning)
confidence: 99%