2019
DOI: 10.1109/access.2019.2895647

A Session-Based Customer Preference Learning Method by Using the Gated Recurrent Units With Attention Function

Abstract: In this paper, we investigate an attention function combined with the gated recurrent units (GRUs), named GRUA, to raise the accuracy of the customer preference prediction. The attention function extracts the important product features by using the time-bias parameter and the term frequency-inverse document frequency parameter for recommending products to a customer in the ongoing session. We show that the attention function with the GRUs can learn the customer's intention in the ongoing session more precisely…
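To make the abstract's description more concrete, the sketch below shows one plausible way a GRU session encoder can be combined with an attention function whose scores are adjusted by a time-bias factor and a TF-IDF weight. The class name SessionGRUA, the tensor shapes, and the additive log-score combination are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SessionGRUA(nn.Module):
    """Minimal sketch of a GRU-with-attention session encoder.
    Hypothetical layout for illustration; not the paper's implementation."""

    def __init__(self, n_items, emb_dim=64, hid_dim=100):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, emb_dim)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.attn = nn.Linear(hid_dim, 1)       # learned relevance score per step
        self.out = nn.Linear(hid_dim, n_items)  # next-item preference scores

    def forward(self, item_ids, time_bias, tfidf_weight):
        # item_ids:     (batch, seq_len) indices of items clicked in the session
        # time_bias:    (batch, seq_len) positive recency factor (assumed given)
        # tfidf_weight: (batch, seq_len) positive TF-IDF weight of item features
        h, _ = self.gru(self.item_emb(item_ids))            # (batch, seq_len, hid)
        score = self.attn(h).squeeze(-1)                    # (batch, seq_len)
        # Assumed combination: add the log of the two external weights to the score.
        score = score + torch.log(time_bias) + torch.log(tfidf_weight)
        alpha = F.softmax(score, dim=-1)                    # attention weights
        session_vec = (alpha.unsqueeze(-1) * h).sum(dim=1)  # weighted session state
        return self.out(session_vec)                        # scores over all items
```

A recommendation for the ongoing session would then be the top-k items under the returned scores; the external time-bias and TF-IDF weights are assumed to be precomputed per click.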


Cited by 14 publications (9 citation statements); References 23 publications.

Citation statements, ordered by relevance:
“…Although a few studies explore workers' preferences for tasks in crowdsourcing [1], [2], [45], they only infer workers' preferences from past task-performing patterns or explicit feedback, without considering workers' temporal preferences for different tasks. In other research areas, e.g., personalized product recommendation, some work uses newer machine learning techniques to learn users' preferences [3]. For example, Quadrana et al. [30] introduce the use of Gated Recurrent Units (GRUs) with the collaborative filtering method for learning users' preferences, where the GRU is a newer generation of Recurrent Neural Network (RNN) regarded as a variant of the Long Short-Term Memory (LSTM) network [9].…”
Section: Related Work
confidence: 99%
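The statement above describes the GRU as a variant of the LSTM. As a small, self-contained illustration (the standard Cho et al. 2014 formulation, with biases omitted and not specific to the cited paper), one GRU step needs only an update gate and a reset gate, whereas the LSTM uses three gates plus a separate cell state:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU update (textbook formulation; biases omitted for brevity)."""
    z = sigmoid(Wz @ x + Uz @ h_prev)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde         # new hidden state
```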
“…GRUs can reduce the computational burden and perform as well as LSTMs [10]. [3] learns a user's preferences from his/her ongoing session (such as the user's behaviors, metadata of the products, etc.) by using Gated Recurrent Units (GRUs) with an attention function.…”
Section: Related Work
confidence: 99%
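The claim that GRUs reduce the computational burden relative to LSTMs can be checked roughly by parameter count: for the same input and hidden sizes (chosen arbitrarily here, purely for illustration), a GRU layer has three gate blocks where an LSTM has four, so about three quarters of the parameters.

```python
import torch.nn as nn

# Sizes are arbitrary, for illustration only.
gru = nn.GRU(input_size=64, hidden_size=100)
lstm = nn.LSTM(input_size=64, hidden_size=100)
print(sum(p.numel() for p in gru.parameters()))   # 49800 = 3 * (64 + 100 + 2) * 100
print(sum(p.numel() for p in lstm.parameters()))  # 66400 = 4 * (64 + 100 + 2) * 100
```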
“…proposed two different algorithms to improve the recommendation performance [21]. Besides that, Chen et al. examined an attention function merged with gated recurrent units to increase the accuracy of the customer preference prediction [22]. According to the research work in [23], the main limitation of the content recommender system occurs in the early stage and is mainly related to insufficient data.…”
Section: Related Work
confidence: 99%
“…RNNs remain popular modeling tools for session data in newer works with a number of modifications and improvements made to the original RNN architecture, e.g. using more suitable objective functions [17], applying data augmentation techniques [34], incorporating content features [15], [26], or adding attention mechanisms to capture the user's purpose [7], [21]. Other deep learning architectures have also been explored to model session sequences.…”
Section: B. Session-based Recommendations
confidence: 99%