2019
DOI: 10.1109/access.2019.2906659
Unsupervised Learning of Paragraph Embeddings for Context-Aware Recommendation

Abstract: The sparsity of data is one of the main reasons restricting the performance of recommender systems. To alleviate the sparsity problem, some recommender systems use auxiliary information, especially text, as a supplement to increase the prediction accuracy of ratings. However, the two mainstream approaches based on text analysis have limitations. One of them, the bag-of-words model, has difficulty exploiting the contextual information of a paragraph effectively, so that only the …

Cited by 13 publications (7 citation statements)
References 26 publications (33 reference statements)
“…After that, the attention mechanism also began to be applied to machine translation. [12] was the first to apply the attention mechanism in this field, using it to solve the problem of aligning source sequences of different lengths and performing translation while aligning the source language; this led to a significant improvement in the performance of neural machine translation models and demonstrated the usefulness of the attention mechanism in natural language processing. The attention mechanism was introduced on the basis of [20], and sentence modeling was implemented using a convolutional neural network augmented with attention [21]. Later, the Google machine translation team fused the attention mechanism with the sequence-transduction network [22] to achieve better text translation; unlike the previous combinations of attention with recurrent or convolutional neural networks, this proved the feasibility of fusing multiple attention mechanisms.…”
Section: Attentional Mechanisms
confidence: 99%
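The core operation that [22] (the Transformer) stacks into multi-head attention is scaled dot-product attention, which computes the source-alignment weights described above. A minimal NumPy sketch, purely illustrative and not the cited implementation (the toy shapes and data are assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # alignment scores
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over source positions
    return weights @ V, weights

# Toy example: 2 target positions attending over 3 source positions, d_k = 4.
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, weights = scaled_dot_product_attention(Q, K, V)
```

Each row of `weights` is a distribution over source positions, which is how attention realizes the soft alignment of variable-length source sequences.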
“…The current grouting studies basically address small and medium-sized shields and do not consider the effect of gravity on the grouting pressure, but for large-diameter shields this effect cannot be ignored. Therefore, it is necessary to analyze the construction process of post-wall grouting of a large-diameter slurry (mud-water) shield [15].…”
Section: Introduction
confidence: 99%
“…A context-aware recommendation system for estimating user preferences by sequential predictions was proposed in [57]. Moreover, a context-aware recommendation model named paragraph vector matrix factorization, which integrates the unsupervised learning of paragraph embeddings into probabilistic matrix factorization, was proposed in [58]. The proposed model captures the semantic information of the paragraph and improves the prediction accuracy of the ratings.…”
Section: Matrix Factorization and Context-Aware Recommender System
confidence: 99%
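The integration described for [58] can be sketched as probabilistic matrix factorization whose item factors are regularized toward paragraph embeddings of the item text. The function name, dimensions, and hyperparameters below are illustrative assumptions, not the paper's actual model or code:

```python
import numpy as np

def pmf_with_paragraph_priors(R, M, D, lr=0.01, reg=0.05, epochs=2000):
    """R: rating matrix (0 = missing), M: observed mask, D: item paragraph embeddings."""
    n_users, _ = R.shape
    k = D.shape[1]
    rng = np.random.default_rng(0)
    U = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
    V = D.copy()                                  # item factors start at embeddings
    for _ in range(epochs):
        E = M * (R - U @ V.T)                     # error on observed ratings only
        U += lr * (E @ V - reg * U)               # gradient step on user factors
        V += lr * (E.T @ U - reg * (V - D))       # prior pulls item factors toward D
    return U, V
```

The term `reg * (V - D)` is what couples the text side to the rating side: items with few ratings fall back on the semantics of their paragraphs, which is the stated remedy for sparsity.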