2020
DOI: 10.3390/app10072441

Deep Learning Architecture for Collaborative Filtering Recommender Systems

Abstract: This paper provides an innovative deep learning architecture to improve collaborative filtering results in recommender systems. It exploits the potential of the reliability concept to raise the quality of predictions and recommendations by incorporating prediction errors (reliabilities) into the deep learning layers. The underlying idea is to recommend highly predicted items that have also been found to be reliable. We use the deep learning architecture to extract the existing non-linear relations between predictions…
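The abstract's core idea, recommending items that are both highly predicted and reliable, can be sketched as a simple re-ranking step. This is an illustrative assumption, not the paper's actual architecture: the `reliable_recommend` function and the multiplicative combination rule are hypothetical.

```python
# Hedged sketch: re-rank candidate items by combining a predicted rating with
# a reliability score (an estimate of how trustworthy that prediction is).
# The combination rule (simple product) is an illustrative assumption.
def reliable_recommend(predictions, reliabilities, top_k=3):
    """predictions: dict item -> predicted rating (higher is better)
    reliabilities: dict item -> reliability in [0, 1] (higher = lower expected error)
    Returns the top_k items ranked by reliability-weighted prediction."""
    scores = {item: predictions[item] * reliabilities[item] for item in predictions}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

preds = {"A": 4.8, "B": 4.9, "C": 3.0, "D": 4.5}
rels  = {"A": 0.9, "B": 0.4, "C": 0.95, "D": 0.8}
print(reliable_recommend(preds, rels, top_k=2))  # item B is demoted: high rating, low reliability
```

Note how item B, despite having the highest raw prediction, drops out of the top 2 because its prediction is unreliable; this is the qualitative behavior the abstract describes.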

Cited by 88 publications (45 citation statements)
References 34 publications
“…According to their underlying principles, CF algorithms can be divided into two main types: memory-based CF and model-based CF [3,5]. The former uses a user's historical information to find neighbors highly similar to the user or item, and predicts the user's purchasing behavior from the neighbors' aggregated item ratings. Memory-based CF suffers from data sparsity and extends poorly to new data [13]. Model-based CF, by contrast, trains a model on users' item ratings and then predicts the probability that a user will make a purchase [14].…”
Section: Related Work 2.1 Collaborative Filtering Algorithm
confidence: 99%
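The memory-based CF described in the statement above can be sketched as user-based nearest-neighbor prediction: measure similarity between users on co-rated items, then predict an unknown rating as a similarity-weighted average of neighbors' ratings. The rating matrix and all names below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of memory-based (user-based) collaborative filtering:
# neighbors are found via cosine similarity over historical ratings, and a
# target user's unknown rating is predicted from the neighbors' ratings.
R = np.array([            # rows = users, cols = items, 0 = unrated
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 4, 4],
])

def cosine_sim(u, v):
    mask = (u > 0) & (v > 0)          # compare only co-rated items
    if not mask.any():
        return 0.0
    return float(u[mask] @ v[mask] /
                 (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

def predict(R, user, item):
    # similarity to every other user who has rated the target item
    sims = np.array([cosine_sim(R[user], R[v]) if v != user and R[v, item] > 0 else 0.0
                     for v in range(len(R))])
    if sims.sum() == 0:
        return 0.0                    # no usable neighbors (data sparsity problem)
    return float(sims @ R[:, item] / sims.sum())  # similarity-weighted average

print(round(predict(R, user=0, item=2), 2))
```

The `if sims.sum() == 0` branch makes the sparsity weakness mentioned in the statement concrete: when no similar user has rated the item, memory-based CF has nothing to predict from.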
“…Slicing the input data is the key to realizing the multi-granularity deep forest. For an input of length L and a "sliding window" of size W moved with step 1, each slice has size W and each sample yields L - W + 1 slices. After multi-granularity scanning, all slices are fed into the random forests; with n_RF random forests, "sliding window" sampling generates n_RF x (L - W + 1) class vectors, and the final output after multi-granularity scanning is the concatenation of these class vectors.…”
Section: Fig 6 Structural Illustration of Cascaded Deep Forest
confidence: 99%
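The sliding-window slicing behind multi-granularity scanning can be sketched directly. The window size, forest count, and class count below are illustrative assumptions, not values from the citing paper.

```python
import numpy as np

# Sketch of "sliding window" slicing as used in multi-granularity scanning
# (deep forest): an input of length L and a window of size W with step 1
# yields L - W + 1 slices; each slice would then be fed to the random
# forests, whose per-slice class vectors are concatenated.
def sliding_slices(x, window):
    L = len(x)
    return np.array([x[i:i + window] for i in range(L - window + 1)])

x = np.arange(8)                      # L = 8
slices = sliding_slices(x, window=3)  # W = 3
print(slices.shape)                   # (L - W + 1, W) = (6, 3)

# With n_rf forests each emitting a C-class probability vector per slice,
# the transformed representation has (L - W + 1) * n_rf * C features.
n_rf, C = 2, 3
print(slices.shape[0] * n_rf * C)     # 36
```

Repeating this with several window sizes (multiple granularities) and concatenating the results is what gives the scanning stage its name.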
“…Today, as the concept of deep learning has become more important, several researchers have begun to test its usefulness in collaborative filtering approaches in order to achieve better results [20]. For example, Van den Oord et al. made use of a deep convolutional neural network to give music recommendations [21].…”
Section: Introduction
confidence: 99%