2020
DOI: 10.1016/j.sigpro.2019.107366

Deep latent factor model for collaborative filtering

Abstract: Latent factor models have been used widely in collaborative filtering-based recommender systems. In recent years, deep learning has been successful in solving a wide variety of machine learning problems. Motivated by the success of deep learning, we propose a deeper version of the latent factor model. Experiments on benchmark datasets show that our proposed technique significantly outperforms all state-of-the-art collaborative filtering techniques.
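The record does not reproduce the paper's formulation, but a deep latent factor model is commonly realized as a product of several latent factor matrices fitted to the observed ratings. The NumPy sketch below is an illustrative assumption, not the authors' exact method: it factorizes the rating matrix as R ≈ U1·U2·…·V and trains all factors by gradient descent on the observed entries only; the function name, layer sizes, and hyperparameters are hypothetical.

```python
# Minimal sketch of a deep latent factor model for collaborative filtering.
# Assumption (not the paper's exact formulation): the rating matrix R is
# approximated by a product of several latent factor matrices,
#   R ≈ U1 @ U2 @ ... @ Uk @ V,
# trained by full-batch gradient descent on the observed entries only.
import numpy as np

def deep_latent_factor(R, mask, layer_dims, lr=0.01, reg=0.02, epochs=500, seed=0):
    """R: (m, n) rating matrix; mask: 1 where a rating is observed.
    layer_dims: latent sizes, e.g. [8, 4] gives factors (m,8), (8,4), (4,n)."""
    rng = np.random.default_rng(seed)
    m, n = R.shape
    dims = [m] + list(layer_dims) + [n]
    factors = [rng.normal(scale=0.3, size=(dims[i], dims[i + 1]))
               for i in range(len(dims) - 1)]

    for _ in range(epochs):
        # Forward pass: running products F0, F0@F1, ..., F0@...@Fk.
        prods = [factors[0]]
        for F in factors[1:]:
            prods.append(prods[-1] @ F)
        pred = prods[-1]
        err = mask * (pred - R)              # error only on observed ratings

        # Suffix products F_{i+1}@...@F_{k}, needed for each factor's gradient.
        suffix = [None] * len(factors)
        acc = np.eye(n)
        for i in range(len(factors) - 1, -1, -1):
            suffix[i] = acc
            acc = factors[i] @ acc

        # Gradient of 0.5*||mask*(pred-R)||^2 + 0.5*reg*sum||Fi||^2 w.r.t. Fi.
        for i, F in enumerate(factors):
            prefix = prods[i - 1] if i > 0 else np.eye(m)
            g = prefix.T @ err @ suffix[i].T + reg * F
            factors[i] -= lr * g
    return pred

# Toy usage: 6 users x 5 items, zeros mark missing ratings.
R = np.array([[5, 3, 0, 1, 4],
              [4, 0, 0, 1, 3],
              [1, 1, 0, 5, 4],
              [1, 0, 0, 4, 4],
              [0, 1, 5, 4, 0],
              [2, 1, 3, 0, 5]], dtype=float)
mask = (R > 0).astype(float)
print(np.round(deep_latent_factor(R, mask, layer_dims=[8, 4]), 2))
```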

Cited by 47 publications (29 citation statements)
References 34 publications
“…Memory-based CF has a data sparsity problem and weak extensibility [13]. Different from memory-based CF, model-based CF trains a model mainly on users' ratings of commodities and then predicts user purchasing probability [14]. At present, many model-based CF algorithms have been put forward, among which matrix decomposition has gradually become the mainstream method by virtue of its transformation of high-dimensional sparse user rating data and its excellent extensibility [12].…”
Section: Collaborative Filtering Algorithm
confidence: 99%
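To make the quoted contrast concrete, the sketch below shows the plain matrix-decomposition approach the statement calls the mainstream model-based CF method: a Funk-SVD-style factorization trained by SGD on the observed entries of the sparse user rating matrix. All names and hyperparameters are illustrative assumptions, not taken from the cited works.

```python
# Sketch of model-based CF via matrix decomposition (Funk-SVD style):
# each observed rating r_ui is approximated by a dot product P[u] @ Q[i],
# with the factors learned by SGD over the observed ratings only.
import numpy as np

def matrix_factorization_cf(ratings, n_factors=8, lr=0.01, reg=0.05, epochs=50, seed=0):
    """ratings: list of (user, item, rating) triples; returns (P, Q)."""
    rng = np.random.default_rng(seed)
    n_users = max(u for u, _, _ in ratings) + 1
    n_items = max(i for _, i, _ in ratings) + 1
    P = rng.normal(scale=0.1, size=(n_users, n_factors))   # user factors
    Q = rng.normal(scale=0.1, size=(n_items, n_factors))   # item factors
    for _ in range(epochs):
        for u, i, r in ratings:
            pu, qi = P[u].copy(), Q[i].copy()
            err = r - pu @ qi
            P[u] += lr * (err * qi - reg * pu)              # SGD step on user factor
            Q[i] += lr * (err * pu - reg * qi)              # SGD step on item factor
    return P, Q

# Toy usage: a handful of observed ratings for 4 users and 3 items.
obs = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 2, 2.0), (3, 1, 1.0), (3, 2, 4.0)]
P, Q = matrix_factorization_cf(obs)
print("predicted rating of user 2 for item 0:", round(float(P[2] @ Q[0]), 2))
```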
“…Memory-based CF has a data sparsity problem and weak extensibility [13]. Different from memory-based CF, model-based CF trains a model mainly on users' ratings of commodities and then predicts user purchasing probability [14]. At present, many model-based CF algorithms have been put forward, among which matrix decomposition has gradually become the mainstream method by virtue of its transformation of high-dimensional sparse user rating data and its excellent extensibility [12].…”
Section: Related Work 2.1 Collaborative Filtering Algorithm
confidence: 99%
“…Memory-based CF has a data sparsity problem and weak extensibility [9]. Different from memory-based CF, model-based CF trains a model mainly on users' ratings of commodities and then predicts user purchasing probability [10]. At present, many model-based CF algorithms have been put forward, among which matrix decomposition has gradually become the mainstream method by virtue of its transformation of high-dimensional sparse user rating data and its excellent extensibility [8].…”
Section: Related Work 2.1 Collaborative Filtering Algorithm
confidence: 99%
“…N L l    (9) Slicing input data is the key to realizing multi-granularity deep forest. Assume "sliding window" size as   (10) Slice size is lL WW  , and slice number of the whole dataset is: (11) After multi-granularity scanning, the slices of all input data are input into the random forest If there are RF n random forests, the quantity of class vectors generated through "sliding window" sampling is: (13) The final output data size after multi-granularity scanning is: (14) Fig . 9 shows the overall process of cascaded deep forest with three "sliding windows", the sizes of which are d/16, d/8 and d/4, respectively, where d represents feature number.…”
Section: Fig. 6 Structural Illustration of Cascaded Deep Forest
confidence: 99%
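Because equations (9)–(14) are not recoverable from the extract above, the following sketch only reconstructs the standard multi-grained scanning arithmetic of deep forest under stated assumptions: stride-1 sliding windows of sizes d/16, d/8 and d/4 over a length-d feature vector, with each slice scored by n_rf random forests over n_classes classes. It is not the citing paper's exact formulation, and the function and parameter names are hypothetical.

```python
# Sketch of multi-granularity ("multi-grained") scanning as used in deep forest.
# Assumptions: stride-1 sliding windows over a 1-D feature vector of length d;
# each slice yields one class vector per random forest, and the class vectors
# are concatenated into the transformed representation.
import numpy as np

def multi_grained_scan(x, window_sizes, n_rf=2, n_classes=3):
    """Return per-window (slice count, transformed length) and the total length
    of the concatenated class-vector representation produced by the scan."""
    report = {}
    total_len = 0
    for w in window_sizes:
        slices = np.lib.stride_tricks.sliding_window_view(x, w)  # shape (d-w+1, w)
        n_slices = slices.shape[0]                               # d - w + 1 slices
        out_len = n_rf * n_slices * n_classes                    # class vectors per window size
        report[w] = (n_slices, out_len)
        total_len += out_len
    return report, total_len

d = 64                                   # feature number
x = np.random.default_rng(0).normal(size=d)
windows = [d // 16, d // 8, d // 4]      # window sizes d/16, d/8, d/4 as in the quote
report, total = multi_grained_scan(x, windows)
for w, (n_slices, out_len) in report.items():
    print(f"window {w:2d}: {n_slices} slices -> {out_len} transformed features")
print("total transformed feature length:", total)
```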