2018
DOI: 10.1109/tkde.2018.2821174

A General Framework for Implicit and Explicit Social Recommendation

Cited by 38 publications (16 citation statements)
References 25 publications
“…Alternating Least Squares has been widely applied to many models, especially matrix completion/approximation. Nonnegative matrix factorization (NMF) is an important topic in matrix completion/approximation, and it rose to prominence when applied to recommendation systems during the Netflix contest [36][37][38]. NMF has since developed multiple variants, including (i) regularization based on L1-norms, L2-norms [39], L2,1-norms, nuclear norms, mixed norms, and graphs; (ii) different loss functions such as the Huber loss, the correntropy-induced metric [40,41], Cauchy functions [42], and truncated Cauchy functions [43,44]; and (iii) many more, such as projected gradient NMF [45], projective NMF [46,47], and orthogonal NMF [48].…”
Section: NMF Impute
confidence: 99%
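The alternating-least-squares scheme the excerpt refers to can be illustrated with a minimal sketch: fix one factor, solve a least-squares problem for the other, project onto the nonnegative orthant, and alternate. The function name `nmf_als` and the clip-to-nonnegative projection are illustrative assumptions, not the algorithm of any cited paper.

```python
import numpy as np

def nmf_als(X, rank, n_iter=200, eps=1e-9, seed=0):
    """Nonnegative matrix factorization X ~ W @ H via alternating least
    squares, clipping negatives to keep both factors nonnegative.
    Illustrative sketch only -- not taken from the cited papers."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Solve min ||W H - X|| for H with W fixed, then project onto >= eps.
        H = np.clip(np.linalg.lstsq(W, X, rcond=None)[0], eps, None)
        # Symmetric step: solve for W with H fixed, then project.
        W = np.clip(np.linalg.lstsq(H.T, X.T, rcond=None)[0].T, eps, None)
    return W, H

# A rank-1 nonnegative matrix is recovered almost exactly.
X = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])
W, H = nmf_als(X, rank=1)
print(np.round(W @ H, 2))
```

The regularized and alternative-loss variants listed in the excerpt modify either the least-squares subproblem (adding norm penalties) or the reconstruction loss itself; the alternating structure stays the same.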
“…The role of social networks (SNs) in the recommendation procedure is examined in a large-scale field experiment that randomized exposure to friend-information signals to ascertain the relative roles of weak and strong ties [3], [12]. In addition, the work in [13] develops a way to combine SN information with collaborative filtering (CF) for improved recommendation accuracy, via methods that select neighbors and boost data from friends using user preference ratings and user SN relations collected from SN sources.…”
Section: Related Work
confidence: 99%
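A common way to fold SN information into a CF model, in the spirit of the work the excerpt describes, is to add a social regularizer that pulls each user's latent vector toward the average of their friends' vectors during matrix factorization. The sketch below is a generic illustration of that idea; the function name `social_mf` and all hyperparameters are assumptions, not the method of [13].

```python
import numpy as np

def social_mf(ratings, friends, n_users, n_items, rank=4,
              lr=0.05, lam=0.01, beta=0.1, epochs=500, seed=0):
    """SGD matrix factorization with a social regularizer that keeps each
    user's latent factors close to the mean of their friends' factors.
    Generic sketch -- not the algorithm of any specific cited paper."""
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n_users, rank))
    V = 0.1 * rng.standard_normal((n_items, rank))
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - U[u] @ V[i]
            # Standard L2-regularized MF gradients.
            gu = -err * V[i] + lam * U[u]
            gv = -err * U[u] + lam * V[i]
            # Social term: penalize distance to the friend average.
            if friends.get(u):
                gu += beta * (U[u] - np.mean([U[f] for f in friends[u]], axis=0))
            U[u] -= lr * gu
            V[i] -= lr * gv
    return U, V

# Two users who are friends and one lone user, rating three items.
ratings = [(0, 0, 5.0), (0, 1, 1.0), (1, 0, 4.0), (2, 2, 3.0)]
friends = {0: [1], 1: [0], 2: []}
U, V = social_mf(ratings, friends, n_users=3, n_items=3)
print(round(float(U[0] @ V[0]), 1))
```

The social term acts only on users with friends, so cold-start users with social links borrow signal from their neighborhood while isolated users fall back to plain CF.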
“…Paradoxically, focusing only on the noise problem makes the data more sparse [6]; conversely, focusing only on the sparsity problem leaves the data noisier [16], [17]. Therefore, a few studies attempt to solve both shortcomings simultaneously [9], [15], identifying reliable trust relationships by finding explicit connections between social relationships in a supervised manner. For example, Yu et al. [15] proposed a deep adversarial framework that uses multiple triangular heterogeneous relationships between users and trusted friends to generate explicit user-trust features; a GAN then generates reliable trust data based on these explicit trust features.…”
Section: Introduction
confidence: 99%