Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management 2014
DOI: 10.1145/2661829.2662052
Optimizing Multi-Relational Factorization Models for Multiple Target Relations

Cited by 16 publications (26 citation statements). References 17 publications.
“…For each of the 12 features, we train a logistic regression classifier (we use logistic regression as a proof-of-concept), resulting in 12 classification models for predicting an individual as either depressed or non-depressed (anxious or non-anxious). We consider an additional predictive model, the most accurate HIN-based RS model from our previous work (Section 1) [28], DMF [13]. The above dynamic network-based classification models need to be superior to DMF for it to make sense to incorporate in the future the dynamic network data into the HIN-based framework, per the discussion in Section 1.…”
Section: Task 2: Do Individuals Who Have Different Evolving Network P…
confidence: 99%
“…In addition to these graph-oriented recommendation algorithms, we used the original DMF model restricted to only the direct network relations, to demonstrate the benefit of adding extended paths. Finally, we also used another multi-relational factorization model, CATSMF, using direct links of the network [6]. The CATSMF model [6] was introduced to improve the efficiency of the DMF model by limiting the parameters needed for the auxiliary relations by coupling them together.…”
Section: Comparison Algorithms
confidence: 99%
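The coupling idea in the excerpt above, where one set of entity factors is shared across the target and auxiliary relations rather than each relation carrying its own independent parameters, can be sketched as follows. This is a minimal, hypothetical NumPy illustration (alternating least squares on noiseless toy data), not the actual DMF or CATSMF implementation; all names, dimensions, and the `alpha` weight are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, n_tags, k = 20, 15, 10, 4

# Toy data: a target relation (user-item) and an auxiliary relation
# (user-tag), both generated from the same hidden user factors, so a
# joint low-rank model can recover them exactly.
U_true = rng.normal(size=(n_users, k))
V_true = rng.normal(size=(n_items, k))
W_true = rng.normal(size=(n_tags, k))
R_target = U_true @ V_true.T
R_aux = U_true @ W_true.T

def joint_factorize(R_t, R_a, k, alpha=0.5, iters=30):
    """Alternating least squares where the user factor matrix U is
    coupled (shared) between the target and auxiliary relations."""
    V = rng.normal(size=(R_t.shape[1], k))
    W = rng.normal(size=(R_a.shape[1], k))
    for _ in range(iters):
        # U is fit against both relations, the auxiliary one weighted
        # by alpha: U = (R_t V + alpha R_a W)(V'V + alpha W'W)^-1.
        A = V.T @ V + alpha * (W.T @ W)
        B = R_t @ V + alpha * (R_a @ W)
        U = np.linalg.solve(A.T, B.T).T
        # V and W are each fit against only their own relation.
        V = np.linalg.solve(U.T @ U, (R_t.T @ U).T).T
        W = np.linalg.solve(U.T @ U, (R_a.T @ U).T).T
    return U, V, W

U, V, W = joint_factorize(R_target, R_aux, k)
rel_err = np.linalg.norm(R_target - U @ V.T) / np.linalg.norm(R_target)
```

Because the auxiliary relation reuses `U`, it contributes information about users without adding a second full set of user parameters, which is the efficiency argument made for CATSMF in the excerpt.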
“…This method has shown state-of-the-art performance on relational datasets [26], although the number of relation types is usually modest (fewer than 100). Extensions have recently been proposed in [9], [15] to handle multi-relational data with a large number of relation types.…”
Section: Multi-relational Learning
confidence: 99%
“…The latter treat multi-relational information (i.e., a multigraph) as constraints on the learning process, whereas ours casts multi-relational information into multi-relational features that in turn serve as additional information to be augmented into the learning process. Finally, CSL is more generic/flexible than the matrix/tensor factorization methods [7], [9], [13], [15], [26], [29]. These methods rely on a low-rank assumption for matrix/tensor decomposition, and do not yet cater for explicit (i.e., non-latent) features defined for each entity.…”
Section: Our Approach
confidence: 99%
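The low-rank assumption referred to in the last excerpt can be illustrated with a truncated SVD. This is a generic sketch, not any of the cited methods; the matrix sizes, rank, and noise level are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# A relation matrix that is exactly rank 3, observed with small noise.
A = rng.normal(size=(30, 3)) @ rng.normal(size=(3, 25))
A_noisy = A + 0.01 * rng.normal(size=A.shape)

# Truncated SVD: keep only the top-r singular triplets. Under the
# low-rank assumption this recovers the underlying relation well.
U, s, Vt = np.linalg.svd(A_noisy, full_matrices=False)
r = 3
A_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]

rel_err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
```

The point of contrast in the excerpt is that such a decomposition models every entity purely through latent factors, with no slot for explicit, hand-defined features per entity.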