Contrastive Collaborative Filtering for Cold-Start Item Recommendation
Preprint, 2023
DOI: 10.48550/arxiv.2302.02151

Zhihui Zhou,
Lilin Zhang,
Ning Yang

Abstract: The cold-start problem is a long-standing challenge in recommender systems. As a promising solution, content-based generative models usually project a cold-start item's content onto a warm-start item embedding to capture collaborative signals from item content so that collaborative filtering can be applied. However, since the training of the cold-start recommendation models is conducted on warm datasets, the existent methods face the issue that the collaborative embeddings of items will be blurred, which signi…


Cited by 3 publications (6 citation statements)
References 32 publications
“…(Wei et al 2021) employ a generative adversarial network (GAN) (Goodfellow et al 2020) and contrastive learning, respectively, to align the collaborative feature and the content feature of the same warm item. CCFCRec (Zhou, Zhang, and Yang 2023), by contrast, samples two warm items with co-occurring users and aligns the content feature of one item to the collaborative feature of the other. However, since cold items have neither collaborative features nor co-occurring users with warm items, these item-level alignment methods do not account for this condition in the training stage, leading to limited performance on cold items that first appear in the test stage, especially low-similarity cold items, i.e., cold items whose multimedia information is less similar to that of warm items.…”
Section: And CLCRec (citation type: mentioning; confidence: 99%)
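The alignment strategy described in this statement can be sketched as a standard InfoNCE contrastive loss between an item's content embedding and a collaborative embedding, with the rest of the batch serving as negatives. A minimal NumPy sketch under assumed simplifications (in-batch negatives, cosine similarity, toy data); this is illustrative, not the cited papers' actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def info_nce(content, collab, tau=0.1):
    """InfoNCE loss aligning row i of `content` with row i of `collab`;
    all other rows in the batch act as negatives."""
    # L2-normalise so the dot product is cosine similarity
    c = content / np.linalg.norm(content, axis=1, keepdims=True)
    v = collab / np.linalg.norm(collab, axis=1, keepdims=True)
    logits = c @ v.T / tau                       # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # positive pairs on the diagonal

# toy batch: 4 items, 8-d content and collaborative embeddings;
# the collaborative embeddings are near-copies, so the pairs are well aligned
content = rng.normal(size=(4, 8))
collab = content + 0.05 * rng.normal(size=(4, 8))
loss = info_nce(content, collab)
```

In CCFCRec's formulation the positive pair crosses two co-occurring warm items rather than the same item, but the loss shape is the same: only the pairing of rows changes.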
“…In this paper, we focus on the second category, i.e., the zero-shot item cold-start problem, where cold items arrive with no historical interactions. To address this problem, existing methods introduce an item's multimedia information to obtain a content feature as the representation of the cold item (Volkovs, Yu, and Poutanen 2017; Du et al 2020; Wang et al 2021; Wei et al 2021; Zhou, Zhang, and Yang 2023). Some of these methods integrate an item's collaborative feature into its content feature and attempt to infer the integrated feature by randomly corrupting the collaborative features of a subset of warm items in the training stage, from the perspective of robust learning (Volkovs, Yu, and Poutanen 2017; Du et al 2020).…”
Section: Related Work (citation type: mentioning; confidence: 99%)
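The robust-learning trick this statement describes, randomly corrupting the collaborative features of some warm items during training so the model learns to fall back on content alone, can be sketched in a few lines. A hypothetical NumPy helper, with the drop rate and masking scheme as illustrative assumptions rather than the cited papers' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

def corrupt_collab(collab, drop_rate=0.3, rng=rng):
    """Zero out the collaborative embedding of a random subset of warm items,
    simulating the cold-item case (no collaborative signal) during training.
    `drop_rate` is an assumed hyperparameter, not a value from the papers."""
    keep = rng.random(collab.shape[0]) >= drop_rate  # True = embedding kept
    return collab * keep[:, None], keep

# toy batch: 6 warm items with 4-d collaborative embeddings
collab = rng.normal(size=(6, 4))
corrupted, keep = corrupt_collab(collab)
```

A model trained on `corrupted` inputs sees some items exactly as a cold item would appear at test time, which is the robustness argument the quoted passage makes.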
“…CMP-PSP [50] effectively leverages contrastive multi-view learning and pseudo-siamese networks to mitigate data sparsity and noisy interactions. CCFCRec [58] adopts contrastive collaborative filtering for cold-start item recommendation, applying contrastive learning to transfer co-occurrence signals to the content CF module. KACL [40] performs contrastive learning across the user-item interaction view and the knowledge-graph view to incorporate the knowledge graph into the recommendation while eliminating the noise it may introduce.…”
Section: Contrastive Learning for Recommendation (citation type: mentioning; confidence: 99%)