2022
DOI: 10.1016/j.ipm.2022.103067
Self-Supervised Learning for Conversational Recommendation

Cited by 14 publications (7 citation statements)
References 41 publications
“…The backbone of DNAGPT is a transformer-based [29] auto-regressive [30] decoder with the masked self-attention [31] module. To better deal with numerical information, we pre-train the DNA sequence and numerical property end to end in a single model.…”
Section: DNAGPT Architecture, 2.1 Model Structure
confidence: 99%
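
The mechanism this excerpt names is the standard decoder-only setup: auto-regressive generation is enforced by a causal mask inside self-attention, so each token can only attend to earlier positions. A minimal PyTorch sketch of that masking step follows (illustrative only, not DNAGPT's actual implementation; all function and variable names below are hypothetical):

```python
# Minimal sketch of causal (masked) self-attention, the mechanism used by
# auto-regressive transformer decoders. Not DNAGPT's actual code.
import math
import torch
import torch.nn.functional as F

def causal_self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model); w_*: (d_model, d_model) projection weights."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / math.sqrt(k.shape[-1])        # (seq_len, seq_len)
    # Mask future positions so token i attends only to tokens <= i.
    mask = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v             # (seq_len, d_model)

# Usage with toy dimensions and random weights.
d = 16
x = torch.randn(5, d)
w_q, w_k, w_v = (torch.randn(d, d) for _ in range(3))
out = causal_self_attention(x, w_q, w_k, w_v)        # shape (5, 16)
```

The mask is the only difference from plain bidirectional self-attention: setting future scores to -inf zeroes their softmax weight, which is what makes left-to-right generation well defined.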
“…Conversational search (CS) is applied in many fields such as recommendation systems, e-health and personality recognition (Aliannejadi et al., 2020; Velicia-Martin et al., 2021; Shen et al., 2023). Deep learning solutions for CS have replaced more traditional rule-based approaches (Onal et al., 2018; Gao, Galley & Li, 2018; Li et al., 2022). A main challenge is how to ensure conversational context-awareness (Vtyurina et al., 2017).…”
Section: Literature Review
confidence: 99%
“…In contrastive learning, pairs of positive and negative examples are used to learn representations by comparing them [30]. Typically, positive pairs consist of semantically related neighbors, while negatives are unrelated samples.…”
Section: Contrastive Learning
confidence: 99%
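
As a concrete instance of that compare-positives-against-negatives setup, here is a minimal InfoNCE-style contrastive loss sketch (illustrative PyTorch; the cited work may use a different formulation, and the names below are hypothetical). Each row pairs an embedding with its semantically related neighbor, and every other row in the batch serves as a negative:

```python
# Minimal sketch of an InfoNCE-style contrastive objective with in-batch
# negatives. Assumes z1[i] and z2[i] embed semantically related neighbors.
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.1):
    """z1, z2: (batch, dim) embeddings of positive pairs."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.T / temperature     # (batch, batch) similarity matrix
    # Diagonal entries are positive pairs; off-diagonals act as negatives.
    targets = torch.arange(z1.shape[0], device=z1.device)
    return F.cross_entropy(logits, targets)

# Usage with random toy embeddings.
z1, z2 = torch.randn(8, 32), torch.randn(8, 32)
loss = info_nce_loss(z1, z2)
```

Treating the diagonal of the similarity matrix as the correct class reduces the contrastive comparison to standard cross-entropy, which is why in-batch negatives are a popular design choice: they come for free with each batch.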