Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021
DOI: 10.18653/v1/2021.acl-long.383
Personalized Transformer for Explainable Recommendation

Abstract: Personalization of natural language generation plays a vital role in a large spectrum of tasks, such as explainable recommendation, review summarization and dialog systems. In these tasks, user and item IDs are important identifiers for personalization. Transformer, which is demonstrated with strong language modeling capability, however, is not personalized and fails to make use of the user and item IDs since the ID tokens are not even in the same semantic space as the words. To address this problem, we present…

Cited by 73 publications (58 citation statements)
References 30 publications
“…There exist various types of explanation style, such as pre-defined templates [24,49,60], item features [20,54], ranked text [5,12,27], image visualizations [10], knowledge graph paths [1,18,55,56], and reasoning rules [7,46,63], but in this work we focus on generating natural language explanations. Previous works [6,13,26,57] mostly rely on RNN, e.g., LSTM [22] and GRU [14], or unpretrained Transformer [29], leaving the potentially more effective pre-trained models under-explored, which motivates this work.…”
Section: Explainable Recommendation
confidence: 99%
“…Personalization of natural language generation plays a vital role in a large spectrum of tasks, such as explainable recommendation [6,26,29], review summarization [23], and dialog systems [59,61]. In these tasks, user and item IDs are important identifiers for personalization.…”
Section: Personalized Natural Language Generation
confidence: 99%
“…Explainable AI has been an important topic in recommender systems [5,6,13,36,41,46,47], natural language processing [8,16,20] and computer vision [7,10,15,25,38]. To improve the transparency of deep neural networks, many explanation techniques have been proposed in recent years.…”
Section: Related Work 2.1 Explainability in Deep Learning and AI
confidence: 99%