Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021
DOI: 10.1145/3459637.3482243
Contrastive Learning of User Behavior Sequence for Context-Aware Document Ranking

Abstract: Context information in search sessions has proven to be useful for capturing user search intent. Existing studies explored user behavior sequences in sessions in different ways to enhance query suggestion or document ranking. However, a user behavior sequence has often been viewed as a definite and exact signal reflecting a user's behavior. In reality, it is highly variable: user's queries for the same intent can vary, and different documents can be clicked. To learn a more robust representation of the user be…
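The abstract's core idea — that the same search intent can surface as varied queries and clicks, so a robust session representation should be learned contrastively over augmented views of the behavior sequence — can be illustrated with a minimal InfoNCE sketch. The toy `encode` function and the masking augmentation below are illustrative stand-ins, not the paper's actual encoder or augmentation strategies:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(seq, dim=8):
    # Toy encoder: mean of fixed per-token random embeddings, a stand-in
    # for a learned session encoder (purely illustrative).
    vecs = []
    for tok in seq:
        seed = sum(ord(c) for c in tok)  # stable per-token seed
        vecs.append(np.random.default_rng(seed).normal(size=dim))
    v = np.mean(vecs, axis=0)
    return v / np.linalg.norm(v)

def mask_augment(seq, p=0.3):
    # One possible augmentation: randomly drop behaviors, mimicking the
    # natural variability of queries/clicks issued for the same intent.
    kept = [t for t in seq if rng.random() > p]
    return kept or seq  # never return an empty sequence

def info_nce(anchors, positives, tau=0.1):
    # InfoNCE: each view should be most similar to the other view of the
    # same session among all in-batch candidates.
    sims = anchors @ positives.T / tau           # (B, B) similarities
    sims -= sims.max(axis=1, keepdims=True)      # numerical stability
    logp = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(logp))

sessions = [
    ["cheap flights", "flights to rome", "click:doc3"],
    ["python tutorial", "learn python", "click:doc7"],
    ["weather tokyo", "tokyo forecast", "click:doc1"],
]
a = np.stack([encode(mask_augment(s)) for s in sessions])
b = np.stack([encode(mask_augment(s)) for s in sessions])
loss = info_nce(a, b)
print(float(loss))
```

Minimizing this loss pulls two augmented views of the same session together while pushing apart views of other sessions, which is what makes the learned representation less sensitive to the surface variability the abstract describes.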

Cited by 29 publications (12 citation statements). References 63 publications.
“…Since then, however, it has been an important resource to the research community (e.g., [25,30]). Even to this day, the AOL Query Log continues to enable studies in analysis of data leaks [13], search autocompletion [14], weak supervision for adhoc search [7,21], search result personalisation [9,16], and session-based search [1,2,6,28,33].…”
Section: Introduction (mentioning)
confidence: 99%
“…Our model is also related to contrastive learning, which has boosted unsupervised representation learning in recommendation [31,40], computer vision [21,30], and natural language processing [29,43]. In text generation, Logeswaran and Lee [27] proposed to learn better sentence representations by using a classifier to distinguish context sentences from other contrastive sentences.…”
Section: Related Work (mentioning)
confidence: 99%
“…Most existing personalized approaches do not involve a well-designed pre-training or self-supervised learning (SSL) process, merely utilizing the powerful learning ability of Transformer-like architectures. Recently, some researchers focused on designing pre-training objectives for personalized search (Zhou et al, 2021b) or session search (Zhu et al, 2021). Their work have shown the great potential of applying contrastive learning in encoding user search history and the content.…”
Section: Personalized Search (mentioning)
confidence: 99%