Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval 2019
DOI: 10.1145/3331184.3331204

Domain Adaptation for Enterprise Email Search

Abstract: In the enterprise email search setting, the same search engine often powers multiple enterprises from various industries: technology, education, manufacturing, etc. However, using the same global ranking model across different enterprises may result in suboptimal search quality, due to the corpora differences and distinct information needs. On the other hand, training an individual ranking model for each enterprise may be infeasible, especially for smaller institutions with limited data. To address this data c…

Cited by 17 publications (9 citation statements, published 2020–2024) · References 44 publications
“…Recently, Yilmaz et al. (2019) have shown that training models on general-domain corpora adapts well to new domains without targeted supervision. Another common technique for adaptation to specialized domains is to learn cross-domain representations (Cohen et al., 2018; Tran et al., 2019). Our work is more aligned with methods like Yilmaz et al. (2019), which use general-domain resources to build neural models for new domains, though via a different technique: data augmentation vs. model transfer.…”
Section: Related Work
confidence: 99%
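To make the cross-domain-representation idea in the excerpt above concrete, here is a minimal sketch of one common instantiation: a shared encoder trained adversarially through a gradient-reversal layer (DANN-style). This is not the specific method of Cohen et al. (2018) or Tran et al. (2019), and all layer sizes are hypothetical:

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip (and scale) the gradient so the encoder learns features
        # that the domain classifier cannot tell apart.
        return -ctx.lam * grad_output, None

encoder = nn.Sequential(nn.Linear(128, 64), nn.ReLU())  # shared across domains
rank_head = nn.Linear(64, 1)                            # relevance score (main task)
domain_head = nn.Linear(64, 2)                          # source vs. target domain

x = torch.randn(32, 128)                                # query-document feature vectors
h = encoder(x)
relevance = rank_head(h)                                # supervised on the source domain
domain_logits = domain_head(GradReverse.apply(h, 1.0))  # adversarial domain objective
```

The reversed gradient pushes the shared encoder toward representations the domain classifier cannot separate, so the ranking head operates on domain-invariant features.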
“…Shen et al. [27] clustered queries based on the frequent n-grams of results retrieved with a baseline ranker, and then used the query-cluster information as an auxiliary objective in multi-task learning. There has also been research on how to effectively combine sparse and dense features in a unified model [22], how to combine textual information from queries and documents with other side information for effective and efficient learning to rank [24], and how to transfer models learned in one domain to another in email search [30].…”
Section: Related Work
confidence: 99%
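As a rough illustration of the multi-task setup quoted above, the sketch below trains a ranking head and an auxiliary query-cluster head on a shared encoder. The cluster ids, dimensions, and loss weight are placeholders, not the configuration used by Shen et al. [27]:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

shared = nn.Sequential(nn.Linear(300, 128), nn.ReLU())  # shared query encoder
rank_head = nn.Linear(128, 1)      # main task: relevance scoring
cluster_head = nn.Linear(128, 50)  # auxiliary task: predict the query's cluster

q = torch.randn(16, 300)                # batch of query representations
labels = torch.rand(16, 1)              # relevance labels (placeholder)
clusters = torch.randint(0, 50, (16,))  # n-gram-based cluster ids (placeholder)

h = shared(q)
loss = F.mse_loss(rank_head(h), labels) \
       + 0.1 * F.cross_entropy(cluster_head(h), clusters)  # weighted auxiliary loss
loss.backward()  # both tasks send gradients into the shared encoder
```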
“…Recently, deep neural networks (DNNs) have shown great success in learning-to-rank tasks. They significantly improve the performance of search engines in the presence of large-scale query logs, in both web search [19] and email settings [39,45,51]. The advantages of DNNs over traditional models are mainly twofold: (1) DNNs can learn embedded representations from sparse features, including words [33] and characters [6].…”
Section: Introduction
confidence: 99%
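The embedding of sparse word and character features that this excerpt describes can be sketched as follows; the vocabulary sizes, dimensions, and mean-pooling of character vectors are illustrative assumptions rather than the architecture of any cited model:

```python
import torch
import torch.nn as nn

word_emb = nn.Embedding(num_embeddings=50_000, embedding_dim=64)  # word vocabulary
char_emb = nn.Embedding(num_embeddings=128, embedding_dim=16)     # character vocabulary

word_ids = torch.randint(0, 50_000, (8, 20))   # batch of 20-token queries
char_ids = torch.randint(0, 128, (8, 20, 10))  # up to 10 characters per token

w = word_emb(word_ids)                  # (8, 20, 64) dense word vectors
c = char_emb(char_ids).mean(dim=2)      # (8, 20, 16) mean-pooled character vectors
token_repr = torch.cat([w, c], dim=-1)  # (8, 20, 80) learned token representation
```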
“…However, there have been few efforts that study how to effectively exploit both dense and sparse features in the learning-to-rank setting, probably because a natural approach exists: simply concatenating dense features with embedded sparse features and feeding them into the DNNs. Indeed, many previous deep neural email search models use direct concatenation of dense features with embedded sparse features [15,38,39,45].…”
Section: Introduction
confidence: 99%
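The "direct concatenation" approach this excerpt refers to can be sketched in a few lines: embed the sparse features, concatenate them with the dense features, and feed the result through a feed-forward scorer. All feature names and sizes here are hypothetical:

```python
import torch
import torch.nn as nn

sparse_emb = nn.EmbeddingBag(num_embeddings=10_000, embedding_dim=32)  # e.g. query terms
scorer = nn.Sequential(nn.Linear(32 + 8, 64), nn.ReLU(), nn.Linear(64, 1))

sparse_ids = torch.randint(0, 10_000, (4, 6))  # 6 sparse feature ids per example
dense = torch.randn(4, 8)                      # 8 dense features (e.g. recency, clicks)

x = torch.cat([sparse_emb(sparse_ids), dense], dim=-1)  # direct concatenation
score = scorer(x)                                       # one relevance score per example
```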