Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers) 2018
DOI: 10.18653/v1/p18-2034

Transfer Learning for Context-Aware Question Matching in Information-seeking Conversations in E-commerce

Abstract: Building multi-turn information-seeking conversation systems is an important and challenging research topic. Although several advanced neural text matching models have been proposed for this task, they are generally not efficient for industrial applications. Furthermore, they rely on a large amount of labeled data, which may not be available in real-world applications. To alleviate these problems, we study transfer learning for multi-turn information-seeking conversations in this paper. We first propose an eff…

Cited by 13 publications (10 citation statements) | References 24 publications
“…T1, T2 and T3 outperform the baseline, reinforcing the effectiveness of adversarial domain adaptation in all tasks in Table 1. T3 outperforms T2, indicating that learning a combination of domain-specific and shared representations is beneficial for all domain transfer experiments in Table 1. This observation was also noted by Qiu et al. (2018), albeit without the use of gradient reversal.…”
Section: Results (supporting)
confidence: 78%
“…To address this issue, Qiu et al. (2018) used both shared domain-invariant and domain-specific features: the shared features are learned by maximizing the domain discriminator loss, while the domain-specific features are learned by jointly minimizing the task loss and the domain classification loss via domain-specific discriminators. Similar ideas were put forth by Peng et al. (2018) for cross-domain sentiment classification, where they demonstrate the effectiveness of using both domain-specific and domain-invariant features.…”
Section: Introduction (mentioning)
confidence: 99%
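The shared/private scheme described in this statement can be sketched in a few lines of PyTorch. Everything below (class names, layer sizes, and the use of a gradient-reversal layer as the mechanism for maximizing the domain discriminator loss) is an illustrative assumption, not the cited authors' implementation:

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips (and scales) gradients in backward."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reversed gradient for x; no gradient for the scalar lam.
        return -ctx.lam * grad_output, None

class SharedPrivateModel(nn.Module):
    """Hypothetical shared + domain-specific feature model."""

    def __init__(self, in_dim=300, hid=128, n_domains=2, n_classes=2):
        super().__init__()
        self.shared = nn.Linear(in_dim, hid)    # domain-invariant features
        self.private = nn.Linear(in_dim, hid)   # domain-specific features
        self.task_head = nn.Linear(2 * hid, n_classes)
        self.domain_head = nn.Linear(hid, n_domains)

    def forward(self, x, lam=1.0):
        s = torch.relu(self.shared(x))
        p = torch.relu(self.private(x))
        # The task head sees both feature sets, as in the shared/private setup.
        task_logits = self.task_head(torch.cat([s, p], dim=-1))
        # Gradient reversal makes minimizing the domain loss here equivalent
        # to maximizing it w.r.t. the shared encoder, pushing the shared
        # features toward domain indistinguishability.
        domain_logits = self.domain_head(GradReverse.apply(s, lam))
        return task_logits, domain_logits
```

In training, one would sum the task cross-entropy and the domain cross-entropy and call `backward()` once; the reversal layer routes the adversarial signal to the shared encoder only.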
“…Note that some of the works discussed here transfer knowledge from external datasets into the QA task they address (Chung et al., 2017; Pan et al., 2019; Min et al., 2017; Qiu et al., 2018; Chen et al., 2017). In this work, we focus solely on the resources provided in the task itself, because such compatible external resources may not be available in real-world applications of QA.…”
Section: Related Work (mentioning)
confidence: 99%
“…Following Qiu et al. [4], we evaluate model performance on five automatic evaluation metrics: MRR, R10@1, R10@2, R10@5, and R2@1. Rn@k calculates the recall of the true positive predefined question among the k candidates selected from n available candidates.…”
Section: Discussion (mentioning)
confidence: 99%
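The metrics named in this statement are straightforward to state in code. A minimal sketch (function names are illustrative; each query is assumed to have exactly one true positive among its n ranked candidates):

```python
def recall_at_k(ranked_labels, k):
    """Rn@k: fraction of queries whose true positive is among the top k.

    ranked_labels: list of per-query label lists, ordered by model score
    (descending); 1 marks the true positive, 0 a negative candidate.
    """
    hits = sum(1 for labels in ranked_labels if 1 in labels[:k])
    return hits / len(ranked_labels)

def mean_reciprocal_rank(ranked_labels):
    """MRR: average of 1 / rank of the true positive (1-based rank)."""
    total = 0.0
    for labels in ranked_labels:
        rank = labels.index(1) + 1
        total += 1.0 / rank
    return total / len(ranked_labels)

# Example: two queries, 10 candidates each (so n = 10).
ranked = [
    [0, 1] + [0] * 8,  # true positive at rank 2
    [1] + [0] * 9,     # true positive at rank 1
]
print(recall_at_k(ranked, 1))        # 0.5
print(mean_reciprocal_rank(ranked))  # (1/2 + 1/1) / 2 = 0.75
```

R2@1 is the same computation with only two candidates per query, which makes it the strictest of the recall metrics reported.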