2019
DOI: 10.48550/arxiv.1905.02851
Preprint

FAQ Retrieval using Query-Question Similarity and BERT-Based Query-Answer Relevance

Cited by 2 publications (1 citation statement)
References 8 publications
“…Transfer learning via large pre-trained Transformers [46], the prominent case being BERT [7], has led to remarkable empirical successes on a range of NLP problems. The BERT approach to learning textual representations has also significantly improved the performance of neural models on several IR tasks [55,54,37,33,57], which had long struggled to outperform classic IR models [53]. In this work we use the no-CL BERT as a strong baseline for the conversation response ranking task.…”
Section: Related Work
confidence: 99%
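The excerpt above concerns using pre-trained BERT to score relevance between a query and a candidate response, in the spirit of this paper's BERT-based query-answer relevance component. Below is a minimal sketch of that idea using the HuggingFace transformers library, assuming a BERT cross-encoder whose classification head scores a (query, answer) pair. The model name, sequence length, and the relevance_score helper are illustrative assumptions, not the paper's exact setup; the classification head loaded here is untrained and would need fine-tuning on labeled query-answer pairs before its scores are meaningful.

```python
# Sketch: BERT cross-encoder for query-answer relevance scoring.
# Assumptions: bert-base-uncased, a binary (relevant / not relevant) head,
# and a fine-tuning step that is omitted here.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # head is randomly initialized until fine-tuned
)
model.eval()

def relevance_score(query: str, answer: str) -> float:
    """Return P(relevant) for a (query, answer) pair via BERT's sentence-pair input."""
    inputs = tokenizer(query, answer, return_tensors="pt",
                       truncation=True, max_length=256)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Usage: rank candidate FAQ answers for a user query by relevance score.
query = "How do I reset my password?"
candidates = [
    "To reset your password, open Settings and choose 'Forgot password'.",
    "Our office hours are Monday to Friday, 9am to 5pm.",
]
ranked = sorted(candidates, key=lambda a: relevance_score(query, a), reverse=True)
print(ranked[0])
```

The design choice here is a cross-encoder: query and answer are concatenated into one BERT input so attention can compare them token by token, which is the pattern the quoted related-work passage refers to when BERT is used for response ranking.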