2023
DOI: 10.1016/j.engappai.2023.106137

Improving question answering performance using knowledge distillation and active learning

Cited by 6 publications (1 citation statement)
References 18 publications
“…It has been demonstrated to be effective for tasks such as object and speech recognition. In one study, several pre-trained models were used as initial teacher models, and their knowledge was then distilled into a number of student QA models, improving QA performance [129]. Another study employed supervised transfer learning to improve the accuracy of item details, achieving an improvement of roughly 10% [130].…”
Section: Critical Analysis
Citation type: mentioning (confidence: 99%)
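The citation above describes multi-teacher knowledge distillation for QA. As a minimal sketch of that general technique (not the cited paper's exact method), the PyTorch snippet below averages the logits of several pre-trained teachers and trains a student against the standard soft-target/hard-label distillation loss; the teacher_models, inputs, and answer-label shapes are illustrative assumptions.

import torch
import torch.nn.functional as F

def ensemble_teacher_logits(teacher_models, inputs):
    # Average logits over several pre-trained teachers (hypothetical models),
    # mirroring the multi-teacher setup the citation mentions.
    with torch.no_grad():
        return torch.stack([t(inputs) for t in teacher_models]).mean(dim=0)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions,
    # rescaled by T^2 as in Hinton et al. (2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

In a span-extraction QA setting, student_logits and labels would typically be the start- (or end-) position logits and gold indices; alpha balances how much the student follows the teachers versus the ground truth.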