2021
DOI: 10.15622/ia.2021.3.5

Efficient natural language classification algorithm for detecting duplicate unsupervised features

Abstract: This paper focuses on capturing the meaning of Natural Language Understanding (NLU) text features to detect duplicate unsupervised features. The NLU features are compared with lexical approaches to determine the suitable classification technique. A transfer-learning approach is utilized to train the feature extraction on the Semantic Textual Similarity (STS) task. All features are evaluated on two types of datasets: Bosch bug reports and Wikipedia articles. This study aims to structure the…
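The abstract contrasts NLU-based features with lexical approaches for flagging duplicates, but the excerpt does not spell out the pipeline. As a rough, hedged illustration of the lexical side of that comparison only, the sketch below flags candidate duplicate report texts whose TF-IDF cosine similarity exceeds a threshold; the function name, threshold, vectorizer settings, and toy reports are all assumptions, not the paper's method.

```python
# Minimal lexical-baseline sketch (illustrative, not the paper's pipeline):
# flag pairs of report texts whose TF-IDF cosine similarity exceeds a threshold.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def find_duplicate_candidates(texts, threshold=0.8):
    """Return (i, j, similarity) for text pairs whose lexical similarity >= threshold."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(texts)
    sims = cosine_similarity(tfidf)
    pairs = []
    for i in range(len(texts)):
        for j in range(i + 1, len(texts)):
            if sims[i, j] >= threshold:
                pairs.append((i, j, float(sims[i, j])))
    return pairs


# Toy bug-report texts (hypothetical examples).
reports = [
    "Sensor driver crashes when the bus is reset",
    "Driver for the sensor crashes after a bus reset",
    "UI freezes while loading the settings page",
]
print(find_duplicate_candidates(reports, threshold=0.3))
```

An NLU-based variant would replace the TF-IDF vectors with sentence embeddings fine-tuned on an STS objective, keeping the same thresholded-similarity decision.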

Cited by 2 publications (1 citation statement). References 12 publications.
“…For the first time, shallow and deep fusion methods are proposed to integrate external recurrent neural network-based language models into the encoder-decoder framework. In this approach, shallow fusion linearly combines translation probabilities with language-model probabilities, while deep fusion connects recurrent neural network-based language models to the decoder, forming a new tightly coupled network [12]. Although high-quality, domain-specific translations are crucial in the real world, domain-specific corpora are often scarce or non-existent, so neural machine translation models perform poorly in such cases.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
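The citing statement describes shallow fusion as a linear (in practice, log-linear) combination of translation-model and language-model scores. A minimal sketch of that combination is shown below, assuming per-token log-probabilities from each model and an interpolation weight λ (here `lm_weight`) that the excerpt does not specify.

```python
import math


def shallow_fusion_score(tm_log_probs, lm_log_probs, lm_weight=0.3):
    """Combine translation-model and language-model log-probabilities per
    candidate token: log p(y) = log p_tm(y|x) + lambda * log p_lm(y|history).
    Weight and fallback value are illustrative assumptions."""
    return {
        tok: tm_log_probs[tok] + lm_weight * lm_log_probs.get(tok, -1e9)
        for tok in tm_log_probs
    }


# Toy next-token candidates with log-probabilities from each model.
tm = {"house": math.log(0.6), "home": math.log(0.3), "hose": math.log(0.1)}
lm = {"house": math.log(0.2), "home": math.log(0.7), "hose": math.log(0.1)}

fused = shallow_fusion_score(tm, lm, lm_weight=0.5)
best = max(fused, key=fused.get)
print(best, fused[best])
```

Deep fusion, by contrast, would concatenate the language model's hidden state with the decoder state inside the network rather than mixing scores at decoding time.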