2020 IEEE 28th International Requirements Engineering Conference (RE)
DOI: 10.1109/re48521.2020.00028
NoRBERT: Transfer Learning for Requirements Classification

Abstract: Classifying requirements is crucial for automatically handling natural language requirements. The performance of existing automatic classification approaches diminishes when applied to unseen projects because requirements usually vary in wording and style. The main problem is poor generalization. We propose NoRBERT that fine-tunes BERT, a language model that has proven useful for transfer learning. We apply our approach to different tasks in the domain of requirements classification. We achieve similar or bett…

Cited by 90 publications (104 citation statements)
References 32 publications
“…Despite the abundance of such studies involving SRS documents, which showcase the better performance of deep learning architectures over the traditional classifiers, only a few studies consider transfer learning-based approaches in the software requirement domain. Recently, Hey, Keim, Koziolek, and Tichy [25] studied the problem of classifying functional and nonfunctional SRS documents using transfer learning-based approaches such as NoRBERT. They concluded that NoRBERT improves the prediction performance considerably.…”
Section: Literature Review
confidence: 99%
“…In [20], the authors experimented with different classification tasks: classifying SRs into FRs or NFRs, classifying NFRs into different categories (usability, security, operational, and performance), and classifying FRs into the categories of functions, data, and behavior. The model was a fine-tuned BERT model named "NoRBERT".…”
Section: Complete System (Classifying NFRs and FRs into Multi-Classes)
confidence: 99%
“…Finally, padding was used to unify the length of the sentences, since they had different numbers of words. This was carried out by finding the maximum sentence length and then appending zeros to the end of the token sequence of any sentence shorter than that maximum [20]. Algorithm 1 summarizes the preprocessing and Figure 3 gives an example of preprocessing of one SR from the dataset.…”
Section: Data Preprocessing
confidence: 99%
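The zero-padding step quoted above can be sketched in a few lines. This is a minimal, illustrative implementation, not the authors' code: the function name `pad_sequences` and the example token IDs are assumptions, and real pipelines typically delegate this to the tokenizer of a library such as Hugging Face Transformers.

```python
def pad_sequences(sequences, pad_value=0):
    """Pad every token-ID sequence with pad_value so that all
    sequences reach the length of the longest one, as described
    in the preprocessing step above."""
    max_len = max(len(seq) for seq in sequences)
    return [seq + [pad_value] * (max_len - len(seq)) for seq in sequences]

# Hypothetical tokenized requirements of different lengths
tokenized = [[101, 7592, 102], [101, 7592, 2088, 999, 102], [101, 102]]
padded = pad_sequences(tokenized)
# Every row now has the length of the longest sequence (5 here),
# with zeros appended to the shorter ones.
```

Padding with zeros is conventional for BERT-style models because token ID 0 is typically the reserved `[PAD]` token, which the attention mask then tells the model to ignore.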