2019 1st International Conference on Advances in Science, Engineering and Robotics Technology (ICASERT)
DOI: 10.1109/icasert.2019.8934499
Non-Functional Requirements Classification with Feature Extraction and Machine Learning: An Empirical Study

Cited by 32 publications (15 citation statements)
References 8 publications
“…In [17], the PROMISE software engineering repository dataset was used to compare the performance of seven ML algorithms (multinomial naïve Bayes (MNB), Gaussian naïve Bayes (GNB), Bernoulli naïve Bayes (BNB), K-nearest neighbor (KNN), support vector machine (SVM), stochastic gradient descent SVM (SGD SVM), and decision tree (Dtree)), along with different feature extraction techniques. NFRs were labelled into eleven categories: availability, legal, look and feel, maintainability, operational, performance, scalability, security, usability, fault tolerance, and portability.…”
Section: Classifying NFRs into Multi-Classes (mentioning)
confidence: 99%
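The setup described in that statement, TF-IDF-style feature extraction fed into several classical classifiers, can be sketched with scikit-learn as below. This is a minimal illustration, not the cited study's code: the inline requirement sentences and their category labels are invented stand-ins for the PROMISE data, and GaussianNB is omitted because it needs dense feature arrays.

# Hedged sketch of multi-class NFR classification: TF-IDF features plus
# several of the classifiers named above (scikit-learn). The data below is
# a made-up stand-in for the PROMISE corpus used in the cited study.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB, BernoulliNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import LinearSVC
from sklearn.linear_model import SGDClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical requirement sentences and category labels (illustrative only).
requirements = [
    "The system shall be available 99.9% of the time.",
    "All stored passwords must be encrypted.",
    "The interface shall follow the corporate colour scheme.",
    "Any search shall return results within 2 seconds.",
    "The product shall run on Windows, Linux and macOS.",
    "User data handling must comply with GDPR.",
]
labels = ["availability", "security", "look and feel",
          "performance", "portability", "legal"]

classifiers = {
    "MNB": MultinomialNB(),
    "BNB": BernoulliNB(),
    "KNN": KNeighborsClassifier(n_neighbors=3),
    "SVM": LinearSVC(),
    "SGD SVM": SGDClassifier(loss="hinge"),
    "Dtree": DecisionTreeClassifier(),
}

# GaussianNB (also compared in the study) is skipped here because it
# requires dense arrays rather than the sparse TF-IDF matrix.
for name, clf in classifiers.items():
    model = make_pipeline(TfidfVectorizer(stop_words="english"), clf)
    model.fit(requirements, labels)
    prediction = model.predict(["Response time shall not exceed one second."])
    print(f"{name}: predicted category = {prediction[0]}")

With a realistic corpus, the same pipeline objects would simply be fitted on the labelled training split and evaluated on a held-out split instead of a single example sentence.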
“…NFRs were then classified using a smaller number of classes, as in [20], achieving a 0.91 F1 score; they were also classified as security-related or non-security-related, with a 0.77 F1 score. In [17], the authors classified NFRs into 11 classes, achieving 76% accuracy. Only [20] provided a complete classification system.…”
Section: Comparative Analysis (mentioning)
confidence: 99%
“…POS is used for tagging, and TF-IDF is used for extracting requirement features. Haque et al. [9] combine feature extraction with machine learning to classify non-functional requirements (NFRs). Testing was carried out using seven machine learning algorithms with four feature extraction approaches to find the best pair for feature extraction and classification.…”
Section: Previous Researches (mentioning)
confidence: 99%
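As a rough illustration of the POS-tagging-plus-TF-IDF idea mentioned in that statement, the sketch below uses NLTK for part-of-speech tags and scikit-learn for TF-IDF weights. It is an assumption-laden example rather than code from [9] or from Haque et al.: the sentences are hypothetical, and the NLTK resource names to download vary across NLTK releases.

# Hedged sketch: POS tagging requirement text with NLTK and weighting terms
# with TF-IDF via scikit-learn. The example sentences are hypothetical.
import nltk
from sklearn.feature_extraction.text import TfidfVectorizer

# Resource names differ across NLTK versions; downloading both spellings
# quietly covers older and newer releases.
for resource in ("punkt", "punkt_tab",
                 "averaged_perceptron_tagger", "averaged_perceptron_tagger_eng"):
    nltk.download(resource, quiet=True)

requirements = [
    "The system shall encrypt all user credentials at rest.",
    "Search results shall be returned within two seconds.",
]

# POS tags help isolate the nouns and verbs that carry requirement semantics.
for sentence in requirements:
    tokens = nltk.word_tokenize(sentence)
    print(nltk.pos_tag(tokens))

# TF-IDF turns the same sentences into weighted term vectors that can serve
# as classifier features.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf_matrix = vectorizer.fit_transform(requirements)
for term, column in sorted(vectorizer.vocabulary_.items()):
    print(f"{term}: idf = {vectorizer.idf_[column]:.2f}")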
“…To evaluate a machine learning model's performance, one can use the accuracy, precision, recall, and F1-score metrics, all of which are computed from the numbers of true and false positives and negatives produced at the end of any training method (Haque, Rahman and Siddik, 2019). Other metrics, such as negative predictive value and true negative rate, can also be analyzed.…”
Section: Artificial Neural Network Design (mentioning)
confidence: 99%
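To make the relationship between those counts and the named metrics concrete, here is a small sketch using scikit-learn; the ground-truth and prediction vectors are invented for illustration, and the negative predictive value and true negative rate are derived directly from the same confusion-matrix counts.

# Hedged sketch: computing the metrics named in the statement above from a
# binary confusion matrix. The y_true / y_pred vectors are made-up examples.
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, confusion_matrix)

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]   # hypothetical ground truth
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]   # hypothetical model output

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

print("accuracy :", accuracy_score(y_true, y_pred))    # (TP+TN)/(TP+TN+FP+FN)
print("precision:", precision_score(y_true, y_pred))   # TP/(TP+FP)
print("recall   :", recall_score(y_true, y_pred))       # TP/(TP+FN)
print("F1 score :", f1_score(y_true, y_pred))           # harmonic mean of precision and recall

# The additional metrics mentioned in the statement, from the same counts:
print("negative predictive value:", tn / (tn + fn))     # TN/(TN+FN)
print("true negative rate       :", tn / (tn + fp))     # TN/(TN+FP)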