2023
DOI: 10.1115/1.4063764

Deep Neural Networks in Natural Language Processing for Classifying Requirements by Origin and Functionality: An Application of BERT in System Requirements

Jesse Mullis,
Cheng Chen,
Beshoy Morkos
et al.

Abstract: Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art natural language processing (NLP) deep neural network model, to automatically analyze written requirements. Yet, it remains unclear whether BERT ca…
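The truncated abstract describes applying BERT to classify written system requirements. As a rough illustration only, and not the authors' implementation, the sketch below shows how a BERT sequence-classification model from the Hugging Face transformers library might be pointed at requirement sentences. The model name, label set, and example requirements are assumptions, and the classification head here is untrained, so it would need fine-tuning on labeled requirements before its predictions mean anything.

# Minimal illustrative sketch: BERT sequence classification of requirement text.
# The model name, label set, and example requirements are assumptions for
# illustration; this is not the paper's implementation, and the classification
# head is untrained, so it must be fine-tuned on labeled requirements first.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["functional", "non-functional"]  # hypothetical label set

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS)
)
model.eval()

requirements = [
    "The system shall log every user action within 100 ms.",
    "The interface should be easy for first-time users to learn.",
]

# Tokenize the requirement sentences and run a single forward pass.
inputs = tokenizer(requirements, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the predicted label for each requirement sentence.
for text, idx in zip(requirements, logits.argmax(dim=-1).tolist()):
    print(f"{LABELS[idx]}: {text}")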

Cited by 4 publications
References 32 publications