2021
DOI: 10.1007/s42979-020-00427-1
Sentence Embedding Models for Similarity Detection of Software Requirements

Abstract: Semantic similarity detection mainly relies on the availability of laboriously curated ontologies and of supervised and unsupervised neural embedding models. In this paper, we present two domain-specific sentence embedding models trained on a natural language requirements dataset in order to derive sentence embeddings specific to the software requirements engineering domain. We use cosine-similarity measures in both models. The results of the experimental evaluation confirm that the proposed model…
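The cosine-similarity measure the abstract refers to can be sketched as follows. This is a minimal illustration with toy stand-in vectors; the paper's actual models produce high-dimensional sentence embeddings, which this sketch does not reproduce.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-in embeddings for two requirement sentences
# (real sentence embeddings are much higher-dimensional).
r1 = [0.2, 0.8, 0.1]
r2 = [0.25, 0.75, 0.05]
score = cosine_similarity(r1, r2)  # close to 1.0 for near-paraphrases
```

Because cosine similarity depends only on vector direction, it is insensitive to the overall magnitude of the embeddings, which is why it is a common choice for comparing sentence vectors.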

Cited by 19 publications (16 citation statements)
References 26 publications
“…• Requirements Traceability: Requirements traceability is used to capture the relationships between the requirements, the design, and the implementation of a system [55]. We recognized 21 papers related to this task, some of which focused on detecting the relationships between requirements (inter-dependencies between requirements) [56]-[66], while the rest focused on detecting the relationships between requirements and other artifacts (design documents and source code) [55], [67]-[74]. • Requirements Clustering: Requirements clustering is used to organize software requirements into a set of clusters with high cohesion and low coupling [75].…”
Section: ) Requirements Analysismentioning
confidence: 99%
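The inter-dependency detection described in this citation statement amounts to flagging requirement pairs whose similarity exceeds a threshold as candidate trace links. The sketch below uses simple term-frequency vectors as a stand-in for the neural sentence embeddings, and the threshold value is a hypothetical choice for illustration only.

```python
import math
from collections import Counter

def counter_cosine(c1, c2):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(c1[w] * c2[w] for w in c1)
    n1 = math.sqrt(sum(v * v for v in c1.values()))
    n2 = math.sqrt(sum(v * v for v in c2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def candidate_links(requirements, threshold=0.5):
    """Propose requirement pairs whose similarity exceeds the
    threshold as candidate inter-dependency (trace) links."""
    vecs = [Counter(r.lower().split()) for r in requirements]
    links = []
    for i in range(len(vecs)):
        for j in range(i + 1, len(vecs)):
            if counter_cosine(vecs[i], vecs[j]) >= threshold:
                links.append((i, j))
    return links

reqs = [
    "the system shall log user actions",
    "the system shall log admin actions",
    "reports are exported weekly",
]
links = candidate_links(reqs)  # [(0, 1)]
```

Swapping the term-frequency vectors for embeddings from a trained sentence model (as in the paper under discussion) leaves the thresholding logic unchanged; only the vectorization step differs.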
“…This part represents the second largest group of papers (24 papers), with a major increase in the last couple of years. It is noted that papers using this kind of representation achieve promising results in many major tasks such as requirement classification [11], [50], [51], traceability [56], [58], [59], [64], ambiguity detection [106], and requirement extraction [91], [98].…”
Section: ) Advanced Embedding Representationsmentioning
confidence: 99%
“…BERT is an open-source machine learning framework for natural language processing (NLP) designed to help computers understand the meaning of language in text [16]. BERT (Bidirectional Encoder Representations from Transformers) uses a deep learning model, as shown in Figure 6 [17].…”
Section: Bidirectional Encoder Representation From Transformer (Bert) -Machine Learningmentioning
confidence: 99%
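BERT produces one vector per token; a common way to derive a single sentence embedding from those outputs is mean pooling, averaging the token vectors component-wise. The sketch below illustrates the pooling step with toy numbers; it is not necessarily the configuration used in the cited work, and real BERT token vectors are 768-dimensional.

```python
def mean_pool(token_embeddings):
    """Average per-token vectors into one sentence vector,
    a common way to derive sentence embeddings from BERT outputs."""
    dim = len(token_embeddings[0])
    n = len(token_embeddings)
    return [sum(tok[d] for tok in token_embeddings) / n for d in range(dim)]

# Toy 2-dimensional stand-in token vectors for a 3-token sentence.
tokens = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sentence_vec = mean_pool(tokens)  # [3.0, 4.0]
```

The pooled vector can then be compared to other sentence vectors with cosine similarity, as in the paper's requirement-similarity setting.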