2012
DOI: 10.1016/j.jss.2012.01.017
Thresholds for error probability measures of business process models

Abstract: The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs thanks to early error detection. This is just as true from a software engineering point of view. In this latter case, models facilitate stakeholder communication and software system design. Research has investigated several proposals as regards measures for busi…

Cited by 71 publications (58 citation statements)
References 51 publications
“…First, we selected appropriate complexity metrics based on the set of metrics defined by Mendling (2008). The most important classes of metrics apply to the size of the models (most notably, arcs and nodes), and the connections within the models (most notably average connector degree) (Mendling et al, 2012a). Next, we defined three levels of complexity, viz., low, average and high.…”
Section: Materials and Procedures
confidence: 99%
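The size and connector-degree metrics mentioned in the excerpt can be computed directly from a process model's graph representation. A minimal sketch follows; the metric names mirror Mendling's definitions, but the graph encoding and function names are illustrative assumptions:

```python
# Compute basic complexity metrics of a process model graph.
# The model is encoded as nodes plus directed arcs; "connectors"
# (split/join gateways) are given explicitly.

def model_metrics(nodes, arcs, connectors):
    """Size and connection metrics in the spirit of Mendling's metric suite.

    nodes:      iterable of node identifiers
    arcs:       list of (source, target) pairs
    connectors: subset of nodes that are split/join connectors
    """
    degree = {n: 0 for n in nodes}
    for src, tgt in arcs:
        degree[src] += 1
        degree[tgt] += 1
    conn_degrees = [degree[c] for c in connectors]
    # Average connector degree: mean number of arcs attached to a connector.
    avg_conn_degree = sum(conn_degrees) / len(conn_degrees) if conn_degrees else 0.0
    return {"nodes": len(degree), "arcs": len(arcs),
            "avg_connector_degree": avg_conn_degree}

# Tiny example: start -> XOR-split -> (a | b) -> XOR-join -> end
nodes = ["start", "s1", "a", "b", "j1", "end"]
arcs = [("start", "s1"), ("s1", "a"), ("s1", "b"),
        ("a", "j1"), ("b", "j1"), ("j1", "end")]
print(model_metrics(nodes, arcs, connectors=["s1", "j1"]))
# {'nodes': 6, 'arcs': 6, 'avg_connector_degree': 3.0}
```

Classifying a model as low, average, or high complexity then reduces to binning these metric values against chosen cut-offs.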
“…understandability) of the process models. For example, several authors agreed that an increase in size of a model appears to have a negative impact on its pragmatic quality [6, 39–43]. Some considerations are available on when a process model would have to be split up into subprocesses to decrease its size.…”
Section: Results of the Data Extraction
confidence: 99%
“…It has been recommended based on empirical findings that process models with more than 50 elements should be decomposed [6]. Another study proposes to decompose the model once it has more than 31 elements [43] based on a threshold definition. Depending on the process modeling language the amount of activities can vary for the same amount of elements [44].…”
Section: Results of the Data Extraction
confidence: 99%
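The decomposition guideline in the excerpt amounts to a simple size check against a cut-off. A trivial sketch, with the threshold as a parameter since the cited studies disagree (50 elements in [6] versus 31 in [43]):

```python
def should_decompose(num_elements, threshold=31):
    """Flag a process model for decomposition once it exceeds a size threshold.

    The cut-off is study-dependent: [6] recommends 50 elements,
    while [43] derives 31 from a threshold definition.
    """
    return num_elements > threshold

print(should_decompose(40, threshold=31))  # True
print(should_decompose(40, threshold=50))  # False
```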
“…The work reported in [42] uses logistic regression and error probability as a dependent variable. Logistic regression is a statistical model for estimating the probability of binary choices (error or no error in this case) [43].…”
Section: Structural Factors of Process Model Understanding
confidence: 99%
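Logistic regression as described here models the probability of a binary outcome (error vs. no error) as a function of a model metric. A minimal self-contained sketch using gradient descent on one feature; the data and learning setup are illustrative, not taken from [42]:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit P(error = 1 | x) = sigmoid(w*x + b) by stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            # Gradient of the log-loss for a single observation.
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

# Illustrative data: model size (scaled) vs. observed error (1) / no error (0).
sizes = [0.1, 0.2, 0.3, 0.5, 0.7, 0.9]
errors = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(sizes, errors)
# Estimated error probability grows with model size:
assert sigmoid(w * 0.1 + b) < 0.5 < sigmoid(w * 0.9 + b)
```

The fitted curve gives, for any metric value, an estimated error probability between 0 and 1, which is exactly the dependent variable used in [42].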
“…The best threshold can then be found based on sensitivity and specificity values, with sensitivity = true positive (TP) rate = TP/P and specificity = 1 − false positive (FP) rate = 1 − FP/P. Using this approach, several guidelines of the 7PMG could be refined in [42]. Table 1 provides an overview of the results showing, among others, that process models with more than 30 nodes should be decomposed.…”
Section: Structural Factors of Process Model Understanding
confidence: 99%
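The threshold-selection procedure in the excerpt can be sketched as a sweep over candidate cut-offs, scoring each by how well it trades off sensitivity against specificity (here via Youden's J = sensitivity + specificity − 1). The data below is illustrative, not the study's:

```python
def best_threshold(values, labels):
    """Pick the metric cut-off that best separates error-prone models.

    values: metric values (e.g. number of nodes) per model
    labels: 1 if the model contains an error, 0 otherwise
    Returns the candidate threshold maximizing Youden's J statistic.
    """
    positives = sum(labels)
    negatives = len(labels) - positives
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        # Predict "error" for every model whose metric exceeds t.
        tp = sum(1 for v, y in zip(values, labels) if v > t and y == 1)
        fp = sum(1 for v, y in zip(values, labels) if v > t and y == 0)
        sensitivity = tp / positives
        specificity = 1 - fp / negatives
        j = sensitivity + specificity - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t

# Illustrative: models above ~30 nodes tend to contain errors.
node_counts = [10, 15, 20, 25, 30, 35, 40, 50]
has_error   = [ 0,  0,  0,  0,  0,  1,  1,  1]
print(best_threshold(node_counts, has_error))  # 30
```

With data of this shape, the sweep recovers a cut-off of 30 nodes, matching the flavor of the "more than 30 nodes should be decomposed" guideline quoted above.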