2021
DOI: 10.1016/j.omega.2020.102370
A Stochastic Multi-criteria divisive hierarchical clustering algorithm

Cited by 41 publications (15 citation statements) · References 66 publications (64 reference statements)
“…All parameters of the second group are transformed with one-hot encoding technique (or dummy variables are created) 8,9 .…”
Section: Results (mentioning) · confidence: 99%
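The one-hot transformation the citing paper describes can be sketched in plain Python (the categorical values below are hypothetical; in practice libraries such as pandas `get_dummies` or scikit-learn's `OneHotEncoder` do this):

```python
def one_hot(values):
    """Map each categorical value to a 0/1 indicator vector (dummy variables).

    One indicator column is created per distinct category; exactly one
    entry per row is 1.
    """
    categories = sorted(set(values))
    return [[1 if v == c else 0 for c in categories] for v in values]

# Categories sorted alphabetically: ["green", "red"]
print(one_hot(["red", "green", "red"]))  # [[0, 1], [1, 0], [0, 1]]
```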
“…In expression (1), x is a source value, x' is the transformed version of this parameter, µ is the mean of x, and σ is its standard deviation 8,9 .…”
Section: Results (mentioning) · confidence: 99%
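Expression (1) as quoted is the standard z-score transform, x' = (x − µ)/σ. A minimal sketch with stdlib tools (the sample values are made up; the population standard deviation is assumed here, as the quote does not say which variant is used):

```python
import statistics

def standardize(xs):
    """z-score transform: x' = (x - mu) / sigma for each source value x."""
    mu = statistics.mean(xs)
    sigma = statistics.pstdev(xs)  # population standard deviation
    return [(x - mu) / sigma for x in xs]

# The transformed series has mean 0 and standard deviation 1.
print(standardize([2.0, 4.0, 6.0]))
```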
“…The divisive hierarchical clustering method begins with a single cluster comprising all data items. At each iteration it splits clusters according to a similarity criterion, until every data entity forms its own cluster or a stopping criterion is reached [44,45]. • Expectation–maximization (EM) algorithm: EM is an iterative algorithm for estimating maximum-likelihood values of variables in statistical models with unobserved data. The expectation (E) step computes expected values under the current parameters; the maximization (M) step then searches for new parameters that maximize the likelihood found in the E step.…”
Section: Steps (mentioning) · confidence: 99%
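The top-down splitting loop described in the quote can be sketched as follows. This is an illustrative one-dimensional toy, not the paper's stochastic multi-criteria algorithm: the split rule (partition at the cluster mean) and the stopping criterion (a minimum cluster size) are assumptions chosen for brevity:

```python
def divisive_cluster(points, min_size=1):
    """Top-down (divisive) clustering of 1-D points.

    Start with one cluster holding all points; repeatedly split a cluster
    at its mean until every cluster satisfies the stopping criterion
    (here: at most min_size points, or no further split is possible).
    """
    clusters = [sorted(points)]
    result = []
    while clusters:
        c = clusters.pop()
        if len(c) <= min_size:
            result.append(c)
            continue
        mean = sum(c) / len(c)
        left = [x for x in c if x <= mean]
        right = [x for x in c if x > mean]
        if not left or not right:  # all points identical: cannot split
            result.append(c)
            continue
        clusters.extend([left, right])
    return sorted(result)

print(divisive_cluster([1, 2, 10, 11], min_size=2))  # [[1, 2], [10, 11]]
```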
“…Two types of algorithm are often implemented for building the hierarchy. The divisive approach treats all data as one cluster and performs splits, and it is used in much research [8]. In contrast, agglomerative hierarchical clustering is a bottom-up method with many variants [9].…”
Section: Related Work (mentioning) · confidence: 99%
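For contrast with the divisive (top-down) approach, the bottom-up agglomerative scheme can be sketched on sorted 1-D points. Single linkage on adjacent clusters and a fixed merge threshold are assumptions of this toy, not details from the cited works:

```python
def agglomerative_cluster(points, threshold):
    """Bottom-up (agglomerative) clustering of 1-D points.

    Each point starts in its own cluster; repeatedly merge the two
    adjacent clusters with the smallest gap (single linkage) until the
    smallest remaining gap exceeds the threshold.
    """
    clusters = [[p] for p in sorted(points)]
    while len(clusters) > 1:
        gaps = [clusters[i + 1][0] - clusters[i][-1]
                for i in range(len(clusters) - 1)]
        i = gaps.index(min(gaps))
        if gaps[i] > threshold:
            break
        clusters[i:i + 2] = [clusters[i] + clusters[i + 1]]
    return clusters

print(agglomerative_cluster([1, 2, 10, 11], threshold=3))  # [[1, 2], [10, 11]]
```

Both toys recover the same two clusters on this data; divisive and agglomerative methods differ in the direction the hierarchy is built, not necessarily in the final partition.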