2020
DOI: 10.1007/s11042-020-09471-8
An empirical estimation for time and memory algorithm complexities: newly developed R package

Cited by 14 publications (6 citation statements). References 24 publications.
“…We compared the time and space complexities of the models in case studies with the GuessCompx tool. The GuessCompx tool [56,57] empirically estimates the computational complexity of a function in terms of Big-O notations. It computes multiple samples of increasing sizes from the given dataset and estimates the best-fit complexity according to the "leave-one-out mean squared error (LOO-MSE)" approach.…”
Section: Discussion
Confidence: 99%
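The approach described above — timing a function on subsamples of increasing size, fitting candidate Big-O models, and selecting the best fit by leave-one-out mean squared error — can be sketched as follows. This is a minimal illustration of the general technique, not the GuessCompx R API; the candidate model set and fitting details are assumptions for the example.

```python
import time
import numpy as np

# Illustrative candidate complexity classes, mapping a sample size n
# to a model feature for a linear least-squares fit of runtime.
MODELS = {
    "O(1)": lambda n: np.ones_like(n, dtype=float),
    "O(log n)": lambda n: np.log(n),
    "O(n)": lambda n: n.astype(float),
    "O(n log n)": lambda n: n * np.log(n),
    "O(n^2)": lambda n: n.astype(float) ** 2,
}

def loo_mse(x, y):
    """Leave-one-out MSE of a one-feature least-squares fit y ~ a*x + b."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        A = np.vstack([x[mask], np.ones(mask.sum())]).T
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        pred = coef[0] * x[i] + coef[1]
        errs.append((y[i] - pred) ** 2)
    return float(np.mean(errs))

def guess_complexity(func, data, sizes):
    """Time `func` on subsamples of `data` of increasing size and
    return the candidate Big-O label with the lowest LOO-MSE."""
    ns, times = [], []
    for n in sizes:
        sample = data[:n]
        t0 = time.perf_counter()
        func(sample)
        times.append(time.perf_counter() - t0)
        ns.append(n)
    ns, times = np.array(ns), np.array(times)
    scores = {name: loo_mse(f(ns), times) for name, f in MODELS.items()}
    return min(scores, key=scores.get)
```

For example, `guess_complexity(sorted, list(range(100_000)), [10_000 * i for i in range(1, 11)])` would typically fit close to O(n log n), though timing noise on small inputs can shift the selected class; in practice larger and more numerous subsamples give a more stable estimate.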
“…These algorithms just need mathematical operators in the update equation along with the best and the worst values. The complexity of JA has been examined empirically in terms of big-O notations using the GuessCompx tool [53,54]. The complexity was compared with that of a metaheuristic technique (GA), and both methods showed linear complexity (Paliwal et al. [36]).…”
Section: Discussion
Confidence: 99%
“…It follows that the method proposed in this paper is highly demanding from a computational perspective: for instance, the model used here has a memory complexity of 68,093 Mb, while with 12 indicators it would reach a memory complexity of 359,328 Mb. The complexity of the algorithm is estimated using the GuessCompx R package developed by Agenis-Nevers et al. (2019). To some extent, the growth of complexity may be thought of as the price of having more general conclusions.…”
Section: The Issue Of Relative Weights
confidence: 99%