2016
DOI: 10.1007/978-3-319-44254-9_5

Solution to 0–1 Knapsack Problem Using Cohort Intelligence Algorithm

Cited by 8 publications (6 citation statements). References 17 publications.
Citation statement types: 0 supporting, 6 mentioning, 0 contrasting.
“…For ACO (Ant Colony Optimization), SA (Simulated Annealing), GA (Genetic Algorithm), NGHS, and SSO (Social-Spider Optimization), results are reported from (Zhao & Zhou, 2018). For CI (Cohort Intelligence), results are reported from (Kulkarni et al, 2017). For CWDO (complex-valued encoding wind-driven optimization), the results are taken from (Zhou et al, 2017).…”
Section: Methods (mentioning)
confidence: 99%
“…A greedy-strategy-based self-adaptive ant colony algorithm was introduced for the 0-1 knapsack problem (Du & Zu, 2015). In addition, many algorithms have been proposed for solving the 0-1 KP, such as the cognitive discrete gravitational search algorithm (CDGSA) (Razavi & Sajedi, 2015), wind-driven optimization (WDO) (Zhou et al, 2017), greedy degree and expectation efficiency (Lv et al, 2016), the improved monkey algorithm (IMA) (Zhou et al, 2016a), the monogamous pairs genetic algorithm (MPGA) (Lim et al, 2016), hybrid greedy and particle swarm optimization (GPSO) (Nguyen, Wang & Truong, 2016), the quantum-inspired social evolution (QSE) algorithm (Pavithr, 2016), binary particle swarm optimization based on surrogate information with proportional acceleration coefficients (Lin et al, 2016), the complex-valued encoding bat algorithm (Zhou et al, 2016b), the cohort intelligence (CI) algorithm (Kulkarni et al, 2017), the migrating birds optimization (MBO) algorithm (Ulker & Tongur, 2017), the binary flower pollination algorithm (BFPA) (Abdel-Basset et al, 2018a), the binary bat algorithm (BBA) (Rizk-Allah et al, 2018), the Social-Spider Optimization (SSO) algorithm (Nguyen et al, 2017), binary monarch butterfly optimization (BMBO) (Feng et al, 2016a), the binary dragonfly algorithm (BDA) (Abdel-Basset et al, 2017), the binary fisherman search (BFS) algorithm (Cobos et al, 2016), the elite opposition flower pollination algorithm (EOFPA) (Abdel-Basset et al, 2018b), and opposition-based learning monarch butterfly optimization with Gaussian perturbation (OLMBO) (Feng et al, 2017). Given the importance of the knapsack problem in practical applications, developing new algorithms to solve large-scale knapsack problems remains a genuine challenge.…”
Section: Introduction (mentioning)
confidence: 99%
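The binary metaheuristics surveyed in the statement above share a common evaluation step: a candidate bit-string over the items is scored by total profit and repaired (or penalized) when it exceeds the knapsack capacity. A minimal sketch of that step, assuming an illustrative instance whose weight, profit, and capacity values are invented here and are not taken from the cited benchmarks:

# Common evaluation step used by binary metaheuristics for the 0-1 knapsack problem:
# score a bit-string by total profit and greedily repair it if it exceeds capacity.
# Instance values below are invented for illustration only.

profits = [10, 5, 15, 7, 6, 18, 3]   # hypothetical profit vector
weights = [2, 3, 5, 7, 1, 4, 1]      # hypothetical weight vector
capacity = 15                        # hypothetical knapsack capacity

def repair(bits):
    """Drop selected items with the worst profit/weight ratio until the load is feasible."""
    load = sum(w for w, b in zip(weights, bits) if b)
    for i in sorted((i for i, b in enumerate(bits) if b),
                    key=lambda i: profits[i] / weights[i]):
        if load <= capacity:
            break
        bits[i] = 0
        load -= weights[i]
    return bits

def fitness(bits):
    """Total profit of a (repaired) 0-1 selection."""
    bits = repair(list(bits))
    return sum(p for p, b in zip(profits, bits) if b)

print(fitness([1, 1, 1, 1, 1, 1, 1]))  # the infeasible all-ones string is repaired, then scored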
“…Table 2 presents the dimension, capacity, weight, and profit vectors for the 0-1 knapsack. The low-dimensional instances L1 to L10 were extracted from Burkardt (n.d.) and Martello and Toth (1990), and L11 to L15 from Kulkarni et al (2017).…”
Section: 0-1 Knapsack Dataset (mentioning)
confidence: 99%
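For concreteness, one way such an instance (dimension, capacity, weight vector, profit vector) might be held in code is sketched below; the name and numbers are invented placeholders, not the published L1 to L15 data.

from dataclasses import dataclass

@dataclass
class KnapsackInstance:
    name: str
    capacity: int
    weights: list
    profits: list

    @property
    def dimension(self) -> int:
        return len(self.weights)

# Placeholder instance in the same shape as the benchmark entries (values are invented).
toy = KnapsackInstance(name="toy", capacity=18,
                       weights=[4, 9, 3, 5, 7],
                       profits=[10, 21, 8, 12, 16])
print(toy.dimension, toy.capacity)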
“…2. To evaluate its performance, fifteen 0-1 knapsack dataset instances from Burkardt (n.d.), Martello and Toth (1990), and Kulkarni et al (2017) were used. In addition, the study was evaluated against a real dataset containing 500 ransomware applications and 10,153 benign applications.…”
(mentioning)
confidence: 99%
“…Furthermore, the MCA-CC algorithm outperforms the basic MCA algorithm, finding higher-quality solutions in fewer iterations on most of the high-dimensional datasets. Moreover, the error rate obtained by the proposed algorithm (MCA-CC) was compared with different algorithms (Simulated Annealing (SA), Greedy Search Algorithm (GSA), Dynamic Programming (DP), Branch and Bound (BB) [17], and Cohort Intelligence (CI) [18]) on low-dimensional test datasets. This comparison is illustrated in Table 6.…”
(mentioning)
confidence: 99%
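Error rates of this kind are normally measured against an exact optimum; on low-dimensional instances the classic dynamic-programming table yields that optimum directly. A minimal sketch, assuming invented instance values and an invented heuristic profit:

def knapsack_dp(capacity, weights, profits):
    """Exact 0-1 knapsack optimum via the classic O(n * capacity) dynamic program."""
    best = [0] * (capacity + 1)
    for w, p in zip(weights, profits):
        for c in range(capacity, w - 1, -1):  # descend so each item is packed at most once
            best[c] = max(best[c], best[c - w] + p)
    return best[capacity]

def error_rate(heuristic_profit, optimal_profit):
    """Relative gap between a heuristic profit and the exact optimum."""
    return (optimal_profit - heuristic_profit) / optimal_profit

optimum = knapsack_dp(18, [4, 9, 3, 5, 7], [10, 21, 8, 12, 16])  # invented instance
print(optimum, error_rate(41, optimum))                          # invented heuristic profit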