2020
DOI: 10.3390/electronics9111880

Evolving Deep DenseBlock Architecture Ensembles for Image Classification

Abstract: Automatic deep architecture generation is a challenging task, owing to the large number of controlling parameters inherent in the construction of deep networks. The combination of these parameters leads to the creation of large, complex search spaces that are practically impossible to navigate properly without a huge amount of resources for parallelisation. To deal with such challenges, in this research we propose a Swarm Optimised DenseBlock Architecture Ensemble (SODBAE) method, a joint optimisation and trainin…

Citations: cited by 18 publications (7 citation statements)
References: 30 publications
“…Furthermore, the aforementioned A-BiLSTM architecture implemented in this research was shown to be highly effective, but with further experimentation with different layer and hyperparameter settings [24][25][26][27][28][29][30][31][32], additional improvements in performance could be made. Evolutionary algorithms [33][34][35][36][37][38][39][40][41][42][43][44][45][46][47][48][49] could also be exploited pertaining to the above parameter tuning as well as architecture generation processes. Moreover, it would also be beneficial to employ additional medical audio datasets to further evaluate model efficiency.…”
Section: Discussion
confidence: 99%
“…Fielding and Zhang [21] develop a PSO-based algorithm for image classification to optimize DenseBlock CNN architecture. The proposed method employs adaptive acceleration coefficients generated using cosine annealing learning rate schedule to overcome local optima traps.…”
Section: Control of the Parameters
confidence: 99%
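The excerpt above describes acceleration coefficients that are adapted with a cosine annealing schedule to help the swarm escape local optima. A minimal sketch of how such a schedule could drive a PSO velocity update is given below; the function names, coefficient ranges, and the complementary c1/c2 pairing are illustrative assumptions, not the exact schedule used in the cited work.

```python
import numpy as np

def cosine_annealed(t, t_max, v_min, v_max):
    """Cosine-annealed value: starts at v_max and decays smoothly to v_min over t_max steps."""
    return v_min + 0.5 * (v_max - v_min) * (1 + np.cos(np.pi * t / t_max))

def pso_step(pos, vel, pbest, gbest, t, t_max, w=0.7):
    """One PSO velocity/position update with cosine-annealed acceleration coefficients.

    The cognitive coefficient c1 decays while the social coefficient c2 grows,
    shifting the swarm from exploration toward exploitation as iterations
    progress; this mirrors the general idea of adaptive coefficients rather
    than reproducing the SODBAE schedule.
    """
    c1 = cosine_annealed(t, t_max, v_min=0.5, v_max=2.5)   # decays 2.5 -> 0.5
    c2 = 3.0 - c1                                           # grows 0.5 -> 2.5
    r1 = np.random.rand(*pos.shape)
    r2 = np.random.rand(*pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    return pos + vel, vel
```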
“…As a result, many PSO variants have been proposed to tackle the challenges. As an example, Fielding and Zhang [49] proposed a Swarm Optimised DenseBlock Architecture Ensemble (SODBAE) integrated with a PSO variant for image classification. The model was capable of devising CNN architectures with residual connections and dense connectivity to increase network diversity.…”
Section: Variants of Particle Swarm Optimisation
confidence: 99%
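This excerpt notes that SODBAE evolves CNN architectures combining residual connections with dense connectivity. The sketch below shows, in generic PyTorch, how those two connectivity patterns can coexist in a single block: each layer consumes the concatenation of all earlier feature maps, and a 1x1 projection closes a skip connection around the block. It is a hypothetical illustration of the connectivity types mentioned, not the evolved SODBAE architecture itself.

```python
import torch
import torch.nn as nn

class DenseBlockWithResidual(nn.Module):
    """Dense connectivity inside the block plus a residual connection around it."""

    def __init__(self, in_channels, growth_rate=12, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
            ))
            channels += growth_rate
        # 1x1 projection so the block output matches the input for the skip addition
        self.project = nn.Conv2d(channels, in_channels, kernel_size=1)

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # dense connectivity: each layer sees all previous feature maps
            features.append(layer(torch.cat(features, dim=1)))
        # residual connection around the whole block
        return x + self.project(torch.cat(features, dim=1))
```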