2021
DOI: 10.1016/j.asoc.2021.107446

Deep Convolutional Neural Network for musical genre classification via new Self Adaptive Sea Lion Optimization

Cited by 22 publications (5 citation statements).
References 36 publications.
“…The dataset for analysis was downloaded from Reference 43. Accordingly, the performance of the adopted approach was measured against extant models such as ensemble classifiers + WOA [44], ensemble classifiers + SLnO [38], ensemble classifiers + SA‐SLnO [45], and ensemble classifiers + IMFO [34] with regard to certain positive and negative measures. Here, the performance analysis was performed for learning percentages (LP) of 50, 60, 70, and 80.…”
Section: Results (citation type: mentioning)
Confidence: 99%
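The evaluation protocol quoted above (scoring competing models at learning percentages of 50, 60, 70, and 80) can be illustrated with a minimal sketch. The classifier, data, and metrics below are placeholders, not the ensemble + optimizer models compared in the cited works.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))        # placeholder feature matrix
y = rng.integers(0, 2, size=500)      # placeholder binary labels

for lp in (50, 60, 70, 80):           # "learning percentage" = train split size
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=lp / 100, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    # positive measures (e.g., accuracy, sensitivity); negative measures such as
    # false-positive rate would be derived from the confusion matrix analogously
    print(lp, accuracy_score(y_te, pred), recall_score(y_te, pred))
```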
“…BAG demonstrates higher accuracy in classifying music genres compared to the other four classifiers. The paper introduces a technique for categorizing music genres using a Convolutional Neural Network (CNN) and the Self Adaptive Sea Lion Optimization (SA-SLnO) algorithm [20]. This approach involves first extracting the non-negative matrix factorization (NMF) features, pitch features, and STFT features from the signal.…”
Section: Literature Review (citation type: mentioning)
Confidence: 99%
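As a rough illustration of the three feature families named in this excerpt (STFT, pitch, and NMF features), the following librosa sketch extracts one plausible variant of each. The file path, parameter values, and the final pooling into a fixed-length vector are assumptions for illustration, not the paper's exact pipeline.

```python
import numpy as np
import librosa

y, sr = librosa.load("track.wav", sr=22050)      # hypothetical input file

# 1) STFT features: magnitude spectrogram
stft_mag = np.abs(librosa.stft(y, n_fft=2048, hop_length=512))

# 2) Pitch features: frame-wise fundamental frequency via the YIN estimator
f0 = librosa.yin(y, fmin=librosa.note_to_hz("C2"),
                 fmax=librosa.note_to_hz("C7"), sr=sr)

# 3) NMF features: factorize the magnitude spectrogram into
#    spectral components and their activations over time
components, activations = librosa.decompose.decompose(stft_mag, n_components=8)

# One plausible fixed-length feature vector: summary statistics per stream
features = np.concatenate([
    stft_mag.mean(axis=1), stft_mag.std(axis=1),   # spectral shape
    [f0.mean(), f0.std()],                          # pitch statistics
    activations.mean(axis=1),                       # NMF activation profile
])
print(features.shape)
```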
“…However, these approaches only optimize the weights of the neural networks. Kumaraswamy and Poonacha (2021) used the self-adaptive sea lion optimization algorithm to train a neural network that receives a set of features as input. Similarly, Kumaraswamy (2022) used an improved moth flame optimization algorithm for weight learning of the neural network for the MGC task.…”
Section: Related Work (citation type: mentioning)
Confidence: 99%
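A minimal sketch of the general idea described in this excerpt, a population-based metaheuristic searching the weight space of a small network instead of gradient descent, is given below. The update rule is a generic "move toward the best candidate" step; it is not the actual SLnO, SA-SLnO, or moth-flame equations from the cited papers, and the data and network sizes are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature matrix X (n_samples x n_features) and binary labels y
X = rng.normal(size=(200, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

n_hidden, n_classes = 8, 2
sizes = [(16, n_hidden), (n_hidden,), (n_hidden, n_classes), (n_classes,)]
dim = int(sum(np.prod(s) for s in sizes))

def unpack(vec):
    """Slice a flat vector into the weights/biases of a 1-hidden-layer MLP."""
    out, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        out.append(vec[i:i + n].reshape(s))
        i += n
    return out

def fitness(vec):
    """Classification error of the MLP encoded by `vec` (lower is better)."""
    W1, b1, W2, b2 = unpack(vec)
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    return np.mean(logits.argmax(axis=1) != y)

pop = rng.normal(size=(30, dim))            # population of candidate weight vectors
best = min(pop, key=fitness).copy()         # best solution found so far

for t in range(100):
    step = 1.0 - t / 100                    # shrinking step size over iterations
    for i in range(len(pop)):
        r = rng.random(dim)
        # generic encircling-style move toward the current best solution
        pop[i] = best - step * np.abs(r * best - pop[i]) * np.sign(rng.standard_normal(dim))
        if fitness(pop[i]) < fitness(best):
            best = pop[i].copy()

print("final training error:", fitness(best))
```

The appeal of this style of weight learning, as framed in the excerpt, is that it is gradient-free: the fitness function only needs forward passes, so non-differentiable objectives can be used and the population search can escape some local minima that plain gradient descent would settle into.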