2016
DOI: 10.1007/978-3-319-30298-0_18
Neural Networks and PCA for Spectrum Sensing in the Context of Cognitive Radio

Cited by 3 publications (5 citation statements)
References 6 publications
“…Substituting (17) and (18) into Equation (16), the optimal separating hyperplane can be obtained by solving the following dual representation of the optimization problem, subject to ∑ᵢ αᵢyᵢ = 0, αᵢ > 0. By solving this dual Lagrange function (19), α is evaluated. Consequently, ω is evaluated from (17), and b can be easily calculated from (20)…”
Section: Linearly Separable Classification (mentioning, confidence: 99%)
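The dual formulation quoted above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: a hypothetical two-point toy problem solved by projected gradient ascent on the SVM dual, with ω and b then recovered from the optimal α in the same spirit as the quoted steps (17) and (20).

```python
import numpy as np

# Hypothetical toy data (not from the paper): one negative point at x = -1,
# one positive point at x = +1, which are linearly separable.
X = np.array([[-1.0], [1.0]])
y = np.array([-1.0, 1.0])

# Dual objective: max_a  sum(a) - 0.5 * a^T Q a,
# where Q_ij = y_i y_j <x_i, x_j>, subject to sum(a_i y_i) = 0 and a_i >= 0.
Q = (y[:, None] * y[None, :]) * (X @ X.T)

alpha = np.zeros(2)
eta = 0.1
for _ in range(1000):
    grad = 1.0 - Q @ alpha
    # Project the gradient onto the equality-constraint plane sum(a_i y_i) = 0,
    # then clip to enforce a_i >= 0.
    grad -= (grad @ y) / (y @ y) * y
    alpha = np.clip(alpha + eta * grad, 0.0, None)

# Recover the weight from the stationarity condition (analogue of Eq. (17)):
w = (alpha * y) @ X
# Recover the bias from the support vectors: y_i (w . x_i + b) = 1
sv = alpha > 1e-6
b = np.mean(y[sv] - (X[sv] @ w))

print(alpha, w, b)  # alpha ≈ [0.5, 0.5], w ≈ [1.0], b ≈ 0.0
```

For this symmetric toy problem the optimum is α₁ = α₂ = 0.5, giving the separating hyperplane x = 0 with margin 1 on each side.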
“…If the signal energy exceeds the threshold, we declare the primary user (PU) present; otherwise it is absent. In recent years, soft computing techniques such as artificial neural networks (ANN) and support vector machines (SVM) have become extremely successful discriminative approaches to pattern classification [14][15][16][17][18][19][20]. In our context, we propose an implementation of ANN and SVM for the spectrum sensing (SS) operation to detect the PU signal; we focus on the different ANN training algorithms and SVM functions that can be applied to the set of input data patterns…”
Section: Introduction (mentioning, confidence: 99%)
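The energy-detection rule described above (declare the PU present when the received signal energy exceeds a threshold) can be sketched as follows; the sample data, noise variance, and threshold choice are hypothetical illustrations, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000           # number of received samples
noise_var = 1.0    # assumed known noise variance (hypothetical)

def energy_detect(samples, threshold):
    """Declare the PU present (H1) when the test statistic
    T = (1/N) * sum |x[n]|^2 exceeds the threshold; otherwise H0."""
    T = np.mean(np.abs(samples) ** 2)
    return bool(T > threshold)

# Threshold a bit above the noise floor (an illustrative choice,
# not a Neyman-Pearson design).
threshold = noise_var * 1.2

# H0: noise only.  H1: a sinusoidal PU signal buried in noise.
noise_only = rng.normal(0.0, np.sqrt(noise_var), N)
signal = np.sqrt(2.0) * np.sin(2 * np.pi * 0.05 * np.arange(N))
noisy_signal = signal + rng.normal(0.0, np.sqrt(noise_var), N)

print(energy_detect(noise_only, threshold))    # False: PU absent
print(energy_detect(noisy_signal, threshold))  # True:  PU present
```

Under H0 the statistic concentrates near the noise variance (≈ 1.0), while under H1 it concentrates near noise-plus-signal power (≈ 2.0), so the threshold of 1.2 separates the two hypotheses cleanly for this sample size.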
“…Several researchers have discussed different simulation algorithms [24][25][26][27] and have investigated various artificial neural network tools, interpretation, and modeling…”
Section: Literature Survey (mentioning, confidence: 99%)
“…The eigenvalues describe the amount of information captured by the corresponding eigenvectors and rank the components in order of significance. In fact, the eigenvector with the largest eigenvalue is the principal component of the data set [20]. The first principal component u₁ carries the most information, since its corresponding eigenvalue λ₁ is the largest; the second principal component u₂ carries the next-largest amount, since its eigenvalue λ₂ is the second largest, and so on…”
Section: B. Spectrum Reduction Dimension With PCA (mentioning, confidence: 99%)
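The eigenvalue ordering described above can be demonstrated with a small sketch; the synthetic two-dimensional data below is a hypothetical stand-in for the spectral sets:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data with most of its variance along one direction (hypothetical).
n = 500
t = rng.normal(0.0, 3.0, n)                       # high-variance latent factor
X = np.column_stack([t, 0.3 * t + rng.normal(0.0, 0.5, n)])

# PCA via eigendecomposition of the sample covariance matrix.
Xc = X - X.mean(axis=0)
C = (Xc.T @ Xc) / (n - 1)
eigvals, eigvecs = np.linalg.eigh(C)              # eigh returns ascending order

# Sort by decreasing eigenvalue: u1 (largest lambda) is the principal component.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print(eigvals)     # lambda_1 >= lambda_2
print(explained)   # fraction of total variance captured by each component
```

For this data the first component captures well over 90% of the variance, matching the intuition in the quoted statement that the components come out in order of significance.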
“…. 10 is enough for high-quality reconstruction of the original spectral sets [20]. The lost information, defined as the reconstruction error Jₑ, is:…”
Section: B. Spectrum Reduction Dimension With PCA (mentioning, confidence: 99%)
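The reconstruction error Jₑ mentioned above can be illustrated with the standard PCA identity: the mean squared error of reconstructing the data from the top-k components equals the sum of the discarded eigenvalues. The data and the choice k = 3 below are hypothetical, not the paper's setting:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical spectral data: 200 samples of dimension 8, with correlations
# induced by mixing through a random matrix.
n, d, k = 200, 8, 3
X = rng.normal(size=(n, d)) @ rng.normal(size=(d, d))

Xc = X - X.mean(axis=0)
C = (Xc.T @ Xc) / n
eigvals, U = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, U = eigvals[order], U[:, order]

# Project onto the top-k eigenvectors, then map back to the original space.
Uk = U[:, :k]
X_hat = (Xc @ Uk) @ Uk.T

# Reconstruction error J_e: mean squared reconstruction error per sample,
# which equals the sum of the discarded eigenvalues lambda_{k+1} + ... + lambda_d.
J_e = np.mean(np.sum((Xc - X_hat) ** 2, axis=1))
print(J_e, eigvals[k:].sum())  # the two values agree
```

This makes the trade-off concrete: choosing k amounts to choosing how much trailing-eigenvalue "information" one is willing to lose.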