2011
DOI: 10.1109/taes.2011.5751261

Knowledge-Aided Space-Time Adaptive Processing

Cited by 67 publications (44 citation statements)
References 22 publications
“…Here we propose a more straightforward method than that in [21] and [22] to solve (19). Let (20) and note that…”
Section: Robust FMLACC Algorithm (mentioning)
confidence: 99%
“…Two cases are considered, where jamming is … Table 1: Summary of the fast implementation of the selection of ε and the robust FMLACC algorithm. Given K i.i.d. secondary samples z_1, …, z_K and a priori knowledge R_priori, find the optimal ε and the corresponding MLE for (19).…”
Section: Numerical Simulations (mentioning)
confidence: 99%
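The excerpt above describes, at a high level, a procedure that selects a blending weight ε for combining the a priori covariance R_priori with the secondary data before forming an MLE. As a rough, hedged illustration of what such a selection step can look like (not the cited paper's fast FMLACC implementation, whose equation (19) is not reproduced here), the Python sketch below picks ε for the convex combination ε·R_prior + (1 − ε)·S by leave-one-out cross-validated Gaussian likelihood over a small grid; the function name, the grid, and the cross-validation criterion are all assumptions.

```python
import numpy as np

def select_epsilon_loocv(Z, R_prior, eps_grid=None):
    """Choose eps for R(eps) = eps*R_prior + (1-eps)*S by leave-one-out
    cross-validated Gaussian log-likelihood (illustrative assumption,
    not the fast selection rule from the cited work)."""
    K = Z.shape[1]                                # number of secondary samples
    if eps_grid is None:
        eps_grid = np.linspace(0.05, 0.95, 19)    # candidate blending weights

    best_eps, best_score = eps_grid[0], -np.inf
    for eps in eps_grid:
        score = 0.0
        for k in range(K):
            Zk = np.delete(Z, k, axis=1)              # hold out sample z_k
            S_k = (Zk @ Zk.conj().T) / (K - 1)        # SCM without z_k
            R = eps * R_prior + (1.0 - eps) * S_k     # knowledge-aided blend
            _, logdet = np.linalg.slogdet(R)
            z = Z[:, k]
            # complex Gaussian log-likelihood of the held-out sample
            # (additive constants dropped)
            score += -logdet - np.real(z.conj() @ np.linalg.solve(R, z))
        if score > best_score:
            best_eps, best_score = eps, score

    S = (Z @ Z.conj().T) / K
    return best_eps, best_eps * R_prior + (1.0 - best_eps) * S
```

The brute-force grid and leave-one-out loop are used here purely for clarity; the appeal of a fast implementation such as the one summarized in the excerpt is precisely that it avoids this kind of exhaustive evaluation.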
“…Besides, an elevation robust Capon beamforming (ERCB) STAP algorithm is presented in [13], but it is prone to be affected by the diagonal loading (DL) level. Additionally, if prior knowledge of the clutter dependence is available, the STAP performance can be improved using knowledge-aided STAP methods [14][15][16].…”
Section: Introduction (mentioning)
confidence: 99%
“…Recently, knowledge-aided STAP (KA-STAP) has received much attention, as it has been found that STAP detection can be significantly improved by exploiting prior knowledge of the disturbance signal [27]–[32]. A natural way to incorporate prior knowledge in solving the detection problem is a Bayesian approach that models the disturbance covariance matrix as a random matrix with some prior [33]–[38].…”
Section: Introduction (mentioning)
confidence: 99%
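To make the idea in these excerpts concrete, the sketch below shows one common way prior knowledge enters STAP: the prior disturbance covariance is blended with the sample covariance (a colored-loading style estimate) and the result is used in an MVDR-type weight vector w = R̂⁻¹s / (sᴴR̂⁻¹s). The fixed blending weight alpha, the toy steering vector, and all dimensions are illustrative assumptions, not a specific method from the cited works, which typically tune the combination from the data or treat the covariance as random with a prior.

```python
import numpy as np

def ka_stap_weights(Z, R_prior, s, alpha=0.5):
    """Knowledge-aided STAP weight vector (illustrative sketch).

    Z       : (N, K) secondary snapshots used for the sample covariance.
    R_prior : (N, N) a priori disturbance covariance (how it is obtained,
              e.g. from terrain data or earlier dwells, is outside this sketch).
    s       : (N,) space-time steering vector of the target of interest.
    alpha   : fixed blending weight in [0, 1]; an assumption here, whereas
              KA-STAP methods usually select it adaptively.
    """
    K = Z.shape[1]
    S = (Z @ Z.conj().T) / K                      # sample covariance matrix
    R_hat = alpha * R_prior + (1.0 - alpha) * S   # knowledge-aided blend
    Rinv_s = np.linalg.solve(R_hat, s)            # R_hat^{-1} s
    return Rinv_s / (s.conj() @ Rinv_s)           # MVDR-normalized weights

# Toy usage with made-up dimensions and an identity placeholder prior.
rng = np.random.default_rng(0)
N, K = 16, 32
Z = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)
R_prior = np.eye(N, dtype=complex)                # placeholder prior knowledge
s = np.exp(1j * np.pi * np.arange(N) * 0.2) / np.sqrt(N)
w = ka_stap_weights(Z, R_prior, s)
print(abs(w.conj() @ s))                          # unit response at the steering vector
```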