ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8683056
Exploring Complex Time-series Representations for Riemannian Machine Learning of Radar Data

Abstract: Classification of radar observations with machine learning tools is of primary importance for the identification of non-cooperative radar targets such as drones. These observations are made of complex-valued time series which possess a strong underlying structure. These signals can be processed through a time-frequency analysis, through their self-correlation (or covariance) matrices, or directly as the raw signal. All representations are linked but distinct, and it is known that the input representation is critic…
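One of the input representations the abstract names is the covariance (self-correlation) matrix of the complex-valued time series. A minimal sketch of that estimation step, assuming illustrative shapes and variable names (the paper's actual pipeline is not shown here); diagonal loading is a standard regularization to guarantee strict positive-definiteness:

```python
import numpy as np

# Hypothetical sketch: estimating a positive-definite covariance matrix
# from a complex-valued radar time series. Shapes and names are
# illustrative, not taken from the paper.
rng = np.random.default_rng(0)
n_channels, n_samples = 8, 64
X = (rng.standard_normal((n_channels, n_samples))
     + 1j * rng.standard_normal((n_channels, n_samples)))

# Sample covariance: Hermitian positive semi-definite by construction.
R = X @ X.conj().T / n_samples

# Diagonal loading makes it strictly positive-definite.
R += 1e-6 * (np.trace(R).real / n_channels) * np.eye(n_channels)

# R now lies on the manifold of Hermitian positive-definite matrices.
assert np.all(np.linalg.eigvalsh(R) > 0)
```

For complex data the matrix is Hermitian rather than symmetric, but the Riemannian machinery discussed in the citing papers applies to both cases.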

Cited by 10 publications (4 citation statements). References 17 publications.
“…Recent research endeavors have shifted towards the development of foundational components of neural networks within the covariance matrix space. This includes techniques for feature transformation, such as mapping Euclidean features to covariance matrices using geodesic Gaussian kernels [40], non-linear operations applied to the eigenvalues of covariance matrices [41], convolutional operations employing SPD filters [42], and the Fréchet mean [43]. Furthermore, proposals for Riemannian recurrent networks [44] and Riemannian batch normalization [45] have been put forth.…”
Section: Riemannian Manifold Of Symmetric Positive-definite Matricesmentioning
confidence: 99%
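The "non-linear operations applied to the eigenvalues of covariance matrices" mentioned above can be illustrated with a ReEig-style rectification: eigenvalues below a threshold are clipped, which acts like a ReLU on the spectrum while keeping the output SPD. A minimal sketch; the function name `reeig` and the threshold value are our own choices, not from the cited work:

```python
import numpy as np

def reeig(S: np.ndarray, eps: float = 1e-4) -> np.ndarray:
    """Rectify the spectrum of a symmetric matrix (sketch of a ReEig-style
    layer, cf. [41]): eigenvalues below eps are clipped to eps, so the
    result is always symmetric positive-definite."""
    w, U = np.linalg.eigh(S)          # eigendecomposition of symmetric S
    w = np.maximum(w, eps)            # element-wise non-linearity on eigenvalues
    return U @ np.diag(w) @ U.T       # reassemble an SPD matrix

# Near-singular SPD input: its smallest eigenvalue is below the threshold.
A = np.array([[2.0, 1.0],
              [1.0, 0.5001]])
B = reeig(A)
assert np.all(np.linalg.eigvalsh(B) >= 1e-4 - 1e-9)
```

Clipping rather than zeroing the small eigenvalues is what preserves strict positive-definiteness, which downstream Riemannian operations (matrix logarithms, geodesics) require.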
“…Previous work has proposed alternatives to the basic neural building blocks respecting the geometry of the space. For example, transformation layers [29,33,40], alternate convolutional layers based on SPDs [94] and Riemannian means [23], or appended after the convolution [21], recurrent models [24], projections onto Euclidean spaces [50,57] and batch normalization [20]. Our work follows this line, providing explicit formulas for translating Euclidean arithmetic notions into SPDs.…”
Section: Related Workmentioning
confidence: 99%
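The "projections onto Euclidean spaces" this statement refers to are typically realized via the matrix logarithm (the log-Euclidean map): an SPD matrix is sent to a symmetric matrix with unconstrained entries, where ordinary Euclidean layers apply. A minimal sketch under that assumption; the helper name is ours:

```python
import numpy as np

def log_euclidean(S: np.ndarray) -> np.ndarray:
    """Sketch of a log-Euclidean projection layer: matrix logarithm of an
    SPD matrix via its eigendecomposition. The output is symmetric but no
    longer constrained to be positive-definite."""
    w, U = np.linalg.eigh(S)
    return U @ np.diag(np.log(w)) @ U.T

S = np.array([[2.0, 0.5],
              [0.5, 1.0]])
L = log_euclidean(S)

# The map is invertible: the matrix exponential recovers S exactly.
w, U = np.linalg.eigh(L)
S_back = U @ np.diag(np.exp(w)) @ U.T
assert np.allclose(S_back, S)
```

Because the map is a diffeomorphism onto the space of symmetric matrices, no information is lost, which is why it is a popular final layer before a Euclidean classifier.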
“…The main difficulties in training SPDnet lie both in backpropagation through the structured Riemannian functions and in manifold-constrained optimization. For more details, please refer to [12, 21].…”
Section: Spectral-based Spd Matrix For Signal Detection With a Deementioning
confidence: 99%
“…In recent years, the manifold of SPD (symmetric positive definite) matrices has attracted much attention due to its powerful statistical representations. In medical imaging, such manifolds are used in diffusion tensor magnetic resonance imaging [5], and in the computer vision community, they are widely used in face recognition [6, 7], object classification [8], transfer learning [9], action recognition in videos [10, 11], radar signal processing [12, 13, 14] and sonar signal processing [15]. The powerful statistical representations of SPD matrices lie in the fact that they inherently belong to the curved Riemannian manifold.…”
Section: Introductionmentioning
confidence: 99%
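The curved geometry this statement invokes is usually equipped with the affine-invariant Riemannian metric, under which the distance between two SPD matrices A and B is d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F. A minimal sketch of that distance, with our own helper name:

```python
import numpy as np

def spd_distance(A: np.ndarray, B: np.ndarray) -> float:
    """Sketch of the affine-invariant Riemannian distance on the SPD
    manifold: d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F, computed through
    eigendecompositions."""
    w, U = np.linalg.eigh(A)
    A_inv_sqrt = U @ np.diag(w ** -0.5) @ U.T   # A^{-1/2}
    M = A_inv_sqrt @ B @ A_inv_sqrt             # SPD congruence of B by A^{-1/2}
    return float(np.sqrt(np.sum(np.log(np.linalg.eigvalsh(M)) ** 2)))

A = np.eye(2)
B = np.diag([np.e, np.e])
# log(I^{-1/2} (e·I) I^{-1/2}) = I, whose Frobenius norm is sqrt(2).
assert np.isclose(spd_distance(A, B), np.sqrt(2.0))
```

Unlike the Euclidean distance on matrix entries, this metric is invariant under congruence transformations A ↦ GAGᵀ, which is the property that makes it attractive for covariance-based radar and sonar representations.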