2019
DOI: 10.1007/s00521-019-04080-5
Learning vector quantization and relevances in complex coefficient space

Abstract: In this contribution, we consider the classification of time series and similar functional data which can be represented in complex Fourier and wavelet coefficient space. We apply versions of learning vector quantization (LVQ) which are suitable for complex-valued data, based on the so-called Wirtinger calculus. It allows for the formulation of gradient-based update rules in the framework of cost-function-based generalized matrix relevance LVQ (GMLVQ). Alternatively, we consider the concatenation of real and i…
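The Wirtinger-calculus update mentioned in the abstract can be illustrated with a minimal sketch. For a real-valued cost f of a complex prototype w, the steepest-descent step uses the conjugate Wirtinger derivative, w ← w − η · ∂f/∂w̄. The cost, step size, and variable names below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

# Minimal sketch: Wirtinger-style prototype update for complex-valued LVQ.
# For the cost f(w) = ||x - w||^2 with complex x and w, the conjugate
# Wirtinger derivative is df/d(conj(w)) = -(x - w), so the gradient step
# moves the prototype toward the data point.

rng = np.random.default_rng(0)
x = rng.normal(size=4) + 1j * rng.normal(size=4)   # complex-valued data point
w = np.zeros(4, dtype=complex)                      # prototype, initialized at 0
eta = 0.5                                           # step size (illustrative)

for _ in range(50):
    w = w + eta * (x - w)   # step along the negative conjugate derivative

print(np.allclose(w, x))  # prints True: the prototype converges to x
```

The same conjugate-derivative pattern underlies the GMLVQ update rules discussed in the paper; only the cost function changes.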

Cited by 6 publications (9 citation statements). References 16 publications.
“…For the complex-valued embedding (so far) a quite limited number of machine learning algorithms is available, like the complex-valued support vector machine (cSVM) [41,3], the complex-valued generalized learning vector quantization (cGMLVQ) [4,37], or a complex-valued neural network (cNN) [13,38,7]. Further, a nearest neighbor (NN) classifier employing a standard norm operator can be used.…”
Section: Algorithm 1: Approximated Embedding of Symmetric Proximities
Confidence: 99%
“…with Ω a linear projection matrix. This matrix can be learned as outlined in [9]. In our application, this matrix links the multiple perspectives of the time series to each other and permits an interpretable importance weighting of the individual perspectives and of the relevance of individual time series within a perspective 1 .…”
Section: Multi-Perspective Embedding of Non-PSD Proximities
Confidence: 99%
“…Building on our own previous work on complex-valued embeddings in [8] and using ideas from [2], this can be done at low cost with moderate approximations. The recombined complex-valued data can be processed by an effective classifier model for complex-valued data suggested in [9].…”
Section: Introduction
Confidence: 99%
“…Following [14,54], the derivatives according to (5) and (6) are obtained by the Wirtinger calculus (see “Appendix 6”) applying the conjugate derivatives as…”
Section: Complex Variants of GLVQ
Confidence: 99%
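The conjugate Wirtinger derivatives referenced in that statement can be sketched as follows. The operator definitions are standard; the squared-distance cost d(w) is an illustrative example, and the symbols x, w are generic rather than the paper's exact notation:

```latex
% Wirtinger operators for a complex variable z = u + iv:
\frac{\partial}{\partial z}
  = \frac{1}{2}\left(\frac{\partial}{\partial u} - i\,\frac{\partial}{\partial v}\right),
\qquad
\frac{\partial}{\partial \bar z}
  = \frac{1}{2}\left(\frac{\partial}{\partial u} + i\,\frac{\partial}{\partial v}\right).

% For the squared Euclidean cost d(w) = (x - w)^{\mathsf{H}} (x - w),
% the conjugate derivative that drives the gradient step is
\frac{\partial d}{\partial \bar w} = -(x - w).
```

Treating z and z̄ as formally independent variables is what allows gradient-based update rules for real-valued costs of complex parameters, even though such costs are nowhere complex-differentiable in the classical sense.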