2020
DOI: 10.1109/tsp.2019.2955829
Multi-Channel Factor Analysis With Common and Unique Factors

Abstract: This work presents a generalization of classical factor analysis (FA). Each of M channels carries measurements that share factors with all other channels, but also contains factors that are unique to the channel. Furthermore, each channel carries an additive noise whose covariance is diagonal, as is usual in factor analysis, but is otherwise unknown. This leads to a problem of multi-channel factor analysis with a specially structured covariance model consisting of shared low-rank components, unique low-rank co…
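As a rough illustration of the covariance structure described in the abstract — a hypothetical sketch with made-up dimensions, not the paper's actual model or estimator — each channel's covariance combines a shared low-rank term, a channel-unique low-rank term, and a diagonal noise term:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r_c, r_u = 6, 2, 1  # channel dim, common rank, unique rank (illustrative sizes)

A = rng.standard_normal((n, r_c))      # loadings of factors shared across channels
B = rng.standard_normal((n, r_u))      # loadings of this channel's unique factors
D = np.diag(rng.uniform(0.1, 1.0, n))  # diagonal noise covariance, otherwise unknown

# Per-channel covariance: shared low-rank + unique low-rank + diagonal noise
Sigma = A @ A.T + B @ B.T + D

# Sanity checks: the model covariance is symmetric and positive definite
assert np.allclose(Sigma, Sigma.T)
assert np.all(np.linalg.eigvalsh(Sigma) > 0)
```

The inference problem the paper addresses is the reverse direction: recovering the ranks and loadings from observed data under this structure.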


Cited by 10 publications (5 citation statements)
References 43 publications
“…MM algorithms are becoming increasingly popular in signal/image processing [18], [36] and machine learning [27], [34], [38]. MM approaches are fast, stable, require limited manual settings, and are often preferred by practitioners in application domains such as medical imaging [16] and telecommunications [29]. The present work introduces novel theoretical convergence guarantees for MM algorithms when approximate gradient terms are employed, generalizing some recent work [11], [27] to a wider class of functions and algorithms.…”
mentioning
confidence: 99%
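The MM (majorization–minimization) idea referenced in this citation can be sketched on a toy problem — a generic illustration, not the approximate-gradient scheme of the cited work. For f(x) = λ|x| + (x − a)²/2, the standard quadratic majorizer |x| ≤ (x² + y²)/(2|y|), tight at x = y, yields a closed-form surrogate minimizer at each step:

```python
# MM for f(x) = lam*|x| + 0.5*(x - a)**2, using the quadratic majorizer
# |x| <= (x**2 + y**2) / (2*|y|), which touches |x| at x = y.
def mm_minimize(a, lam, x0, iters=100):
    x = x0
    for _ in range(iters):
        # Minimize the surrogate lam*x**2/(2*|x_t|) + 0.5*(x - a)**2:
        # setting its derivative to zero gives a closed-form update.
        x = a / (1.0 + lam / abs(x))
    return x

x_star = mm_minimize(a=2.0, lam=0.5, x0=2.0)
# Converges to the soft-threshold solution sign(a)*max(|a| - lam, 0) = 1.5
```

Each iterate minimizes an upper bound that is tight at the current point, so f decreases monotonically — the stability property the citation highlights. (Note the update assumes the iterate never hits zero, which holds for these values.)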
“…H and A^T are the Toeplitz matrices constructed from h and a, respectively. Extending (equation (14) in [33]), we have,…”
Section: Estimation of Reverse Channel Parameter A
mentioning
confidence: 99%
“…It is here that a technique called "Factor Analysis" (used in machine learning for separating mixtures of distributions [31], [32], and very recently in speech and signal processing [33]) comes to the rescue. The received signal (observation vector) z_l can be viewed as filtering white n_r through the unknown H factor (along with other factors) in the factor analysis model (32).…”
Section: Estimation of Reverse Channel Parameter A
mentioning
confidence: 99%
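The factor-analysis view in this citation — an observation formed by passing white noise through an unknown mixing matrix H — can be checked empirically. This is a generic sketch with made-up dimensions, not the cited paper's channel model: the sample covariance of z = H·n concentrates around H·Hᵀ as the sample count grows.

```python
import numpy as np

rng = np.random.default_rng(1)
p, q, N = 4, 3, 200_000  # observation dim, factor dim, samples (illustrative)

H = rng.standard_normal((p, q))    # unknown mixing matrix ("factor loadings")
n_r = rng.standard_normal((q, N))  # white driving noise: identity covariance
z = H @ n_r                        # observations: each column is one snapshot

# Sample covariance of z approaches H @ H.T for large N
S = (z @ z.T) / N
rel_err = np.linalg.norm(S - H @ H.T) / np.linalg.norm(H @ H.T)
assert rel_err < 0.05
```

Recovering H (up to rotation) from such a covariance is precisely what the factor-analysis machinery invoked by the citing paper provides.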
“…Addressing this limitation, this study investigates the simultaneous correlation between source and load power in a microgrid and weather features, conducting research on the joint ultra-short-term prediction of source and load power in a microgrid. Additionally, commonly used dimensionality reduction algorithms include Principal Component Analysis (PCA) (Wang et al., 2023), Independent Component Analysis (ICA) (Kobayashi and Iwai, 2018), Factor Analysis (FA) (Ramirez et al., 2019; Wu et al., 2024), etc. FA merges numerous features into several representative common factors to extract latent factors among features, accurately capturing the relevant information in the data (Zhou et al., 2020).…”
Section: Introduction
mentioning
confidence: 99%