2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
DOI: 10.1109/mlsp.2014.6958851
Bound constrained weighted NMF for industrial source apportionment

Abstract: In our recent work, we introduced a constrained weighted Non-negative Matrix Factorization (NMF) method using a β-divergence cost function. We assumed that some components of the factorization were known and were used to inform our NMF algorithm. In this paper, we are provided with intervals of possible values for some factorization components. We thus introduce an extended version of our previous work combining an improved divergence expression and some matrix normalizations while using the known / bounded inf…
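The paper's own update rules are not reproduced in this record. Purely as a hedged illustration of the kind of constraints involved, the sketch below uses standard multiplicative updates for a weighted β-divergence cost, re-imposes the known entries of the profile matrix and clips the bounded entries to their intervals after each update. The decomposition X ≈ G F, the function name weighted_beta_nmf_with_bounds and all parameter names are assumptions of this note, not the authors' notation or algorithm.

import numpy as np

def weighted_beta_nmf_with_bounds(X, Sigma, rank, beta=1.0, n_iter=200,
                                  F_known=None, F_min=None, F_max=None,
                                  eps=1e-12, rng=None):
    # Illustrative only: X ~ G @ F with element-wise weights Sigma on the data
    # fit, multiplicative updates for the beta-divergence, known entries of F
    # re-imposed and bounded entries clipped after each update. F_known, F_min
    # and F_max are arrays shaped like F, with NaN where no constraint applies.
    rng = np.random.default_rng() if rng is None else rng
    n, m = X.shape
    G = rng.random((n, rank)) + eps
    F = rng.random((rank, m)) + eps

    def constrain(F):
        if F_known is not None:
            idx = ~np.isnan(F_known)
            F[idx] = F_known[idx]
        if F_min is not None:
            idx = ~np.isnan(F_min)
            F[idx] = np.maximum(F[idx], F_min[idx])
        if F_max is not None:
            idx = ~np.isnan(F_max)
            F[idx] = np.minimum(F[idx], F_max[idx])
        return np.maximum(F, eps)

    for _ in range(n_iter):
        # Update of the profile matrix F for the weighted beta-divergence.
        GF = G @ F + eps
        num = G.T @ (Sigma * X * GF ** (beta - 2))
        den = G.T @ (Sigma * GF ** (beta - 1)) + eps
        F = constrain(F * num / den)
        # Update of the contribution matrix G.
        GF = G @ F + eps
        num = (Sigma * X * GF ** (beta - 2)) @ F.T
        den = (Sigma * GF ** (beta - 1)) @ F.T + eps
        G *= num / den
    return G, F

Re-imposing the constraints after each update is only a heuristic projection; the cited works instead build the constraints into the parameterization of the factors, as the citation statements below indicate.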

Cited by 7 publications (9 citation statements). References 14 publications.

Citation statements:
“…where the superscript k is the current iteration number and θ_j is the j-th element of the free parameter vector θ introduced in Equation (20). Equation (34), together with expressions (36) and (37), yield the following auxiliary function:…”
Section: Weighted αβ-NMF with set constraints
confidence: 99%
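For readers outside the NMF literature, the "auxiliary function" referred to in this snippet is the standard majorization-minimization surrogate: a function G(θ | θ^(k)) is auxiliary for a cost C(θ) whenever

\[
G(\theta \mid \theta^{(k)}) \;\ge\; C(\theta) \quad \text{for all } \theta,
\qquad
G(\theta^{(k)} \mid \theta^{(k)}) \;=\; C(\theta^{(k)}),
\]

so that setting \(\theta^{(k+1)} = \arg\min_{\theta} G(\theta \mid \theta^{(k)})\) guarantees \(C(\theta^{(k+1)}) \le C(\theta^{(k)})\). The particular auxiliary function built from Equations (34), (36) and (37) belongs to the cited paper and is not reproduced here.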
“…In this paper, we thus extend our previous work [32] by (i) investigating and discussing several αβ-divergence expressions, (ii) exploring different data normalization procedures combined with set values (as profiles are chemical species proportions, the rows of F are normalized), and (iii) adding minimum and maximum bounds to some of the unknown values of F. The methods we propose in this paper were partially introduced in [35,36], in the framework of the β-divergence only. We generalize [35,36] here to the αβ-divergence and provide a detailed study of their performance on both realistic simulations and a real data campaign. The remainder of the paper is structured as follows.…”
Section: Introduction
confidence: 96%
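For context only, the αβ-divergence referred to in this snippet is usually written, following Cichocki, Cruces and Amari, as

\[
D_{AB}^{(\alpha,\beta)}(\mathbf{P}\,\|\,\mathbf{Q})
= -\frac{1}{\alpha\beta}\sum_{i,j}\left(
p_{ij}^{\alpha}\,q_{ij}^{\beta}
-\frac{\alpha}{\alpha+\beta}\,p_{ij}^{\alpha+\beta}
-\frac{\beta}{\alpha+\beta}\,q_{ij}^{\alpha+\beta}\right),
\qquad \alpha,\beta,\alpha+\beta\neq 0,
\]

with the excluded cases defined by continuity; the β-divergence used in the earlier work is recovered, up to the usual reparameterization, for α = 1.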
“…It should be noticed that normalizing the rows of F is classical in matrix factorization and is usually performed after each update of F [42] or within the optimization problem [52]. Informed NMF with normalization was also investigated in [53], [54]. However, such a normalization forces the rows of F to be exactly equal to the chosen means, which may lead to calibration errors when the mean parameters are only inexactly known in the considered application.…”
Section: Average-constrained extension of In-Cal
confidence: 99%
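As a small illustration of the row normalization mentioned in this snippet (performed after each update of F), and not of the specific schemes of [42], [52]-[54], one common convention renormalizes the rows of the profile matrix and compensates the scaling in the contribution matrix so that the product is unchanged. The function name and the NumPy setting are assumptions of this note.

import numpy as np

def normalize_profile_rows(G, F, eps=1e-12):
    # Rescale each row of the profile matrix F so that it sums to one, and
    # compensate the scaling in the corresponding column of the contribution
    # matrix G, so that the product G @ F is left unchanged.
    F = np.maximum(F, eps)                    # keep entries strictly positive
    row_sums = F.sum(axis=1, keepdims=True)   # one scale factor per profile
    return G * row_sums.T, F / row_sums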
“…Moreover, blind NMF methods for source apportionment may result in solutions without physical meaning [11], especially when some source profiles are geometrically close to each other. To overcome these drawbacks, we recently proposed some informed NMF approaches [12][13][14][15] which take into account some prior information, e.g., some set values in the profile matrix provided by chemical experts, together with the row sum-to-one property. In [12,13], the constraints were alternatingly satisfied along iterations and only the limit matrix fulfilled both of them. We recently proposed a new parameterization [14,15] which fulfills both constraints at the same time, and we then derived informed NMF methods using the split gradient strategy [16].…”
Section: Introduction
confidence: 99%
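The parameterization of [14,15] itself is not quoted in these snippets. Purely as a hypothetical illustration of how set values and the row sum-to-one property can both hold by construction for any value of the free parameters θ (the vector appearing in the first snippet above), one row of F could be generated as follows; the function name and the NaN-masking convention are assumptions of this note.

import numpy as np

def profile_row_from_theta(known_values, theta, eps=1e-12):
    # Hypothetical illustration, not the parameterization of [14,15]:
    # known_values holds the set values (NaN where the entry is free) and
    # theta holds one nonnegative free parameter per free entry. The returned
    # row keeps the set values exactly and sums to one for any theta >= 0.
    f = np.asarray(known_values, dtype=float).copy()
    free = np.isnan(f)
    budget = 1.0 - np.nansum(f)                       # mass left to free entries
    w = np.maximum(np.asarray(theta, dtype=float), 0.0) + eps
    f[free] = budget * w / w.sum()                    # free part sums to budget
    return f

# Example: two set values (0.2 and 0.1) and two free entries.
# profile_row_from_theta([0.2, np.nan, np.nan, 0.1], [1.0, 3.0])
# -> array([0.2, 0.175, 0.525, 0.1]), which sums to one.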