2009
DOI: 10.7153/oam-03-02

On the rate of convergence of the image space reconstruction algorithm

Abstract. The Image Space Reconstruction Algorithm (ISRA) of Daube-Witherspoon and Muehllehner is a multiplicative algorithm for solving nonnegative least squares problems. Eggermont has proved the global convergence of this algorithm. In this paper, we analyze its rate of convergence. We show that if at the minimum the strict complementarity condition is satisfied and the reduced Hessian matrix is positive definite, then the ISRA algorithm which converges to it does so at a linear rate of convergence. If, how…
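For context on the algorithm the abstract describes: ISRA solves min_{x ≥ 0} ||Ax − b||² with a multiplicative update, x ← x ⊙ (Aᵀb) ⊘ (AᵀAx). A minimal NumPy sketch follows (the function name, iteration count, and eps safeguard are illustrative; A, b, and x0 are assumed entrywise nonnegative, as in the emission-tomography setting the algorithm was designed for):

```python
import numpy as np

def isra(A, b, x0, n_iter=500, eps=1e-12):
    """ISRA for min_{x >= 0} ||Ax - b||^2 with A, b entrywise nonnegative.

    Multiplicative update x <- x * (A^T b) / (A^T A x); nonnegativity of
    the iterates is preserved automatically, so no projection is needed.
    """
    x = np.asarray(x0, dtype=float).copy()
    Atb = A.T @ b                      # numerator is fixed across iterations
    for _ in range(n_iter):
        x *= Atb / np.maximum(A.T @ (A @ x), eps)
    return x
```

Because every factor in the update is nonnegative, the iterates stay nonnegative without a projection step; the linear rate analyzed in the paper concerns this iteration near a minimizer satisfying strict complementarity.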

Cited by 10 publications (6 citation statements)
References 22 publications
“…In particular, the MATLAB Statistics Toolbox implements this method. However, MU have been observed to converge relatively slowly, especially when dealing with dense matrices M; see Han et al. (2009); Gillis & Glineur (2008) and the references therein, and many other algorithms have been subsequently introduced which perform better in most situations. For example, Cichocki et al. (2007) and, independently, several other authors (Ho, 2008; Gillis & Glineur, 2008; Li & Zhang, 2009) proposed a technique called hierarchical alternating least squares (HALS), which successively updates each column of W with an optimal and easy-to-compute closed-form solution.…”
Section: Introduction
confidence: 99%
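To make the HALS update quoted above concrete: with M ≈ WH and W, H ≥ 0, each column of W has a closed-form optimal solution when everything else is held fixed. A hedged sketch of one sweep (the function name and eps safeguard are illustrative, not from the cited papers):

```python
import numpy as np

def hals_sweep_W(M, W, H, eps=1e-12):
    """One HALS sweep over the columns of W for min ||M - W H||_F^2, W >= 0.

    Column k receives the closed-form solution of its one-column
    nonnegative least squares subproblem, the other columns staying fixed.
    """
    HHt = H @ H.T                      # r x r, computed once per sweep
    MHt = M @ H.T                      # m x r, computed once per sweep
    for k in range(W.shape[1]):
        # M H^T e_k minus the contributions of all columns j != k
        num = MHt[:, k] - W @ HHt[:, k] + W[:, k] * HHt[k, k]
        W[:, k] = np.maximum(num / max(HHt[k, k], eps), 0.0)
    return W
```

An analogous sweep over the rows of H completes one HALS iteration.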
“…The MU became extremely popular mainly because (i) they are simple to implement, (ii) they scale well and are applicable to sparse matrices, and (iii) they were proposed in the paper of Lee and Seung [79] which launched the research on NMF. However, the MU converge relatively slowly; see, e.g., [62] for a theoretical analysis, and Section 3.1.6 for some numerical experiments. Note that the original MU only update W once before updating H. They can be significantly accelerated using a more effective alternation strategy [52]: the idea is to update W several times before updating H because the products HHᵀ and XHᵀ do not need to be recomputed.…”
Section: Multiplicative Updates
confidence: 99%
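The acceleration idea attributed to [52] above can be sketched as follows: while H is fixed, the products XHᵀ and HHᵀ that drive the multiplicative update of W are constant, so several W-updates cost little more than one. A rough NumPy sketch, assuming the Frobenius-norm MU of Lee and Seung (names and the inner iteration count are illustrative):

```python
import numpy as np

def accelerated_mu_W(X, W, H, inner_iters=5, eps=1e-12):
    """Several multiplicative updates of W while H stays fixed.

    X H^T and H H^T are computed once and reused across the inner
    iterations, which is what makes the extra W-updates cheap.
    """
    XHt = X @ H.T                      # fixed as long as H is fixed
    HHt = H @ H.T
    for _ in range(inner_iters):
        W *= XHt / np.maximum(W @ HHt, eps)
    return W
```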
“…The popularity of this algorithm came along with the popularity of NMF. Algorithm 2 does not guarantee convergence to a stationary point (although it can be slightly modified in order to get this property [29, 16]), and it has been observed to converge relatively slowly; see [20] and the references therein.…”
Section: Multiplicative Updates (MU)
confidence: 99%
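For reference, one MU step in the Lee-Seung form, with a small positive floor on the entries, which is one simple way to realize the "slight modification" mentioned above (the exact schemes in [29, 16] may differ; delta and the function name are illustrative):

```python
import numpy as np

def mu_step(X, W, H, delta=1e-16):
    """One multiplicative update for min ||X - W H||_F^2 (Lee-Seung form).

    Flooring every entry at delta > 0 keeps the iterates strictly positive,
    a simple device behind modified MU variants with convergence guarantees;
    this may not match the exact modifications of [29, 16].
    """
    W = np.maximum(W * (X @ H.T) / np.maximum(W @ (H @ H.T), delta), delta)
    H = np.maximum(H * (W.T @ X) / np.maximum(W.T @ W @ H, delta), delta)
    return W, H
```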