The orthogonal least squares (OLS) algorithm is widely used in sparse recovery, subset selection, and function approximation. In this paper, we analyze the performance guarantee of OLS. Specifically, we show that if a sampling matrix Φ has unit ℓ2-norm columns and satisfies the restricted isometry property (RIP) of order K + 1 with δK+1 < CK, then OLS exactly recovers any K-sparse vector x from its measurements y = Φx in K iterations. Furthermore, we show that the proposed guarantee is optimal in the sense that OLS may fail to recover x when δK+1 ≥ CK. Additionally, we show that if the columns of a sampling matrix are ℓ2-normalized, then the proposed condition is also an optimal recovery guarantee for the orthogonal matching pursuit (OMP) algorithm. We also establish a recovery guarantee of OLS in the more general case where a sampling matrix might not have unit ℓ2-norm columns. Moreover, we analyze the performance of OLS in the noisy case. Our result demonstrates that, under a suitable constraint on the minimum magnitude of the nonzero elements of the input signal, the proposed RIP condition ensures that OLS identifies the support exactly.
Index Terms - Orthogonal least squares (OLS), orthogonal matching pursuit (OMP), restricted isometry property (RIP), sparse recovery

On the other hand, it has been reported in [6] that when K = 2, there exists a counterexample for which OLS fails
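For concreteness, the OLS recovery procedure analyzed in the abstract can be sketched as follows. This is a minimal NumPy sketch of the textbook OLS algorithm (not the paper's analysis): at each iteration, the column whose inclusion minimizes the least-squares residual is added to the support, and after K iterations the signal is estimated by least squares on the selected columns. All function and variable names here are illustrative.

```python
import numpy as np

def ols(Phi, y, K):
    """Greedy OLS sketch: at each step, pick the column that minimizes
    the residual of the least-squares fit over the augmented support."""
    m, n = Phi.shape
    S = []  # estimated support set
    for _ in range(K):
        best_i, best_err = None, np.inf
        for i in range(n):
            if i in S:
                continue
            A = Phi[:, S + [i]]
            # least-squares residual with candidate column i included
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            err = np.linalg.norm(y - A @ coef)
            if err < best_err:
                best_i, best_err = i, err
        S.append(best_i)
    # final least-squares estimate restricted to the selected support
    coef, *_ = np.linalg.lstsq(Phi[:, S], y, rcond=None)
    x_hat = np.zeros(n)
    x_hat[S] = coef
    return x_hat, sorted(S)
```

Note that this exhaustive selection step is O(n) least-squares solves per iteration; practical implementations update the orthogonal projection incrementally, but the recovered support is the same.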