Low rank approximation of a matrix (hereafter LRA) is a highly important area of Numerical Linear and Multilinear Algebra and of Data Mining and Analysis, with numerous applications to modern computations. One can operate with an LRA of a matrix at sub-linear cost, that is, by using far fewer memory cells and flops than the matrix has entries, but no sub-linear cost algorithm can compute an accurate LRA of worst-case input matrices, or even of the matrices in the small families of low rank matrices of our Appendix B. Nevertheless, we prove that some old and new sub-linear cost algorithms solve the dual LRA problem, that is, with a high probability (hereafter whp) they compute a close LRA of a random matrix admitting LRA. Our tests are in good accordance with our formal study, and we have extended our progress in various directions, in particular to dual Linear Least Squares Regression at sub-linear cost.
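As an illustration of the dual phenomenon described above, the following sketch (an assumption for illustration, not the paper's specific algorithm) builds a CUR-type LRA of a random rank-r matrix from a small random subset of its rows and columns. It reads only O((m+n)k) of the mn entries, a sub-linear cost, yet recovers the matrix accurately whp because the sampled cross-submatrix of a random low rank matrix is nonsingular on its rank support whp:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, r = 1000, 1000, 5
# Random rank-r input: product of two Gaussian factors (the "random
# matrix admitting LRA" setting of the dual LRA problem).
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Sub-linear cost sampling: read only k random columns and k random rows,
# i.e. (m + n) * k entries instead of all m * n entries.
k = 2 * r  # slight oversampling relative to the target rank
I = rng.choice(m, size=k, replace=False)
J = rng.choice(n, size=k, replace=False)

C = A[:, J]          # m x k sampled columns
R = A[I, :]          # k x n sampled rows
W = A[np.ix_(I, J)]  # k x k intersection ("generator") submatrix

# CUR-type approximation: A ~ C W^+ R, with W^+ the Moore-Penrose inverse.
A_hat = C @ np.linalg.pinv(W) @ R

# Verification reads all of A, but only to measure the error of the
# approximation that was computed at sub-linear cost.
rel_err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
```

For a Gaussian rank-r input the sampled cross-submatrix W has rank r whp, so the relative error is near machine precision; for worst-case inputs no such guarantee is possible, which is the dichotomy the abstract states.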