We extend the geometrical inverse approximation approach to the solution of linear least-squares problems. For that purpose we focus on the minimization of $1-\cos\!\left(X(A^{T}A),\,I\right)$, where $A$ is a given rectangular coefficient matrix and $X$ is the approximate inverse. In particular, we adapt the recently published simplified gradient-type iterative scheme MinCos to the least-squares scenario. In addition, we combine the generated convergent sequence of matrices with well-known acceleration strategies based on recently developed matrix extrapolation methods, as well as with some deterministic and heuristic acceleration schemes that modify, in a convenient way, the steplength at each iteration. A set of numerical experiments, including large-scale problems, is presented to illustrate the performance of the different acceleration strategies.
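For reference, a minimal sketch of the objective, assuming the cosine between two matrices is measured with the standard Frobenius (trace) inner product, as is customary in this gradient-type setting:
\[
\cos(B,C) \;=\; \frac{\langle B, C\rangle_F}{\|B\|_F\,\|C\|_F}
\;=\; \frac{\operatorname{trace}(C^{T}B)}{\|B\|_F\,\|C\|_F},
\qquad
F(X) \;=\; 1-\cos\!\left(X(A^{T}A),\,I\right).
\]
Under this assumption, and provided $A$ has full column rank, $F(X)=0$ exactly when $X(A^{T}A)$ is a positive multiple of the identity, i.e., when $X$ is proportional to $(A^{T}A)^{-1}$.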