Principal component analysis (PCA) is a powerful tool for dimensionality reduction. Unfortunately, it is sensitive to outliers, which is why various robust PCA variants have been proposed in the literature. One of the most frequently applied methods for high-dimensional data reduction is the rotational invariant $L_1$-norm PCA of Ding and coworkers. So far, no convergence proof for this algorithm has been available. The main aim of this paper is to fill this gap. We reinterpret this robust approach as a conditional gradient algorithm and moreover show that it coincides with a gradient descent algorithm on Grassmannian manifolds. Based on the latter point of view, we prove global convergence of the whole sequence of iterates to a critical point using the Kurdyka-Łojasiewicz property of the objective function, paying special attention to so-called anchor points at which the function is not differentiable.
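For orientation, the following display sketches the functional usually associated with the rotational invariant $L_1$-norm PCA of Ding and coworkers; the notation (data points $x_1,\dots,x_n \in \mathbb{R}^d$, target dimension $k$) is ours and serves purely as an illustration of the setting:
% sketch only; notation assumed, not taken verbatim from the paper body
\begin{equation*}
	\min_{U \in \mathbb{R}^{d \times k},\; U^{\mathrm{T}} U = I_k} \; \sum_{i=1}^{n} \bigl\| x_i - U U^{\mathrm{T}} x_i \bigr\|_2 .
\end{equation*}
In contrast to classical PCA, the Euclidean norms of the residuals are not squared, which reduces the influence of outliers, and the functional depends on $U$ only through its column space, hence the rotational invariance.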