Massive datasets pose a serious challenge to traditional statistical methods. Modal regression offers greater robustness and higher inference efficiency than mean regression and likelihood-based methods. In this paper, we present a robust distributed least squares approximation (RDLSA) method for heterogeneously distributed massive datasets under the modal regression framework. Specifically, we first approximate the local objective function on each worker/server/machine by a Taylor expansion, retaining the leading quadratic term. Each local machine then computes its local estimate and uploads it to a master machine, which forms the approximate aggregated estimator. This process yields a one-step global estimator that requires only one round of communication. The consistency and asymptotic normality of the estimator are rigorously established under mild conditions. To further reduce the estimation bias, we perform one Newton-Raphson iteration to obtain a two-step global aggregated estimator. In addition, we develop a variable selection procedure for distributed modal regression within the robust least squares approximation framework. Finally, simulation experiments and a real data application demonstrate the superiority of the proposed method.
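
As an illustrative sketch only (the symbols $K$, $n_k$, $\widehat{\beta}_k$, and $\widehat{\Sigma}_k$ are not defined in this abstract and are introduced here purely for exposition), a generic one-shot least squares aggregation under a quadratic Taylor approximation of the $k$-th local objective combines the local estimates as
\[
\tilde{\beta}_{\mathrm{os}} \;=\; \Big(\sum_{k=1}^{K}\alpha_k \widehat{\Sigma}_k\Big)^{-1}\sum_{k=1}^{K}\alpha_k \widehat{\Sigma}_k\,\widehat{\beta}_k,
\qquad \alpha_k = n_k/N,
\]
where $\widehat{\beta}_k$ is the estimate computed on the $k$-th machine, $\widehat{\Sigma}_k$ is the curvature (quadratic-term) matrix of its approximated objective, and $N=\sum_{k=1}^{K} n_k$. Under this sketch, a two-step estimator could then be obtained by applying one Newton-Raphson update to $\tilde{\beta}_{\mathrm{os}}$ on the aggregated objective; the exact form used in the paper may differ.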