In this work, we propose methods for speeding up distributed linear regression while ensuring security. We leverage randomized sketching techniques and improve straggler resilience in asynchronous systems. Specifically, we apply a random orthonormal matrix and then subsample blocks, to simultaneously secure the information and reduce the dimension of the regression problem. In our setup, the transformation corresponds to an encoded encryption in an approximate gradient coding scheme, and the subsampling corresponds to the responses of the non-straggling workers in a centralized coded computing network. This results in a distributed iterative sketching approach for an ℓ2-subspace embedding, i.e., a new sketch is used at each iteration. We also focus on the special case of the Subsampled Randomized Hadamard Transform, which we generalize to block sampling, and discuss how it can be modified in order to secure the data.
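To illustrate the idea, the following is a minimal Python sketch of iterative block sketching for least squares: a normalized Hadamard transform with random signs plays the role of the random orthonormal transformation, uniform block sampling stands in for the responses of the non-straggling workers, and a fresh sketch is drawn at every gradient step. All function names (fwht, block_srht, iterative_sketch_lsq), the block sizes, and the step-size choice are illustrative assumptions, not the paper's exact scheme.

import numpy as np

def fwht(a):
    # Fast Walsh-Hadamard transform along axis 0; length must be a power of 2.
    a = a.copy()
    n = a.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[i:i + h].copy()
            y = a[i + h:i + 2 * h].copy()
            a[i:i + h] = x + y
            a[i + h:i + 2 * h] = x - y
        h *= 2
    return a / np.sqrt(n)  # normalized so the transform is orthonormal

def block_srht(A, b, num_blocks, q, rng):
    # Block-sampled randomized Hadamard sketch of (A, b):
    # apply H*D (random +-1 signs), then keep q of num_blocks row blocks.
    n = A.shape[0]
    signs = rng.choice([-1.0, 1.0], size=n)
    GA = fwht(signs[:, None] * A)          # H D A
    Gb = fwht(signs * b)                   # H D b
    tau = n // num_blocks                  # block size (assumes num_blocks | n)
    blocks = rng.choice(num_blocks, size=q, replace=False)  # "non-stragglers"
    idx = np.concatenate([np.arange(k * tau, (k + 1) * tau) for k in blocks])
    scale = np.sqrt(n / (q * tau))         # rescale so the sketch is unbiased
    return scale * GA[idx], scale * Gb[idx]

def iterative_sketch_lsq(A, b, num_blocks=32, q=24, iters=50, seed=0):
    # Gradient descent where each iteration uses a new block-SRHT sketch.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative step size (assumption)
    for _ in range(iters):
        SA, Sb = block_srht(A, b, num_blocks, q, rng)
        g = SA.T @ (SA @ x - Sb)            # approximate gradient from the sketch
        x -= step * g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((1024, 20))
    b = A @ rng.standard_normal(20) + 0.01 * rng.standard_normal(1024)
    x_hat = iterative_sketch_lsq(A, b)
    x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
    print("distance to exact least-squares solution:", np.linalg.norm(x_hat - x_ls))

In this toy version the Hadamard transform is applied centrally for simplicity; in the coded-computing setting described above, each worker would instead hold one encoded block of (H D A, H D b) and return its partial gradient, with the subsampling arising from whichever workers respond first.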