Minimax optimal designs are well known to be difficult to construct. Moreover, since the true error variance is never known in practice, it is important to allow for small deviations from an assumed variance function and to construct optimal designs that are robust to such deviations. We investigate a class of minimax optimal regression designs for models with heteroscedastic errors that are robust against possible misspecification of the error variance. This class includes the commonly used A-, c-, and I-optimality criteria. Several theoretical results are obtained, including a necessary condition for optimality and a reflection symmetry property of these minimax optimal designs. In this article we focus mainly on linear models and assume that an approximate error variance function is available, but we also briefly discuss how the methodology extends to nonlinear models. We then propose an effective algorithm for solving the challenging nonconvex optimization problems that arise in finding minimax designs on discrete design spaces. Examples are given to illustrate the minimax optimal designs and their properties.
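To make the minimax formulation concrete, the following is a minimal, illustrative sketch, not the algorithm proposed in the article, of how a worst-case A-optimality criterion could be evaluated and minimized over design weights on a discrete design space. The straight-line model, the finite neighborhood of candidate error variance functions, and the multi-start Nelder-Mead search are all assumptions made for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

# Discrete design space and regressors for a simple straight-line model f(x) = (1, x)^T
# (illustrative setup; not the article's specific example).
xs = np.linspace(-1.0, 1.0, 11)                # candidate design points
F = np.column_stack([np.ones_like(xs), xs])    # rows are the regression vectors f(x_i)^T

# A small finite "neighborhood" of plausible error variance functions sigma^2(x);
# lambda(x) = 1 / sigma^2(x) plays the role of the efficiency function.
variance_funcs = [
    lambda x: 1.0 + 0.0 * x,             # nominal (approximate) variance
    lambda x: 1.0 + 0.5 * x**2,          # mild quadratic departure
    lambda x: np.exp(0.5 * np.abs(x)),   # exponential-type departure
]

def a_loss(w, var):
    """A-optimality loss trace(M^{-1}) for design weights w and variance function var."""
    lam = 1.0 / var(xs)
    M = (F * (w * lam)[:, None]).T @ F   # M(w) = sum_i w_i * lambda(x_i) * f(x_i) f(x_i)^T
    return np.trace(np.linalg.pinv(M))

def minimax_loss(z):
    """Worst-case A-loss over the variance neighborhood; z gives weights via a softmax map."""
    w = np.exp(z - z.max())
    w /= w.sum()
    return max(a_loss(w, v) for v in variance_funcs)

# Crude multi-start local search over the (nonconvex, nonsmooth) minimax objective.
rng = np.random.default_rng(0)
best = None
for _ in range(10):
    res = minimize(minimax_loss, rng.normal(size=len(xs)), method="Nelder-Mead",
                   options={"maxiter": 5000, "fatol": 1e-8})
    if best is None or res.fun < best.fun:
        best = res

w_opt = np.exp(best.x - best.x.max())
w_opt /= w_opt.sum()
print("worst-case A-loss:", best.fun)
print("support points with weight > 0.01:",
      [(round(x, 2), round(w, 3)) for x, w in zip(xs, w_opt) if w > 0.01])
```

In this sketch the inner maximization is a finite maximum over a handful of variance functions and the outer minimization is handled by a generic derivative-free solver; the article's own algorithm for the discrete design space problem is not reproduced here.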