We investigate properties and numerical algorithms for A- and D-optimal regression designs based on the second-order least squares estimator (SLSE). Several results are derived, including a characterization of the A-optimality criterion. The optimal design problems under the SLSE can be formulated as semidefinite programming or convex optimization problems, and we show that the resulting algorithms can be faster than more conventional multiplicative algorithms, especially for nonlinear models. Our results also indicate that optimal designs based on the SLSE are more efficient than those based on the ordinary least squares estimator when the error distribution is highly skewed.
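For context, the "conventional multiplicative algorithm" referenced above can be sketched in a few lines. The following is an illustrative baseline only, not the paper's method: it computes a D-optimal design under ordinary least squares (not the SLSE) via Titterington-style multiplicative weight updates, and the quadratic model, grid, and iteration count are assumptions chosen for the example.

```python
import numpy as np

def d_optimal_multiplicative(X, n_iter=2000):
    """Multiplicative algorithm for a D-optimal design under OLS.

    X : (n, p) matrix whose rows are regression vectors f(x_i) on a
        candidate grid.  Returns design weights w on the grid points.
    """
    n, p = X.shape
    w = np.full(n, 1.0 / n)               # start from the uniform design
    for _ in range(n_iter):
        M = X.T @ (w[:, None] * X)        # information matrix M(w)
        Minv = np.linalg.inv(M)
        d = np.einsum('ij,jk,ik->i', X, Minv, X)  # d_i = f_i^T M^{-1} f_i
        w *= d / p                        # multiplicative update; sum(w) stays 1
    return w

# Illustrative example: quadratic regression f(x) = (1, x, x^2) on [-1, 1].
grid = np.linspace(-1, 1, 21)
X = np.column_stack([np.ones_like(grid), grid, grid**2])
w = d_optimal_multiplicative(X)
# The D-optimal design for this model is known to put weight 1/3
# on each of -1, 0, and 1; the iteration concentrates w accordingly.
```

Each update multiplies a weight by its normalized variance function `d_i / p`, so mass flows toward high-variance points and the weights remain a probability vector; convergence is monotone but can be slow, which is the behavior the paper's convex-programming formulations improve upon.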