In robust regression we often have to decide how many observations are unusual and should be removed from the sample in order to obtain a better fit for the remaining data. Generally, we follow the basic principle of least trimmed squares (LTS): fit the majority of the data and identify as outliers those points that cause the greatest damage to the robust fit. However, in the LTS regression method the choice of the default high breakdown-point value seriously affects the efficiency of the estimator. In the proposed approach we introduce a penalty cost for discarding an outlier; consequently, the best fit for the majority of the data is obtained by discarding only catastrophic observations. This penalty cost is based on robust design weights and on the high breakdown-point residual scale taken from the LTS estimator. The robust estimate is obtained by solving a convex quadratic mixed-integer programming problem whose objective function minimizes the sum of the squared residuals and the penalties for discarded observations. The proposed mathematical programming formulation is suitable for small-sample data. Moreover, we conduct a simulation study comparing our approach with other robust estimators in terms of their efficiency and robustness.
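As a minimal sketch of one plausible form of such a formulation (the binary indicators $z_i$, shift variables $s_i$, big-$M$ constant $M$, and tuning constant $k$ below are illustrative notation, not taken from the abstract), the estimate could be written as

\[
\min_{\beta,\,e,\,s,\,z}\ \sum_{i=1}^{n} e_i^{2} \;+\; \sum_{i=1}^{n} p_i z_i
\quad\text{subject to}\quad
y_i = x_i^{\top}\beta + e_i + s_i,\qquad
\lvert s_i\rvert \le M z_i,\qquad
z_i \in \{0,1\},
\]

where setting $z_i = 1$ lets $s_i$ absorb the residual of observation $i$, removing it from the squared-error term at the penalty cost $p_i$; under the stated assumptions, $p_i$ could be built from the robust design weight $w_i$ and the LTS residual scale $\hat{\sigma}$, for example $p_i = k\, w_i \hat{\sigma}^{2}$. The objective is quadratic and convex in the continuous variables, so the problem is a convex quadratic mixed-integer program of the kind described above.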