Many real-world problems have been solved with support vector regression, especially ν-support vector regression (ν-SVR), but its hyperparameters usually need to be tuned, and ν-SVR cannot perform feature selection on its own. Nature-inspired algorithms have been used both for feature selection and as hyperparameter-estimation procedures. In this paper, the opposition-based learning Harris hawks optimization algorithm (HHOA-OBL) is proposed to optimize the hyperparameters of ν-SVR while simultaneously performing embedded feature selection. Experimental results on four datasets show that HHOA-OBL outperforms the standard Harris hawks optimization algorithm, grid search, and cross-validation in terms of prediction accuracy, number of selected features, and running time. Moreover, the results confirm that HHOA-OBL improves prediction performance and computational time compared with other nature-inspired algorithms, demonstrating its ability to search for good hyperparameter values and to select the most informative features for prediction tasks. These experiments and comparisons support the suitability of the proposed approach for prediction in other real applications.
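To make the setup concrete, the sketch below shows the general idea of jointly tuning ν-SVR hyperparameters (C, ν, γ) and a binary feature mask. It is not the authors' HHOA-OBL implementation: a simple random search with an opposition-based candidate step stands in for the Harris hawks update rules, scikit-learn's NuSVR is assumed as the regressor, and the dataset, search bounds, and evaluation budget are all illustrative.

```python
# Minimal sketch (not the paper's implementation): joint tuning of
# nu-SVR hyperparameters and an embedded binary feature mask, with a
# toy opposition-based random search standing in for HHOA-OBL.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=200, n_features=10, noise=0.1,
                       random_state=0)

# Search-space bounds for (C, nu, gamma); values are illustrative.
low = np.array([1e-2, 0.05, 1e-4])
high = np.array([1e2, 0.95, 1e0])

def fitness(params, mask):
    """Mean negative CV MSE of nu-SVR on the selected feature subset."""
    if not mask.any():
        return -np.inf  # reject empty feature subsets
    C, nu, gamma = params
    model = NuSVR(C=C, nu=nu, gamma=gamma)
    scores = cross_val_score(model, X[:, mask], y, cv=3,
                             scoring="neg_mean_squared_error")
    return scores.mean()

best_fit, best = -np.inf, None
for _ in range(30):  # small budget for illustration
    params = rng.uniform(low, high)
    mask = rng.random(X.shape[1]) < 0.5
    # Opposition-based learning: also evaluate the "opposite" candidate,
    # mirrored within the bounds, with the feature mask inverted.
    opposite = low + high - params
    for p, m in [(params, mask), (opposite, ~mask)]:
        f = fitness(p, m)
        if f > best_fit:
            best_fit, best = f, (p, m)

p, m = best
print(f"best CV score: {best_fit:.4f}, C={p[0]:.3g}, nu={p[1]:.3g}, "
      f"gamma={p[2]:.3g}, features kept: {int(m.sum())}")
```

Evaluating each candidate together with its bounds-mirrored opposite is the core of opposition-based learning: it widens search-space coverage per iteration at the cost of one extra fitness evaluation, which is consistent with the speed and accuracy gains the abstract reports over the standard optimizer.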