Surrogate-based optimization is widely used to deal with long-running, black-box, simulation-based objective functions. Indeed, a surrogate model such as Kriging or an Artificial Neural Network reduces the number of calls to the CPU time-intensive simulator. Bayesian optimization exploits the ability of surrogates to provide information that effectively guides the optimization process. In this paper, the reference Efficient Global Optimization (EGO) framework is challenged by a Bayesian Neural Network-assisted Genetic Algorithm, referred to as BNN-GA. The Bayesian Neural Network (BNN) surrogate is chosen for its ability to provide an uncertainty measure on its predictions, which enables the computation of the Expected Improvement of a candidate solution and thereby improves the exploration of the objective space. BNNs are also more reliable than Kriging models on high-dimensional problems and faster to set up thanks to their incremental training. In addition, we propose a batch-based parallelization of BNN-GA, which is compared to a parallel version of EGO called q-EGO. Parallel computing is an essential complement to surrogate models for coping with the computational burden of simulation-based optimization. The two parallel approaches are experimentally compared on several benchmark functions and on two real-world problems within the scope of Tuberculosis Transmission Control (TBTC). The study presented in this paper shows that parallel batched BNN-GA is a viable alternative to q-EGO approaches, being better suited to high-dimensional problems, larger databases, and moderate search budgets, and benefiting more from parallelization. Moreover, significantly improved solutions are obtained for the two TBTC problems tackled.