Summary
This article presents a novel application of an acquisition function‐based Bayesian optimization (BO) method to optimally tune the hyperparameters of a soft‐margin support vector machine (SVM) for structural damage detection. The performance of SVM classifiers is known to depend heavily on the selection of hyperparameters, and the optimal hyperparameters are those that yield the minimum cross‐validation error. However, no explicit functional relationship exists between the hyperparameters and the cross‐validation error that could serve as an objective function. The BO approach uses Bayes' rule together with Gaussian process regression to estimate the objective function, and an acquisition function to efficiently search the hyperparameter space. The performance of this methodology has been validated by training an optimally tuned SVM to carry out damage detection on an experimental benchmark structure using four different datasets. The number of function evaluations needed to reach the optimal hyperparameters was below 25 for each dataset, which is roughly the number of function evaluations (i.e., the swarm size) required in a single iteration of particle swarm optimization (PSO). A comparison of BO with PSO in terms of the number of function evaluations demonstrated the efficiency of BO for the benchmark problem. This result highlights the practical value of BO in speeding up hyperparameter selection. Finally, some interesting observations regarding the obtained values of the box constraint, the support‐vector locations, and the relationship between the convergence curve of BO and the learning curves of the SVM are discussed.
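
For illustration only, the following is a minimal sketch of the kind of BO loop described above: a Gaussian process surrogate is fitted to the observed cross‐validation errors of a soft‐margin SVM, and an acquisition function selects the next hyperparameter setting to evaluate. The specific choices here (an RBF kernel with box constraint C and kernel scale gamma searched on a log scale, a Matérn surrogate kernel, the expected‐improvement acquisition, and a synthetic dataset standing in for the benchmark structure data) are assumptions for the sketch, not the exact configuration used in the study.

```python
import numpy as np
from scipy.stats import norm
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for the benchmark damage-detection features/labels.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Search space: log10 of the box constraint C and the RBF kernel scale gamma.
bounds = np.array([[-2.0, 3.0],   # log10(C)
                   [-4.0, 1.0]])  # log10(gamma)

def cv_error(theta):
    """Objective: 5-fold cross-validation error of the soft-margin SVM."""
    C, gamma = 10.0 ** theta
    svm = SVC(C=C, gamma=gamma, kernel="rbf")
    return 1.0 - cross_val_score(svm, X, y, cv=5).mean()

def expected_improvement(candidates, gp, best_f, xi=0.01):
    """Expected-improvement acquisition (minimization form)."""
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = best_f - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
# Initial design: a few random points in the search space.
thetas = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))
errors = np.array([cv_error(t) for t in thetas])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):  # roughly 25 objective evaluations in total
    gp.fit(thetas, errors)
    # Maximize the acquisition over a random candidate set.
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 2))
    nxt = cand[np.argmax(expected_improvement(cand, gp, errors.min()))]
    thetas = np.vstack([thetas, nxt])
    errors = np.append(errors, cv_error(nxt))

best = thetas[np.argmin(errors)]
print(f"best C = {10**best[0]:.3g}, gamma = {10**best[1]:.3g}, "
      f"CV error = {errors.min():.3f}")
```

In this form, each BO iteration costs a single cross‐validation run of the SVM, which is why the total evaluation budget can stay in the low tens, in contrast to PSO, where every iteration requires one evaluation per particle in the swarm.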