We propose a novel hybrid algorithm, "Brent-STEP", for univariate global function minimization. It is based on the global line search method STEP and accelerated by Brent's method, a local optimizer that combines quadratic interpolation and golden section steps. We analyze the performance of the hybrid algorithm on various one-dimensional functions and experimentally demonstrate a significant improvement over its constituent algorithms in most cases. We then generalize the algorithm to multivariate functions, adopting a recently proposed scheme [8] that interleaves evaluations across dimensions to achieve smoother and more efficient convergence. We experimentally demonstrate the highly competitive performance of the proposed multivariate algorithm on the separable functions of the BBOB benchmark. The combination of good performance and smooth convergence on separable functions makes the algorithm an interesting candidate for inclusion in algorithmic portfolios or hybrid algorithms that aim to perform well on a wide range of problems.
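To make the construction concrete, the following Python sketch illustrates the general idea only, under simplifying assumptions that differ from the actual Brent-STEP algorithm: the STEP interval-difficulty criterion is replaced by a crude interval score, Brent's method is invoked once at the end via SciPy's bounded minimizer rather than interleaved step by step with the global search, and the multivariate generalization sweeps coordinates in a round robin instead of interleaving single evaluations across dimensions as in the cited scheme [8]. All function names and parameters here are illustrative, not taken from the paper.

```python
# Illustrative sketch only; a heavily simplified stand-in for the Brent-STEP idea.
import numpy as np
from scipy.optimize import minimize_scalar


def simplified_1d_global_min(f, lo, hi, budget=40):
    """Coarse global line search: repeatedly split the most "promising" interval,
    then polish the best point with a Brent-style bounded minimizer from SciPy."""
    xs, ys = [lo, hi], [f(lo), f(hi)]
    for _ in range(budget):
        order = np.argsort(xs)
        xs = [xs[i] for i in order]
        ys = [ys[i] for i in order]
        best = min(ys)
        # Crude interval score (stand-in for the STEP difficulty criterion):
        # prefer wide intervals whose endpoints lie close to the incumbent best.
        scores = [(xs[i + 1] - xs[i]) / (min(ys[i], ys[i + 1]) - best + 1e-3)
                  for i in range(len(xs) - 1)]
        i = int(np.argmax(scores))
        xm = 0.5 * (xs[i] + xs[i + 1])
        xs.append(xm)
        ys.append(f(xm))
    # Re-sort and refine locally around the best sampled point.
    order = np.argsort(xs)
    xs = [xs[i] for i in order]
    ys = [ys[i] for i in order]
    i = int(np.argmin(ys))
    a, b = xs[max(i - 1, 0)], xs[min(i + 1, len(xs) - 1)]
    res = minimize_scalar(f, bounds=(min(a, b), max(a, b)), method="bounded")
    return res.x, res.fun


def round_robin_coordinate_min(f, lower, upper, sweeps=3):
    """Multivariate sketch: cycle over coordinates, running the 1-D search along
    each axis while the others stay fixed (the paper instead interleaves single
    evaluations across dimensions)."""
    x = 0.5 * (lower + upper)
    for _ in range(sweeps):
        for d in range(len(x)):
            def line(t, d=d):
                xt = x.copy()
                xt[d] = t
                return f(xt)
            x[d], _ = simplified_1d_global_min(line, lower[d], upper[d])
    return x, f(x)


if __name__ == "__main__":
    sphere = lambda x: float(np.sum((x - 1.3) ** 2))  # separable test function
    x, fx = round_robin_coordinate_min(sphere, np.full(3, -5.0), np.full(3, 5.0))
    print(x, fx)
```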