Inferring the properties of black holes and neutron stars is a key science goal of gravitational-wave (GW) astronomy. To extract as much information as possible from GW observations, we must develop methods to reduce the cost of Bayesian inference. In this paper, we use artificial neural networks (ANNs) and the parallelization power of graphics processing units (GPUs) to improve the surrogate modeling method, which can produce accelerated versions of existing waveform models. As a first application of our method, the artificial neural network surrogate model (ANN-Sur), we build a time-domain surrogate model of the spin-aligned binary black hole (BBH) waveform model SEOBNRv4. We achieve median mismatches of approximately 2 × 10⁻⁵ and mismatches no worse than approximately 2 × 10⁻³. For a typical BBH waveform generated from a starting frequency of 12 Hz with a total mass of 60 M⊙, the original SEOBNRv4 model takes 1794 ms. Existing custom-made code optimizations (SEOBNRv4opt) reduce this to 83.7 ms, and the interpolation-based, frequency-domain surrogate SEOBNRv4ROM can generate this waveform in 3.5 ms. Our ANN-Sur model takes 1.2 ms when run on a CPU and just 0.5 ms when run on a GPU. ANN-Sur can also generate large batches of waveforms simultaneously. We find that batches of up to 10³ waveforms can be evaluated on a GPU in just 1.57 ms, corresponding to a time per waveform of 0.0016 ms. This method is a promising way to utilize the parallelization power of GPUs to drastically increase the computational efficiency of Bayesian parameter estimation.
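To make the batched-evaluation idea concrete, the sketch below shows, in schematic form, how a small neural-network surrogate could be evaluated on a GPU for an entire batch of binary parameters in a single call. This is a minimal illustration, not the authors' implementation: the layer sizes, the parameter names, and the assumption that the network maps (mass ratio, aligned spins) to a fixed-length vector of surrogate coefficients are all hypothetical choices made only for this example.

# Illustrative sketch (assumptions, not the ANN-Sur code): batched GPU evaluation of a
# hypothetical MLP surrogate with JAX.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Randomly initialise weights for a small fully connected network."""
    params = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (n_in, n_out)) * jnp.sqrt(2.0 / n_in)
        params.append((w, jnp.zeros(n_out)))
    return params

def mlp_forward(params, x):
    """Map intrinsic parameters (q, chi1z, chi2z) to a vector of surrogate coefficients."""
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return x @ w + b  # e.g. coefficients of a reduced waveform basis (assumed)

# Batched, jit-compiled evaluation: one call produces coefficients for many waveforms,
# so the GPU amortises the per-call overhead across the whole batch.
batched_forward = jax.jit(jax.vmap(mlp_forward, in_axes=(None, 0)))

key = jax.random.PRNGKey(0)
params = init_mlp(key, sizes=[3, 128, 128, 256])   # 3 inputs -> 256 coefficients (assumed sizes)
theta = jax.random.uniform(key, (1000, 3))          # batch of 10^3 parameter sets
coeffs = batched_forward(params, theta)             # shape (1000, 256)
print(coeffs.shape)

In a setup like this, the per-waveform cost shrinks with batch size because the fixed overhead of launching the network evaluation is shared across all waveforms in the batch, which is the effect behind the quoted 0.0016 ms per waveform.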