Time-delay estimation (TDE) is a common operation in ultrasound signal processing. In applications such as blood flow estimation, elastography, phase aberration correction, and many others, the quality of the final results depends heavily on the performance of the time-delay estimator implemented.

In recent years, several algorithms have been developed and applied in medical ultrasound, sonar, radar, and other fields. In this paper we analyze the performance of the widely used normalized and non-normalized correlations, along with normalized covariance, sum absolute differences (SAD), sum squared differences (SSD), hybrid-sign correlation, polarity-coincidence correlation, and the Meyr-Spies method. These techniques have been applied to simulated ultrasound radio frequency (RF) data under a variety of conditions. We show how parameters such as center frequency, fractional bandwidth, kernel window size, signal decorrelation, and signal-to-noise ratio (SNR) affect the quality of the delay estimate. Simulation results are also compared with a theoretical performance limit set by the Cramér-Rao lower bound (CRLB).

Results show that, for high SNR, high signal correlation, and large kernel size, all of the algorithms closely match the theoretical bound, with relative performances that vary by as much as 20%. As conditions degrade, the performances of the various algorithms differ more significantly. For signals with a correlation level of 0.98, an SNR of 30 dB, a center frequency of 5 MHz with a fractional bandwidth of 0.5, and a kernel size of 2 µs, the standard deviation of the jitter error is on the order of a few nanoseconds. Normalized correlation, normalized covariance, and SSD have an approximately equal jitter error of 2.23 ns (the value predicted by the CRLB is 2.073 ns), whereas polarity-coincidence correlation performs less well, with a jitter error of 2.74 ns.
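For reference, a closed-form expression for this jitter bound that is widely used in the ultrasound TDE literature (commonly attributed to Walker and Trahey) is

\[
\sigma(\hat{\tau}) \;\geq\; \sqrt{\frac{3}{2 f_0^{3} \pi^{2} T \left(B^{3} + 12B\right)}
\left[\frac{1}{\rho^{2}}\left(1 + \frac{1}{\mathrm{SNR}^{2}}\right)^{2} - 1\right]},
\]

where f_0 is the center frequency, B the fractional bandwidth, T the kernel window length, rho the correlation coefficient between the two signals, and SNR the linear amplitude signal-to-noise ratio. Whether this is the exact form used in the paper is an assumption, but substituting the values quoted above (f_0 = 5 MHz, B = 0.5, T = 2 µs, rho = 0.98, SNR = 30 dB) gives approximately 2.07 ns, consistent with the 2.073 ns figure cited.

As a purely illustrative sketch of the simplest estimator named above, normalized correlation, the following Python fragment (hypothetical names, not the implementation used in the paper) returns the lag that maximizes the normalized cross-correlation of a kernel against a reference RF line:

    import numpy as np

    def ncc_delay(reference, kernel, fs):
        """Delay (in seconds) of `kernel` within `reference`, taken as the
        lag that maximizes the normalized cross-correlation."""
        n = len(kernel)
        best_lag, best_rho = 0, -np.inf
        for lag in range(len(reference) - n + 1):
            seg = reference[lag:lag + n]
            denom = np.linalg.norm(seg) * np.linalg.norm(kernel)
            rho = np.dot(seg, kernel) / denom if denom > 0.0 else 0.0
            if rho > best_rho:
                best_lag, best_rho = lag, rho
        return best_lag / fs  # integer-sample estimate; subsample refinement omitted

Reaching nanosecond-level jitter at typical RF sampling rates requires refining the integer-lag peak with some form of subsample interpolation; that step is omitted here for brevity.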