We estimate the root-mean-square (RMS) value of timing jitter noise in simulated signals similar to measured high-speed sampled signals. The simulated signals are contaminated by additive noise, timing jitter noise, and time shift errors. When time shift errors are present, we first align the signals based on estimates of the relative shifts obtained from cross-correlation analysis. At each time sample, we compute the mean and sample variance of the aligned signals over the repeated measurements. We estimate the derivative of the noise-free signal based, in part, on a regression spline fit to the average of the aligned signals. Our initial estimate of the RMS value of the jitter noise depends on the estimated derivatives and sample variances at time samples where the magnitude of the estimated derivative exceeds a selected threshold. This initial estimate is generally biased. Using a parametric bootstrap approach, we adaptively adjust the initial estimate based on an estimate of this bias. We apply our method to real data collected at NIST and study how the results depend on the derivative threshold.
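The sketch below illustrates one way the initial (bias-uncorrected) estimate described above could be computed, assuming the common first-order model in which the sample variance at time t is approximately the additive-noise variance plus the squared jitter RMS times the squared signal derivative. It is not the authors' implementation; the function names, the integer-shift cross-correlation alignment, the spline smoothing parameter, and the use of low-slope samples to approximate the additive-noise variance are all illustrative assumptions.

```python
# Minimal sketch of the initial jitter-RMS estimate, assuming
# Var[y(t_k)] ~ sigma_add^2 + sigma_j^2 * s'(t_k)^2  (first-order model).
# All names and choices here are illustrative, not the paper's code.

import numpy as np
from scipy.interpolate import UnivariateSpline


def align_by_xcorr(signals):
    """Align each repeated record to the first one, estimating the
    relative (integer) shift from the peak of the cross-correlation."""
    ref = signals[0]
    n = ref.size
    aligned = [ref]
    for sig in signals[1:]:
        xc = np.correlate(sig, ref, mode="full")
        shift = np.argmax(xc) - (n - 1)
        aligned.append(np.roll(sig, -shift))
    return np.asarray(aligned)


def initial_jitter_rms(signals, t, deriv_threshold, smooth=None):
    """Initial (biased) RMS jitter estimate from repeated, aligned records."""
    y = align_by_xcorr(signals)
    mean_y = y.mean(axis=0)
    var_y = y.var(axis=0, ddof=1)       # sample variance at each time sample

    # Regression-spline fit to the averaged signal; its derivative serves as
    # an estimate of the derivative of the noise-free signal.
    spline = UnivariateSpline(t, mean_y, s=smooth)
    deriv = spline.derivative()(t)

    # Keep samples where the derivative magnitude exceeds the threshold.
    mask = np.abs(deriv) > deriv_threshold

    # Approximate the additive-noise variance from the low-slope samples
    # (an assumption of this sketch, not a step stated in the abstract).
    sigma_add2 = np.median(var_y[~mask])

    # Least-squares slope of (variance - additive variance) on squared
    # derivative gives the jitter variance; clip at zero before the sqrt.
    sigma_j2 = (np.sum((var_y[mask] - sigma_add2) * deriv[mask] ** 2)
                / np.sum(deriv[mask] ** 4))
    return np.sqrt(max(sigma_j2, 0.0))
```

In this sketch, raising `deriv_threshold` restricts the estimate to high-slope samples, where the jitter contribution to the variance dominates; the parametric-bootstrap bias adjustment described in the abstract would be applied on top of this initial estimate.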