This paper presents a joint time-delay and channel estimator to assess the achievable positioning performance of the Long Term Evolution (LTE) system in multipath channels. LTE is a promising technology for localization in urban and indoor scenarios, but its performance is degraded by multipath propagation. In these challenging environments, LTE pilot signals are of special interest because they can be used to estimate the multipath channel and counteract its effect. For this purpose, a channel estimation model based on equi-spaced taps is combined with time-delay estimation, leading to a low-complexity estimator. This model is enhanced with a novel channel parameterization able to characterize close-in multipath by introducing an arbitrary tap with variable position between the first two equi-spaced taps. This hybrid approach is adopted in the joint maximum likelihood (JML) time-delay estimator to improve ranging performance in the presence of short-delay multipath. The JML estimator is then compared with the conventional correlation-based estimator under typical LTE conditions, characterized by the extended typical urban (ETU) multipath channel model, additive white Gaussian noise (AWGN), and LTE signal bandwidths of 1.4, 5 and 10 MHz. The resulting time-delay estimation performance is assessed by computing the cumulative distribution function (CDF) of the errors in the absence of noise, and the root-mean-square error (RMSE) and bias for signal-to-noise ratio (SNR) values between −20 and 30 dB.
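As a point of reference for the baseline discussed above, the conventional correlation-based estimator picks the time delay as the lag that maximizes the cross-correlation between the received signal and the known pilot sequence. The following is a minimal sketch of that idea on synthetic data; the function name, signal lengths, and noise level are illustrative assumptions, not part of the paper.

```python
import numpy as np

def correlation_delay_estimate(received, pilot):
    """Return the integer sample delay maximizing the cross-correlation.

    Illustrative sketch of a correlation-based time-delay estimator:
    the peak of the full cross-correlation gives the alignment lag.
    """
    corr = np.correlate(received, pilot, mode="full")
    # In 'full' mode, zero lag sits at index len(pilot) - 1.
    return int(np.argmax(np.abs(corr)) - (len(pilot) - 1))

# Synthetic test case (assumed parameters, for illustration only):
rng = np.random.default_rng(0)
pilot = rng.standard_normal(64)                 # known pilot sequence
true_delay = 10
received = np.zeros(128)
received[true_delay:true_delay + 64] = pilot    # delayed copy of the pilot
received += 0.05 * rng.standard_normal(128)     # mild AWGN

est = correlation_delay_estimate(received, pilot)
print(est)
```

Under strong multipath, side peaks of the correlation can bias this estimate, which is the limitation the JML estimator in the paper is designed to mitigate.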