Time-of-Flight (TOF) based Light Detection and Ranging (LiDAR) is a widespread technique for distance measurement in both single-spot depth ranging and 3D mapping. Single Photon Avalanche Diode (SPAD) detectors provide single-photon sensitivity and allow in-pixel integration of a Time-to-Digital Converter (TDC) to measure the TOF of single photons. By repeatedly acquiring the photons returned from multiple laser shots, a TOF histogram can be accumulated, so that the laser pulse return can be distinguished from unwanted ambient light and the desired distance information computed. In order to properly predict the TOF histogram distribution and to design each component of the LiDAR system, from the SPAD to the TDC and the histogram processing, we present a detailed statistical model of the acquisition chain and we show its perfect agreement with Monte Carlo simulations across widely different operating conditions, including very high background levels. We take into account SPAD non-idealities such as hold-off time, afterpulsing, and crosstalk, and we show the heavy pile-up distortion that arises at high background levels. Moreover, we also model the non-idealities of the timing electronics chain, namely TDC dead time, the limited number of storage cells for TOF data, and TDC sharing. Finally, we show how to exploit the model to recover the original LiDAR return signal from the distorted measured TOF data under different operating conditions.
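As a minimal illustration of the histogram accumulation and pile-up mechanism summarized above, the following Monte Carlo sketch simulates repeated laser shots in which a single SPAD/TDC channel records only the first arriving photon per shot. All numeric parameters (window length, pulse width, photon rates, shot count) are illustrative assumptions, and the first-photon model deliberately omits the hold-off, afterpulsing, crosstalk, and TDC effects treated in the full statistical model.

```python
import numpy as np

# Illustrative Monte Carlo of TOF histogram accumulation with first-photon
# pile-up. All parameter values below are assumptions, not values from the paper.
rng = np.random.default_rng(0)

T = 100e-9            # measurement window per laser shot (s)
n_bins = 1000         # TDC bins over the window
n_shots = 100_000     # repeated laser shots
tof = 40e-9           # true target time of flight (s)
pulse_sigma = 0.5e-9  # laser pulse width, std dev (s)
p_signal = 0.05       # mean detected signal photons per shot
bg_rate = 50e6        # background photon rate (counts/s): high-background case

hist = np.zeros(n_bins)
for _ in range(n_shots):
    # Signal photons: Poisson count with Gaussian-distributed arrival times
    n_sig = rng.poisson(p_signal)
    t_sig = rng.normal(tof, pulse_sigma, n_sig)
    # Background photons: homogeneous Poisson process over the window
    n_bg = rng.poisson(bg_rate * T)
    t_bg = rng.uniform(0.0, T, n_bg)
    arrivals = np.concatenate((t_sig, t_bg))
    arrivals = arrivals[(arrivals >= 0.0) & (arrivals < T)]
    if arrivals.size:
        # Pile-up: only the FIRST photon of each shot is time-stamped,
        # which skews the histogram toward early bins at high background.
        hist[int(arrivals.min() / T * n_bins)] += 1

# Classical Coates-style inversion (an assumed illustration of reverse
# extraction, not necessarily the paper's method): estimate the per-bin
# detection probability from counts and the shots still "alive" at each bin.
shots_alive = n_shots - np.concatenate(([0.0], np.cumsum(hist)[:-1]))
p_est = np.divide(hist, shots_alive, out=np.zeros_like(hist),
                  where=shots_alive > 0)
```

With these assumed settings, `hist` exhibits the characteristic exponentially decaying pile-up envelope with the laser return peak superimposed near bin `int(tof / T * n_bins)`, while `p_est` restores an approximately flat background with the peak at the correct position, illustrating how a distorted TOF histogram can be inverted back toward the original return signal.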