Random Telegraph Noise is a time-dependent variability phenomenon that has gained increasing attention in recent years, especially in deeply scaled technologies. In particular, a wide variety of works present different techniques for analyzing current traces of scaled FET devices displaying Random Telegraph Noise, while others focus on modeling the phenomenon using the parameters extracted through such techniques. However, very little attention has been paid to the effects that the biasing conditions of the transistors prior to the measurements may have on the extraction of the parameters that characterize this phenomenon. This paper investigates how these biasing conditions impact the extracted results. Specifically, it is demonstrated that measuring Random Telegraph Noise immediately after the device is biased may lead to an overestimation of its impact with respect to situations in which the device has been biased for some time beforehand. This fact is first presented from a theoretical point of view and then demonstrated experimentally through measurements obtained from a CMOS transistor array.