Random Telegraph Noise (RTN) adversely impacts circuit performance, and this impact increases as devices shrink and operating voltages are lowered. To optimize circuit design, many efforts have been made to model RTN. RTN is highly stochastic, with significant device-to-device variations. Early works often characterized individual traps first and then grouped them together to extract their statistical distributions. This bottom-up approach is limited by the number of traps that can be measured, especially for the capture and emission time constants, calling the reliability of the extracted distributions into question. Several compact models have been proposed, but their ability to predict long-term RTN has not been verified. Many early works measured RTN for only tens of seconds, even though a longer time window increases RTN by capturing slower traps. The aim of this work is to propose an integral methodology for modelling RTN and, for the first time, to verify its capability of predicting long-term RTN. Instead of characterizing the properties of individual traps/devices, the RTN of multiple devices is integrated into one dataset from which statistical properties are extracted. This allows the concept of effective charged traps (ECT) to be used and transforms the need for time-constant distributions into the need to obtain the kinetics of the ECT, making long-term RTN prediction similar to ageing prediction. The proposed methodology opens the way for assessing the RTN impact within a 10-year window by efficiently evaluating the probability of a device parameter reaching a given level.
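As an illustration of how such a probability evaluation might be carried out, the sketch below is a minimal Monte Carlo example, not the paper's method: it assumes that the number of effective charged traps per device follows a Poisson distribution whose mean grows with log(time) (a hypothetical ECT kinetics), and that each trap's contribution to the threshold-voltage shift is exponentially distributed. The functional form of the kinetics, the parameter names, and all numerical values are placeholder assumptions for illustration only.

```python
# Hypothetical sketch: estimating the probability that the RTN-induced
# threshold-voltage shift exceeds a given level after a given time.
# The assumed ECT kinetics (Poisson mean ~ a + b*log10(t)) and the
# exponential per-trap amplitude are illustrative, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

def p_delta_vth_exceeds(level_mV, t_seconds, n_devices=100_000,
                        a=0.5, b=0.8, eta_mV=2.0):
    """Monte Carlo estimate of P(dVth > level_mV) at time t_seconds.

    a, b   : placeholder coefficients of the assumed ECT kinetics
             N_ect(t) ~ Poisson(a + b*log10(t)).
    eta_mV : mean of the assumed exponential per-trap dVth amplitude.
    """
    mean_ect = a + b * np.log10(t_seconds)           # assumed ECT kinetics
    n_traps = rng.poisson(mean_ect, size=n_devices)  # ECT count per device
    # Sum exponentially distributed per-trap amplitudes for each device.
    dvth = np.array([rng.exponential(eta_mV, k).sum() for k in n_traps])
    return (dvth > level_mV).mean()

# Example: probability of exceeding a 10 mV shift after 10 years (~3.15e8 s).
print(p_delta_vth_exceeds(level_mV=10.0, t_seconds=3.15e8))
```

In a scheme of this kind, the dataset-level ECT kinetics stand in for per-trap time constants, which is what would make extrapolation to a 10-year window tractable.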