Based on S-parameter measurements, the effect of dynamic trapping and de-trapping of charge in the gate oxide, the increase of dielectric loss due to polarization, and the impact of leakage current on the small-signal input impedance at RF are analyzed and modeled. This is achieved by systematically extracting the corresponding model parameters from single-device measurements over different frequency ranges; the methodology is then applied to track the evolution of these parameters when the device is subjected to non-conducting electrical stress. This approach not only allows inspection of effects that do not occur under DC conditions, such as the current due to the time-varying dielectric polarization, but also clearly distinguishes effects according to the functional form of their contribution to the device's impedance. In fact, it is shown that minor changes to the gate-capacitance model, namely the inclusion of additional resistive and capacitive components, allow for an excellent model-experiment correlation up to 30 GHz. Moreover, this accuracy is shown to be maintained when the proposed model is applied to the device under different gate-to-source bias conditions and at several stages of off-state degradation.
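The way different mechanisms are distinguished by the functional form of their impedance contribution can be illustrated with a minimal numerical sketch. Assuming a hypothetical gate network consisting of an oxide capacitance in parallel with a leakage conductance and a series resistive-capacitive branch (a simple stand-in for trap- and polarization-related dispersion), all behind a series gate resistance, the input impedance and its effective parallel capacitance and conductance can be evaluated versus frequency. All element names and values below are illustrative placeholders, not parameters extracted from the measurements described in this work.

```python
# Illustrative sketch (not the extracted model of this work): small-signal gate
# input impedance of a hypothetical augmented gate-capacitance network.
import numpy as np

# Hypothetical element values (placeholders only)
R_g    = 5.0        # series gate resistance [ohm]
C_ox   = 200e-15    # gate-oxide capacitance [F]
G_leak = 1e-6       # gate-leakage conductance [S]
R_t    = 2e3        # resistance of the dispersive (trap/loss) branch [ohm]
C_t    = 50e-15     # capacitance of the dispersive branch [F]

f = np.logspace(8, np.log10(30e9), 400)   # 100 MHz to 30 GHz
w = 2 * np.pi * f

# Intrinsic gate admittance: C_ox, leakage, and the series R_t-C_t branch in parallel
Y_branch = 1.0 / (R_t + 1.0 / (1j * w * C_t))
Y_int = 1j * w * C_ox + G_leak + Y_branch

# Input impedance at the gate terminal and the corresponding S11 (50-ohm reference)
Z_in = R_g + 1.0 / Y_int
Z0 = 50.0
S11 = (Z_in - Z0) / (Z_in + Z0)

# Effective parallel capacitance and conductance: their distinct frequency
# dependence is what separates leakage, trapping and polarization contributions.
Y_in = 1.0 / Z_in
C_eff = np.imag(Y_in) / w
G_eff = np.real(Y_in)

for fi, c, g in zip(f[::100], C_eff[::100], G_eff[::100]):
    print(f"f = {fi/1e9:6.2f} GHz   C_eff = {c*1e15:7.2f} fF   G_eff = {g*1e3:7.3f} mS")
```

In this sketch the leakage conductance dominates the real part of the admittance at low frequency, while the series R-C branch adds a frequency-dependent loss and an apparent capacitance dispersion at higher frequencies, mirroring the qualitative separation of effects described above.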