In this paper we give a brief historical review of the evolution of device reliability research over the past decades. We then give some examples of how established characterization techniques that were developed for silicon-based devices can be completely misinterpreted when applied to Ge- or III-V-based MOS structures, and how a simple modification of the technique can ensure a correct interpretation. We also show how novel techniques, such as TSCIS (Trap Spectroscopy by Charge Injection and Sensing), were recently developed to overcome the problem of dielectric material screening for logic and memory applications. With the scaling of devices into the nanometer regime, single traps cause large variations in device parameters, leading to a time-dependent variability that makes lifetime analysis difficult. Finally, we show that when the classical reliability assessment methodology based on accelerated testing is used, the available reliability margins are strongly reduced, in some cases even down to zero, especially for sub-1nm EOT (Equivalent Oxide Thickness) devices. As a result, we argue that the reliability community will have to look for alternative ways to guarantee the lifetime of future products.