The demand for low power consumption in highly integrated systems-on-chip (SoCs), such as IoT-based smoke detection systems, has made frequent power-ups and power-downs common practice. Most circuit studies ignore the device performance degradation caused by such frequent power cycling, but in high-precision bandgap voltage references the degradation of particular devices can shift the reference voltage. This work investigates the effects of frequent power-ups and power-downs on a simple bandgap reference circuit and shows that they are non-negligible. Aging simulations of the circuit are performed to identify the causes of device degradation and of the output voltage shift. The results show that frequent power cycling induces negative bias temperature instability (NBTI), a phenomenon that degrades device performance, most notably the threshold voltage, under negative gate bias at elevated temperature. The gate voltage waveforms of the affected devices were then extracted for aging tests on a 14 nm FinFET process, and the measured results, combined with an NBTI aging model, were used to infer the threshold voltage shift after 10 years of operation at 125 °C. A circuit modification is proposed that mitigates both the device degradation and the reference voltage shift. This work indicates that NBTI stress introduced by frequent power-ups and power-downs must be considered in circuit design.
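The 10-year extrapolation described above is typically done with a power-law NBTI model. As a minimal sketch of the idea (the paper's actual model and fitted constants are not given here; the prefactor, activation energy, and time exponent below are illustrative assumptions):

```python
import math

# Illustrative NBTI power-law extrapolation:
#   dVth(t) = A * exp(-Ea / (k*T)) * t^n
# All constants are assumed example values, NOT fitted parameters
# from the paper's 14 nm FinFET aging tests:
#   a  - technology-dependent prefactor (V)
#   ea - activation energy (eV)
#   n  - time exponent (commonly ~0.1-0.25 for NBTI)
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K


def delta_vth(t_seconds, temp_kelvin, a=0.005, ea=0.08, n=0.2):
    """Estimated threshold-voltage shift (V) after NBTI stress of duration t."""
    return a * math.exp(-ea / (K_BOLTZMANN_EV * temp_kelvin)) * t_seconds ** n


# Extrapolate to 10 years at 125 C (398.15 K), the lifetime scenario
# considered in this work.
ten_years = 10 * 365 * 24 * 3600
shift = delta_vth(ten_years, 398.15)
print(f"Estimated dVth after 10 years at 125 C: {shift * 1000:.1f} mV")
```

In practice the constants are fitted to accelerated-stress measurements (here, the extracted gate-voltage waveforms replayed on FinFET test structures), and the fitted model is then evaluated at the use-condition temperature and target lifetime.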