During the last few years, the accuracy of static electricity meters (SEM) has been questioned. Significant metering deviations with respect to a reference meter have been observed at customer premises, and laboratory test results support these findings. The root cause of such errors remains unknown, as there are multiple elements that could affect the accuracy of electricity meters. Furthermore, standard-compliant meters exposed to distorted signals may produce a negligible, positive or negative relative error depending on the instrument design. Distorted current signals with fast amplitude transitions have produced the largest errors in SEMs reported in the literature. In this paper, the accuracy of an energy metering Integrated Circuit (IC) is evaluated beyond the limits of the standards' requirements, employing a selection of distorted signals from the standards, real-world captured signals and a set of waveforms designed to test the IC under fast-changing current conditions, which are representative of the waveforms produced by power electronic devices. The experimental results reveal an accuracy boundary imposed by the Gibbs phenomenon for fast-changing current signals and a strong relationship between the IC's measurement error and two key parameters of the measured waveform: signal slope and phase angle. This paper therefore provides a methodology for the comprehensive analysis of SEMs in future power systems, which will be dominated by power-electronics-controlled electrical demand, and contributes to the search for the root cause of error in SEMs exposed to distorted waveforms.
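To illustrate the accuracy boundary the abstract attributes to the Gibbs phenomenon, the following minimal sketch reconstructs a step-like current edge from a finite number of harmonics, as any band-limited acquisition chain effectively does, and reports the overshoot near the transition. The waveform, fundamental frequency and harmonic counts are illustrative assumptions, not the paper's actual test signals or the metering IC's internal processing.

```python
import numpy as np

# Sketch of the Gibbs phenomenon for a fast-changing (square-wave-like) current.
# The ~9% overshoot near the transition persists no matter how many harmonics
# are retained, illustrating a bound on reconstructing fast current edges.
# All parameters below are assumed for illustration only.

def truncated_square_wave(t, f0, n_harmonics):
    """Fourier-series reconstruction of a unit square wave from odd harmonics."""
    i = np.zeros_like(t)
    for k in range(1, 2 * n_harmonics, 2):  # odd harmonics 1, 3, 5, ...
        i += (4.0 / (np.pi * k)) * np.sin(2 * np.pi * k * f0 * t)
    return i

f0 = 50.0                                        # assumed mains frequency in Hz
t = np.linspace(0.0, 1.0 / f0, 20000, endpoint=False)

for n in (10, 50, 200):
    reconstructed = truncated_square_wave(t, f0, n)
    overshoot = reconstructed.max() - 1.0        # ideal edge amplitude is 1.0
    print(f"{n:4d} harmonics: overshoot = {overshoot * 100:.1f}%")
# The overshoot stays near 9% regardless of the harmonic count (Gibbs bound).
```

Running the sketch prints an overshoot of roughly 9% for every harmonic count, which is the classical Gibbs limit; the paper's experiments relate this kind of boundary to the measurement error of the evaluated IC under fast-changing currents.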