Although IoT delivers several benefits, it also raises privacy and security concerns, ranging from revenue disruption in industrial facilities to life-threatening situations caused by smart-home hacking. Consequently, anomaly detection algorithms have emerged as a means of improving data reliability. However, little has been said about the implications of running these computationally expensive programs on hardware-constrained edge devices. Therefore, in this paper, we evaluate six anomaly detection algorithms running on an edge device with respect to performance, accuracy, temperature, and power consumption. The results show that time complexity, resource demand, and detection approach directly impact the feasibility of running anomaly detection algorithms on edge devices. Based on these results, we recommend which algorithms best satisfy the requirements of several IoT environments.