Infrared thermography is considered a useful technique for diagnosing several skin pathologies, but its high cost has limited widespread adoption. Here, we investigate the feasibility of using low-cost, microbolometer-based infrared cameras for detecting skin cancer. For this purpose, we collected infrared data from volunteer subjects using a high-cost, high-quality infrared camera. We propose a degradation model to assess the suitability of lower-cost imagers for this task. The degradation model was validated by mimicking video acquisition with the low-cost cameras, using data originally captured with a medium-cost camera. The output of the proposed model was then compared with the infrared video obtained with the actual cameras, achieving an average Pearson correlation coefficient above 0.9271. The model therefore successfully reproduces the behavior of cameras with poorer characteristics from videos acquired with higher-quality cameras. Using the proposed model, we simulated the acquisition of patient data with three lower-cost cameras: the Xenics Gobi-640, Opgal Therm-App, and Seek Thermal CompactPRO. The degraded data were then used to evaluate the performance of a skin cancer detection algorithm. The Xenics and Opgal cameras achieved accuracies of 84.33% and 84.20% and sensitivities of 83.03% and 83.23%, respectively. These values closely match those obtained with the non-degraded data, indicating that these lower-cost cameras are suitable for skin cancer detection. The Seek camera achieved an accuracy of 82.13% and a sensitivity of 79.77%; based on these results, we conclude that this camera is appropriate for less critical applications.
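
The abstract does not detail how the degradation model works. As a purely illustrative sketch, one plausible pipeline, assuming the model combines spatial resampling to the cheaper sensor's resolution, injection of Gaussian noise to mimic a poorer NETD, and coarser quantization, could look like the following (all function names, parameters, and values here are hypothetical, not taken from the paper):

```python
import numpy as np

def degrade_frame(frame_mK, out_shape, extra_noise_mK, levels):
    """Degrade a radiometric frame (in millikelvin) to mimic a cheaper imager.

    Assumed illustrative steps: resample to the lower sensor resolution,
    add Gaussian thermal noise to emulate a worse NETD, then quantize
    to a coarser digitizer.
    """
    rng = np.random.default_rng(0)

    # 1. Nearest-neighbour resampling to the lower-cost sensor resolution.
    h, w = frame_mK.shape
    rows = np.arange(out_shape[0]) * h // out_shape[0]
    cols = np.arange(out_shape[1]) * w // out_shape[1]
    low = frame_mK[np.ix_(rows, cols)].astype(float)

    # 2. Additive Gaussian noise modelling the poorer NETD.
    low += rng.normal(0.0, extra_noise_mK, size=low.shape)

    # 3. Quantization to the coarser ADC resolution.
    lo, hi = low.min(), low.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((low - lo) / step) * step

# Example: degrade a synthetic 640x480 frame to 320x240 with extra noise
# and 12-bit quantization (hypothetical values).
frame = np.linspace(30000, 34000, 640 * 480).reshape(480, 640)
degraded = degrade_frame(frame, (240, 320), extra_noise_mK=70.0, levels=2**12)
```

A model of this kind could then be validated, as in the paper, by comparing simulated and actually acquired videos frame by frame with the Pearson correlation coefficient.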