Estimating the energy and memory consumption of machine learning (ML) models for intrusion detection enables efficient allocation of system resources. This study investigates how supervised ML algorithms affect the energy and memory consumption of intrusion detection systems. Experiments are conducted with seven ML algorithms and a proposed ensemble model on two intrusion detection datasets. The Pearson correlation coefficient (PCC) and the Spearman correlation coefficient are employed to select the optimal features. Regarding energy consumption, the findings reveal that PCC-based feature selection on the UNSW-NB15 dataset draws the least DRAM and CPU power. Among the ML methods, the support vector machine (SVM) consumes the most energy under both feature selection methods and on both datasets. Concerning memory consumption, the results show that the decision tree records the highest current memory usage with PCC on UNSW-NB15. The proposed ensemble model achieves the best detection performance. These findings offer practical guidance to ML practitioners in choosing the optimal model with the most efficient use of energy and memory.
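To make the feature-selection step concrete, the following is a minimal sketch of ranking features by PCC or Spearman correlation with the label. The function name `top_k_features`, the top-k cutoff, and the use of absolute correlation are illustrative assumptions, not the paper's exact procedure.

```python
import pandas as pd
from scipy.stats import pearsonr, spearmanr

def top_k_features(X: pd.DataFrame, y: pd.Series, k: int = 20,
                   method: str = "pearson") -> list[str]:
    """Return the k feature names most correlated with the label y."""
    corr = pearsonr if method == "pearson" else spearmanr
    # Score each feature by the absolute strength of its linear
    # (Pearson) or monotonic (Spearman) association with the label.
    scores = {col: abs(corr(X[col], y)[0]) for col in X.columns}
    # Keep the k features with the strongest association.
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Running the selector twice, once with `method="pearson"` and once with `method="spearman"`, would reproduce the two feature subsets the study compares.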
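The study does not name its measurement tooling in this abstract; as one plausible setup, the sketch below profiles a model's training energy and memory using pyRAPL (which reads Intel RAPL counters for CPU package and DRAM energy) and Python's built-in tracemalloc (which reports current and peak allocated memory). Both tools and the helper `profile_fit` are assumptions for illustration only.

```python
import tracemalloc
import pyRAPL

pyRAPL.setup()  # requires an Intel CPU with readable RAPL counters

def profile_fit(model, X_train, y_train):
    """Train `model` while recording energy (microjoules) and memory (bytes)."""
    meter = pyRAPL.Measurement("fit")
    tracemalloc.start()
    meter.begin()
    model.fit(X_train, y_train)
    meter.end()
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    # pyRAPL's result holds per-socket package (CPU) and DRAM energy.
    return meter.result.pkg, meter.result.dram, current, peak
```

Applying such a wrapper uniformly to each of the seven algorithms and the ensemble would yield the comparable per-model energy and memory figures the abstract summarizes.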