Deep learning models are proficient at predicting target classes, but they also need to be able to explain their predictions. Explainable Artificial Intelligence (XAI) offers a promising solution by providing both transparency and object detection capabilities to classification models. Mask detection plays a crucial role in ensuring the safety and well-being of individuals by preventing the spread of infectious diseases. A new visual XAI method called HayCAM+ is proposed to address the limitations of the previous method, HayCAM, such as the need to select the number of filters as a hyper-parameter and the reliance on fully-connected layers. When object detection is performed using activation maps generated by various methods, including GradCAM, EigenCAM, GradCAM++, LayerCAM, HayCAM, and HayCAM+, HayCAM+ provides the best results, with an IoU score of 0.3740.
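As a rough illustration of the evaluation described above, the sketch below shows one common way to turn a normalized activation map into a bounding box by thresholding and to score it against a ground-truth box with IoU. The threshold value, function names, and box format are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed procedure, not the paper's exact pipeline):
# derive a bounding box from a class activation map and compute IoU.
import numpy as np

def cam_to_bbox(cam: np.ndarray, thresh: float = 0.5):
    """Normalize the activation map, binarize it at `thresh`, and return
    the tight box (x1, y1, x2, y2) around the activated region."""
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    ys, xs = np.where(cam >= thresh)
    if len(xs) == 0:
        return None  # no activation above the threshold
    return xs.min(), ys.min(), xs.max(), ys.max()

def iou(box_a, box_b) -> float:
    """Intersection over Union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1 + 1) * max(0, y2 - y1 + 1)
    area_a = (box_a[2] - box_a[0] + 1) * (box_a[3] - box_a[1] + 1)
    area_b = (box_b[2] - box_b[0] + 1) * (box_b[3] - box_b[1] + 1)
    return inter / float(area_a + area_b - inter)
```

In this setup, each CAM method (GradCAM, EigenCAM, GradCAM++, LayerCAM, HayCAM, HayCAM+) would supply the `cam` array, and the reported IoU would be averaged over the ground-truth boxes of the evaluation set.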