Emotional information is considered to convey much of the meaning in communication. Hence, artificial emotion categorization methods are being developed to meet the increasing demand to introduce intelligent systems, such as robots, into shared workspaces. Deep learning algorithms have so far demonstrated competency mainly in categorizing images from posed datasets in which the main features of the face are visible. However, sunglasses and facemasks are common in daily life, especially with the outbreak of communicable diseases such as the recent coronavirus. Anecdotally, partial coverings of the face reduce the effectiveness of human communication; would this also hamper computer vision, and if so, would the different emotion categories be affected equally? Here, we use a modern deep learning algorithm (i.e. VGG19) to categorize emotion from faces of people obscured with simulated sunglasses and facemasks. We found that face coverings reduce emotion categorization accuracy by up to 74%, with emotion categories affected differently by different coverings: for example, clear mouth coverings have little effect on categorizing happiness, whereas sadness is strongly affected. While an overall accuracy of up to 97% was achieved with nothing added to the face, accuracy decreases in all cases when the face is obscured. Notably, clear visors have only a small effect across all emotions, with the classifier achieving an accuracy of up to 89.0%, compared to less than 36% for the other types of facemasks.
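The simulated occlusions described above can be sketched as simple image operations. The snippet below is a minimal illustration, not the paper's exact occluder geometry: it assumes a roughly face-centered 224x224 image (the standard VGG19 input size) and overlays a dark band over the eye region for "sunglasses" or a light rectangle over the mouth and nose for "facemask"; the region fractions are illustrative assumptions.

```python
import numpy as np

def simulate_covering(img, kind="facemask"):
    """Overlay a rectangular occluder on a face image.

    img: HxWx3 uint8 array, assumed roughly face-centered.
    kind: 'sunglasses' darkens a band over the eyes;
          'facemask' whitens the lower half of the face.
    Region fractions are illustrative assumptions only.
    """
    out = img.copy()
    h, w, _ = out.shape
    if kind == "sunglasses":
        # Dark band across the eye region
        out[int(0.25 * h):int(0.45 * h), int(0.15 * w):int(0.85 * w)] = 0
    elif kind == "facemask":
        # Light rectangle over the nose and mouth
        out[int(0.55 * h):h, int(0.10 * w):int(0.90 * w)] = 255
    else:
        raise ValueError(f"unknown covering: {kind}")
    return out

# Example: occlude a synthetic uniform-gray "face" at VGG19 input size
face = np.full((224, 224, 3), 128, dtype=np.uint8)
masked = simulate_covering(face, "facemask")
shaded = simulate_covering(face, "sunglasses")
```

The occluded images could then be passed through a classifier such as VGG19 to compare per-category accuracy against the unobscured baseline.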