We present a methodological workflow based on two open science tools that we developed. The first, FBAdLibrian, collects images from the Facebook Ad Library. The second, Pykognition, simplifies facial and emotion detection in images using computer vision. We describe how to combine these tools and apply them in a case study of the 2020 US primary elections. We find that unique images of campaigning candidates make up only a small fraction (<0.1%) of all ads. Furthermore, candidates most often display happiness and calm in their facial expressions, and they rarely attack opponents in image-based ads from their official Facebook pages. When candidates do attack, opponents are portrayed with emotions such as anger, sadness, and fear.
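To make the second step of the workflow concrete, the sketch below illustrates the kind of facial emotion detection that Pykognition simplifies. It is not Pykognition's own API: it calls Amazon Rekognition directly via boto3 (an assumed backend, suggested only by the tool's name), and the function name and image path are hypothetical placeholders.

```python
# Illustrative sketch only; the actual FBAdLibrian/Pykognition interfaces may differ.
import boto3


def detect_emotions(image_path: str) -> list[dict]:
    """Return the most confident emotion label for each face found in an ad image."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests the full set of face attributes, including emotions.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    results = []
    for face in response["FaceDetails"]:
        # Each face carries several emotion labels with confidence scores; keep the top one.
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        results.append({"emotion": top["Type"], "confidence": top["Confidence"]})
    return results


if __name__ == "__main__":
    # "candidate_ad.jpg" is a placeholder for an image collected from the Ad Library.
    for face in detect_emotions("candidate_ad.jpg"):
        print(face)
```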