The timing and duration of flowering are key agronomic traits, often associated with a variety's ability to escape abiotic stresses such as heat and drought. Flowering information is valuable in both plant breeding and agricultural production management. Visual assessment, the standard protocol for phenotyping flowering, is a low-throughput and subjective method. In this study, we evaluated multiple imaging sensors (RGB and multispectral cameras), image resolutions (proximal and remote sensing at 1.6 to 30 m above ground level, AGL), and image processing techniques (standard and unsupervised learning) for monitoring the flowering intensity of four cool-season crops (canola, camelina, chickpea, and pea), with the goal of improving the accuracy and efficiency of quantifying flowering traits. Features (flower area, and percentage of flower area relative to canopy area) extracted from proximal (1.6–2.2 m AGL) RGB and multispectral (near-infrared, green, and blue bands) image data were strongly correlated (r up to 0.89) with visual rating scores, especially in pea and canola. Features extracted from RGB image data acquired by an unmanned aerial vehicle (15–30 m AGL) could also accurately detect and quantify the large flowers of winter canola (r up to 0.84), spring canola (r up to 0.72), and pea (r up to 0.72), but not the smaller flowers of camelina or chickpea. Threshold-based standard image processing and unsupervised machine learning such as k-means clustering gave comparable results for flower detection and feature extraction. In general, for imaging to be applicable to flower detection, we recommend that the image resolution (i.e., ground sampling distance) be at least 2–3 times smaller than the flower size. Overall, this study demonstrates the feasibility of using imaging to monitor flowering intensity in multiple varieties of the evaluated crops.
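
The resolution rule of thumb above can be sketched as a quick feasibility check. The sketch below is illustrative only: the camera parameters, helper names, and the 2.5× safety factor are hypothetical assumptions, not values from the study.

```python
def ground_sampling_distance(altitude_m, focal_length_mm,
                             sensor_width_mm, image_width_px):
    """Ground sampling distance (m/pixel) for a nadir-pointing camera.

    Standard pinhole-camera approximation:
    GSD = (altitude * sensor width) / (focal length * image width).
    """
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)


def resolution_sufficient(gsd_m, flower_size_m, factor=2.5):
    """Apply the rule of thumb: GSD should be 2-3x smaller than the flower.

    `factor` is a hypothetical midpoint of the recommended 2-3x range.
    """
    return flower_size_m / gsd_m >= factor


# Hypothetical UAV camera at 20 m AGL (8.8 mm focal length,
# 13.2 mm sensor width, 5472-pixel image width).
gsd = ground_sampling_distance(20, 8.8, 13.2, 5472)
print(f"GSD: {gsd * 1000:.1f} mm/pixel")
print("Large flower (~15 mm) resolvable:", resolution_sufficient(gsd, 0.015))
print("Small flower (~8 mm) resolvable:", resolution_sufficient(gsd, 0.008))
```

With these assumed parameters the GSD is roughly 5.5 mm/pixel, so a ~15 mm flower satisfies the rule while a ~8 mm flower does not, consistent with the finding that UAV imagery resolved large canola and pea flowers but not the smaller camelina or chickpea flowers.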