Introduction: The learning opportunities available to global health professionals have expanded rapidly in recent years. The diverse array of learners and the wide range in course quality underscore the need for an improved course vetting process to better match learners with appropriate learning opportunities. Methods: We developed a structured tool to assess overall course quality by determining performance across four defined domains: Relevance, Engagement, Access, and Pedagogy (REAP). We applied this tool to a learning catalogue developed for participants enrolled in the Sustaining Technical and Analytic Resources (STAR) project, a global health leadership training program. Results: The STAR learning activities database included a total of 382 courses, workshops, and web-based resources that fulfilled 531 competencies across three levels: core, content, and skill. Relevance: The majority of activities were at an understanding or practicing level across all competency domains (486/531, 91.5%). Engagement: Many activities lacked any peer engagement (202/531, 38.0%) and had limited or no faculty engagement (260/531, 49.0%). Access: Across competencies, a plurality of courses were offered on demand (227/531, 42.7%) and were highly flexible in pace (240/531, 45.2%). Pedagogy: Among activities that included an assessment, most assessments matched the activity's learning objectives (217/531, 40.9%). Discussion: In applying REAP to the STAR project learning catalogue, we found that many online activities lacked meaningful engagement with faculty and peers. The field of global health needs further development of structured online activities that offer learners flexible access, content at a range of levels of advancement, and opportunities to engage with and apply what they learn.