Mass shootings, like other extreme events, have long garnered public curiosity and, in turn, significant media coverage. The media framing, or topic focus, of mass shooting events typically evolves over time from details of the actual shooting to discussions of potential policy changes (e.g., gun control, mental health). Such coverage has historically been provided through traditional media sources such as print, television, and radio, but the advent of online social networks (OSNs) has introduced a new platform for accessing, producing, and distributing information about such extreme events. The ease and convenience of using OSNs for information, coupled with society's growing reliance on digital technologies, introduces potentially unforeseen risks. Social bots, or automated software agents, are one such risk, as they can amplify or distort the narratives associated with extreme events such as mass shootings. In this paper, we seek to determine the prevalence and relative importance of social bots participating in OSN conversations following mass shooting events using an ensemble of quantitative techniques. Specifically, we examine a corpus of more than 46 million tweets produced by 11.7 million unique Twitter accounts within OSN conversations discussing four major mass shooting events: the 2017 Las Vegas concert shooting, the 2017 Sutherland Springs church shooting, the 2018 Parkland school shooting, and the 2018 Santa Fe school shooting. This study's results show that social bots participate in and contribute to online mass shooting conversations in a manner that is distinguishable from human contributions. Furthermore, while social bots accounted for fewer than 1% of the unique contributing accounts in the corpus, social network analysis centrality measures identified many bots with significant prominence in the conversation networks, densely occupying many of the highest eigenvector and out-degree centrality rankings, including 82% of the top 100 eigenvector centrality values in the Las Vegas retweet network.
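As a minimal illustration of the kind of centrality analysis referenced above (a sketch only, not the study's actual pipeline), the snippet below builds a small directed retweet network and ranks accounts by eigenvector and out-degree centrality using the networkx library. The account names, the toy edge list, and the edge direction convention (retweeting account pointing to the original author) are all assumptions made for this example.

```python
# Illustrative sketch (not the paper's pipeline): ranking accounts in a
# retweet network by eigenvector and out-degree centrality with networkx.
# Account names and the edge direction (retweeter -> original author) are
# assumptions for this toy example.
import networkx as nx

# Hypothetical retweet edges: (retweeting_account, original_author)
retweets = [
    ("bot_1", "news_outlet"),
    ("bot_2", "news_outlet"),
    ("human_a", "news_outlet"),
    ("news_outlet", "bot_1"),
    ("human_b", "bot_1"),
    ("bot_1", "human_a"),
]

# Build a directed, weighted retweet graph; repeated retweets increase weight.
G = nx.DiGraph()
for src, dst in retweets:
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += 1
    else:
        G.add_edge(src, dst, weight=1)

# Eigenvector centrality rewards accounts retweeted by other central accounts;
# out-degree centrality reflects how many distinct accounts an account retweets.
eig = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)
outdeg = nx.out_degree_centrality(G)

# Rank accounts, analogous to the paper's top-100 rankings but on this toy graph.
top_eig = sorted(eig.items(), key=lambda kv: kv[1], reverse=True)
top_out = sorted(outdeg.items(), key=lambda kv: kv[1], reverse=True)
print("Top by eigenvector centrality:", top_eig[:3])
print("Top by out-degree centrality:", top_out[:3])
```

In an analysis of this style, bot-labeled accounts that appear near the top of such rankings would be flagged as prominent participants in the conversation network, which is the kind of comparison the abstract summarizes.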