Historically, the implementation of research-based assessments (RBAs) has been a driver of educational change within physics and has helped motivate the adoption of interactive-engagement pedagogies. Until recently, RBAs were given to students exclusively on paper and in class; however, this approach has important drawbacks, including decentralized data collection and the need to sacrifice class time. Recently, some RBAs have been moved to online platforms to address these limitations. Yet online RBAs present new concerns, such as student participation rates, test security, and students' use of outside resources. Here, we report on a study addressing these concerns in both lower-division and upper-division undergraduate physics courses. We gave RBAs to courses at five institutions; the RBAs were hosted online and featured embedded JavaScript code that collected information on students' behaviors (e.g., copying text, printing). With these data, we examine the prevalence of these behaviors and their correlation with students' scores to determine whether online and paper-based RBAs are comparable. We find that loss of browser focus is the most common online behavior, while copying and printing events are rarer. Correlations between these behaviors and student performance vary significantly between introductory and upper-division student populations, particularly with respect to students copying text in order to use internet resources. However, while the majority of students engaged in one or more of the targeted online behaviors, none of these behaviors produced, for our sample, a significant change in the population's average performance that would threaten our ability to interpret that performance or to compare it to paper-based implementations of the RBA.
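The kind of embedded behavior logging described above can be illustrated with a minimal sketch. This is not the study's actual instrument; the event names, log structure, and handler choices below are assumptions for illustration only. In a browser, handlers for `blur`, `copy`, and `beforeprint` would feed a shared log; here the events are simulated directly so the sketch runs anywhere.

```javascript
// Hypothetical sketch of an in-page behavior logger (not the study's code).
const behaviorLog = [];

function logBehavior(type) {
  // Record the behavior type with a timestamp for later analysis.
  behaviorLog.push({ type, time: Date.now() });
}

// In a browser, the handlers might be attached like:
//   window.addEventListener('blur', () => logBehavior('focus-loss'));
//   document.addEventListener('copy', () => logBehavior('copy'));
//   window.addEventListener('beforeprint', () => logBehavior('print'));
// Here we simulate the events so the sketch is self-contained:
logBehavior('focus-loss');
logBehavior('focus-loss');
logBehavior('copy');

// Tally events by type, mirroring the prevalence counts analyzed in the study.
const counts = behaviorLog.reduce((acc, e) => {
  acc[e.type] = (acc[e.type] || 0) + 1;
  return acc;
}, {});
console.log(counts); // { 'focus-loss': 2, copy: 1 }
```

A per-student log of this shape is sufficient to compute both the prevalence of each behavior and its correlation with that student's score.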