Cyber security is a high-ranking national priority that is only likely to grow as we become more dependent on cyber systems. From a research perspective, currently available work often focuses solely on the technological aspects of cyber operations, acknowledging the human in passing, if at all. In recent years, the Human Factors community has begun to address human-centered issues in cyber operations, but in comparison to technological communities, we have only begun to scratch the surface. Even as publications on cyber human factors gain momentum, a major gap remains between understanding of the domain and the research currently available to address relevant issues. The purpose of this panel is to continue to expand the role of human factors in cyber research by introducing the community to current work being done and by facilitating collaborations to drive future research. We have assembled a panel of scientists across multiple specializations in the human factors community for an open discussion of how to leverage previous human factors research and current work in cyber operations to continue to push the boundaries of the field.
Operators of automated systems can develop complacency, impairing their ability to respond in a timely and appropriate fashion when automation fails. This study examined the impact of an instructional manipulation of attentiveness, in terms of engagement and accountability, on fault diagnosis and fault management. Participants trained on the operation of a simulated process control task, with instructions varied to induce higher or lower attentiveness to the task. After several routine faults within the system, a fault occurred together with a failure of a previously available diagnostic and management aid, and a second failure occurred shortly thereafter. The first failure was associated with significant impairment of diagnosis and management but comparatively few differences between attentiveness groups. Results are discussed in terms of their implications for a model of human-automation interaction.
Computer network defense analysts perform a difficult, though critical, task in cyber defense. Anecdotally, these operators complain of frequent task interruptions while performing their duties. The goal of the current study was to investigate the effect of a commonly reported interruption, answering email, on accuracy and completion times in a simulated network analyst task. During task trials, participants were interrupted by emails between alert investigations, during alert investigations, or not at all (control). The results indicated that email interruptions increased alert completion times regardless of when they occurred, but interruptions that occurred during an alert investigation also reduced the accuracy of subsequent judgments about alert threat. Overall, the results suggest that task interruptions can undermine cyber defense, and steps should be taken to better quantify and mitigate this threat.