Assessing and understanding intelligent agents is a difficult task for users who lack an AI background. A relatively new area, called "Explainable AI," is emerging to help address this problem, but little is known about how users would forage through information an explanation system might offer. To inform the development of Explainable AI systems, we conducted a formative study, using the lens of Information Foraging Theory, into how experienced users foraged in the domain of StarCraft to assess an agent. Our results showed that participants faced difficult foraging problems. These foraging problems caused participants to entirely miss events that were important to them, reluctantly choose to ignore actions they did not want to ignore, and bear high cognitive, navigation, and information costs to access the information they needed.

(We discuss this design further in the Methodology section.) In addition, the participants had functionality to seek additional information about the replay, such as navigating around the game map, drilling down into production information, pausing, rewinding, fast-forwarding, and so on (Figure 1). However, we wanted a higher level of abstraction than features specific to StarCraft. Specifically, we aimed for (1) applicability to other RTS environments, and (2) connection with other research about humans seeking information. To that end, we turned to Information Foraging Theory (IFT).