Data overload is a generic and tremendously difficult problem that has only grown with each new wave of technological capabilities. As a generic and persistent problem, it raises three observations in need of explanation: Why is data overload so difficult to address? Why has each wave of technology exacerbated, rather than resolved, data overload? How are people, as adaptive, responsible agents in context, able to cope with the challenge of data overload?

In this paper, we first examine three different characterizations that have been offered to capture the nature of the data overload problem, and how each leads to different proposed solutions. As a result, we propose that (a) data overload is difficult because of the context sensitivity problem: meaning lies not in data, but in the relationships of data to interests and expectations; and (b) new waves of technology exacerbate data overload when they ignore or try to finesse context sensitivity. The paper then summarizes the mechanisms of human perception and cognition that enable people to focus on the relevant subset of the available data, despite the fact that what is interesting depends on context.

By focusing attention on the root issues that make data overload a difficult problem and on people's fundamental competence, we have identified a set of constraints that all potential solutions must meet. Notable among these constraints is the idea that organization precedes selectivity. These constraints point toward regions of the solution space that have been little explored. To place data in context, designers need to display data in a conceptual space that depicts the relationships, events, and contrasts that are informative in a field of practice.

KEYWORDS: agent, alarm, context, data overload, information visualization, workload

Cognition, Technology and Work, in press

DATA OVERLOAD IS A GENERIC, DIFFICULT PROBLEM

Information is not a scarce resource.
Attention is.

Herbert Simon¹

Each round of technical advances, whether in artificial intelligence, computer graphics, or electronic connectivity, promises to help people better understand and manage a whole host of activities, from financial analysis to monitoring data from space missions to controlling the national airspace. Certainly, this ubiquitous computerization of the modern world has tremendously advanced our ability to collect, transmit, and transform data, producing unprecedented levels of access to data. However, our ability to interpret this avalanche of data, that is, to extract meaning from artificial fields of data, has expanded much more slowly, if at all.

In studies across multiple settings, we find that practitioners are bombarded with computer-processed data, especially when anomalies occur. We find users lost in massive networks of computer-based displays, options, and modes. For example, one can find a version of the following statement in most accident investigation reports: "although all of the necessary data was physically available, it was not operationally effective. No one could assemble...