Evaluation is central in research and development of information retrieval (IR). In addition to designing and implementing new retrieval mechanisms, one must also show through rigorous evaluation that they are effective. A major focus in IR is the capability of retrieval mechanisms to rank relevant documents optimally for users, given a query. In practice, however, searching for information involves searchers and is highly interactive. When human searchers have been incorporated in evaluation studies, the results have often suggested that better ranking does not necessarily lead to better search task, or work task, performance. Therefore, it is not clear which system or interface features should be developed to improve the effectiveness of human task performance. In the present article, we focus on the evaluation of task-based information interaction (TBII). We give special emphasis to learning tasks to discuss TBII in more concrete terms. Information interaction is here understood as behavioral and cognitive activities related to task planning, searching information items, selecting between them, working with them, and synthesizing and reporting. These five generic activities contribute to task performance and outcome and can be supported by information systems. In an attempt toward task-based evaluation, we introduce program theory as the evaluation framework. Such evaluation can investigate whether a program consisting of TBII activities and tools works, how it works, and, further, can provide a causal description of program (in)effectiveness. Our goal in the present article is to structure TBII on the basis of the five generic activities and consider the evaluation of each activity using the program theory framework. Finally, we combine these activity-based program theories in an overall evaluation framework for TBII. Such an evaluation is complex due to the large number of factors affecting information interaction.
Instead of presenting tested program theories, we illustrate how the evaluation of TBII can be accomplished using the program theory framework, covering systems, behaviors, and their interactions comprehensively in context.
We analyze barriers to task-based information access in molecular medicine, focusing on research tasks, which provide task performance sessions of varying complexity. Molecular medicine is a relevant domain because it offers thousands of digital resources as the information environment. Data were collected through shadowing of real work tasks. Thirty work task sessions were analyzed and the barriers in them identified. The barriers were classified by their character (conceptual, syntactic, and technological) and by their context of appearance (work task, system integration, or system). In addition, the work task sessions were grouped into three complexity classes, and the frequency of barriers of varying types across task complexity levels was analyzed. Our findings indicate that although most of the barriers occur at the system level, a notable share of barriers arise in the integration and work task contexts. These barriers might be overcome through attention to the integrated use of multiple systems, at least for the most frequent uses. This can be done by means of standardization and harmonization of the data and by taking the requirements of the work tasks into account in system design and development, because information access is seldom an end in itself, but rather serves to reach the goals of work tasks.
In task-based information searching, the task at hand is a central factor affecting information search. Task complexity, in particular, has been found to affect searching. In the present study, we shadowed the tasks of seven people working in city administration. The data consist of shadowing field notes, voice recordings, photographs, and forms. We study how task complexity affects information searching and information resource use. Task complexity was defined through the task performer's own experience (perceived task complexity) and her estimates of her a priori knowledge concerning the task. We analyzed the data both qualitatively and quantitatively, focusing on the links between task complexity and the use of information resources, information searching, and problems encountered. We found that task complexity has a central but ambiguous relationship to task performance. The clearest differences were found between simple and complex tasks. In addition, perceived task complexity seems to affect the ways of performing the task more than a priori knowledge does. The more complex a task is perceived to be, the more searches are performed, and the more these searches concentrate on networked resources instead of information systems provided by the organization (SPOs). The use of resources on the task performer's PC and the SPOs decreases as complexity increases; correspondingly, the use of networked resources and communication resources increases. The total number of information resources used is somewhat greater in complex and semi-complex tasks than in simple tasks, and each resource is used for a longer time on average. Our study shows that task context, and especially task complexity, seems to affect information searching and the selection of sources.
In this article, we present a conceptual framework of information needs for task-based information studies. The framework accounts for both vertical and horizontal relationships between information needs as fluid activities in work-task performance. As part of task performance, pieces of information are gathered from various, heterogeneous sources, not primarily to fulfil any expressed formulation of information needs, but in order to make progress in the task. The vertical relationships pinpoint connections between the general and the specific, from the workplace context to the interaction with an information source, and the horizontal relationships connect parallel information needs. These relationships enrich the conceptual understanding of information needs in information studies, which has previously focused on sequential relationships. The sequential, vertical, and horizontal relationships form an analytical network that allows a departure from the black-box depiction of information needs.