2000
DOI: 10.1002/1097-4571(2000)9999:9999<::aid-asi1042>3.0.co;2-7
Reflections on Mira: Interactive evaluation in information retrieval

Abstract: Evaluation in information retrieval (IR) has focussed largely on noninteractive evaluation of text retrieval systems. This is increasingly at odds with how people use modern IR systems: in highly interactive settings to access linked, multimedia information. Furthermore, this approach ignores potential improvements through better interface design. In 1996, the Commission of the European Union Information Technologies Programme funded a 3‐year working group, Mira, to discuss and advance research in the area of …

Cited by 27 publications (19 citation statements)
References 27 publications
“…The user's judgement of relevance is naturally based on their current context, their preferences, and also their way of judging the semantic content of the images (e.g. [23,7]). Initially, the system uses the low-level image features as a quick way to 'estimate' the relevance values of the images.…”
Section: Relevance Feedback (mentioning)
confidence: 99%
“…speed, layout, clearness, iteration (Robertson & Hancock-Beaulieu 1992, Saracevic 1995, Dunlop 2000). Therefore the data collected was rich for both subjective and objective measures.…”
Section: Data Collection and Analysis (mentioning)
confidence: 99%
“…User-centred evaluation is important to assess the overall success of a retrieval system, as it takes into account factors other than just system performance, e.g. the design of the user interface and system speed (Dunlop argues this in [7]). A number of researchers have highlighted the advantages of user-centred evaluation, particularly in image retrieval systems (see, e.g.…”
Section: Building a Test Collection For Multilingual Image Retrieval (mentioning)
confidence: 99%
“…A number of researchers have highlighted the advantages of user-centred evaluation, particularly in image retrieval systems (see, e.g. [10], [14] and [7]). One of the main aims of ImageCLEF is to provide both the CLIR and image retrieval communities a number of useful resources (datasets and relevance assessments) to facilitate and promote further research in multilingual image retrieval.…”
Section: Building a Test Collection For Multilingual Image Retrieval (mentioning)
confidence: 99%