The paper presents our design of a next-generation information retrieval system based on tag co-occurrences and subsequent clustering. We help users gain access to digital data through information visualization in the form of tag clusters. Current problems, such as the absence of interactivity and of semantic relations between tags, or the difficulty of adding further search arguments, are solved. In the evaluation, based on SERVQUAL and IT systems quality indicators, we found that tag clusters are perceived as more useful than tag clouds, are much more trustworthy, and are more enjoyable to use.
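The abstract does not spell out how the co-occurrence counts are turned into clusters, so the following is only a minimal sketch of one plausible reading: tag pairs that co-occur often enough form edges of a graph, and the connected components of that graph serve as tag clusters. The threshold, the sample data, and the connected-components choice are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: tag clusters from tag co-occurrence counts.
from collections import Counter, defaultdict
from itertools import combinations

def cooccurrence_counts(tagged_resources):
    """Count how often each pair of tags is assigned to the same resource."""
    counts = Counter()
    for tags in tagged_resources:
        for a, b in combinations(sorted(set(tags)), 2):
            counts[(a, b)] += 1
    return counts

def tag_clusters(tagged_resources, min_cooccurrence=2):
    """Group tags that co-occur at least `min_cooccurrence` times
    into the same cluster (connected components of the co-occurrence graph)."""
    graph = defaultdict(set)
    for (a, b), n in cooccurrence_counts(tagged_resources).items():
        if n >= min_cooccurrence:
            graph[a].add(b)
            graph[b].add(a)
    seen, clusters = set(), []
    for tag in list(graph):
        if tag in seen:
            continue
        stack, component = [tag], set()
        while stack:
            t = stack.pop()
            if t in component:
                continue
            component.add(t)
            stack.extend(graph[t] - component)
        seen |= component
        clusters.append(component)
    return clusters

# Invented example data: each set is the tag assignment of one resource.
resources = [
    {"sunset", "beach", "vacation"},
    {"beach", "vacation", "sea"},
    {"python", "code", "tutorial"},
    {"code", "tutorial", "programming"},
]
print(tag_clusters(resources))  # e.g. [{'beach', 'vacation'}, {'code', 'tutorial'}]
```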
Some content in multimedia resources can depict or evoke certain emotions in users. The aim of Emotional Information Retrieval (EmIR), and of our research, is to identify knowledge about emotionally laden documents and to use these findings in a new kind of World Wide Web information service that allows users to search and browse by emotion. Our prototype, called Media EMOtion SEarch (MEMOSE), is largely based on the results of research regarding emotive music pieces, images and videos. In order to index both evoked and depicted emotions in these three media types and to make them searchable, we work with a controlled vocabulary, slide controls to adjust the emotions' intensities, and broad folksonomies to identify and separate the correct resource-specific emotions. This separation into so-called power tags is based on a tag distribution that follows either an inverse power law (only one emotion was recognized) or an inverse-logistic shape (two or three emotions were recognized). Both distributions are well known in information science. MEMOSE consists of a tool for tagging basic emotions with the help of slide controls, a processing device to separate power tags, and a retrieval component consisting of a search interface (for any topic in combination with one or more emotions) and a results screen. The latter shows, for each media type, two separately ranked lists of items (depicted and felt emotions), displaying thumbnails of the resources ranked by the mean values of intensity. In the evaluation of the MEMOSE prototype, study participants described our EmIR system as an enjoyable Web 2.0 service.
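As a rough illustration of the processing step described above, the sketch below aggregates slide-control values per emotion and picks the leading "power tags". The largest-gap heuristic is only a stand-in for the distribution-based separation (inverse power law vs. inverse-logistic), and all vote data are invented.

```python
# Illustrative sketch of a MEMOSE-style aggregation step (assumptions, not the actual system).
from statistics import mean

def emotion_profile(votes):
    """votes: {emotion: [slider values from many users, e.g. 0-100]}.
    Returns (emotion, mean intensity) pairs, sorted descending."""
    profile = {emotion: mean(values) for emotion, values in votes.items() if values}
    return sorted(profile.items(), key=lambda kv: kv[1], reverse=True)

def power_tags(profile, max_tags=3):
    """Keep the leading emotions up to the largest drop in mean intensity
    (an illustrative stand-in for the distribution-shape criterion)."""
    if len(profile) < 2:
        return [emotion for emotion, _ in profile]
    gaps = [profile[i][1] - profile[i + 1][1] for i in range(len(profile) - 1)]
    cut = gaps.index(max(gaps)) + 1
    return [emotion for emotion, _ in profile[:min(cut, max_tags)]]

# Invented slide-control values for one resource.
votes = {
    "fun": [80, 90, 75, 85],
    "surprise": [70, 65, 60, 72],
    "sadness": [5, 0, 10, 3],
    "fear": [0, 5, 0, 2],
}
profile = emotion_profile(votes)
print(profile)              # ranked mean intensities, used for the results screen
print(power_tags(profile))  # e.g. ['fun', 'surprise']
```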
Purpose - The object of this empirical research study is emotion, as depicted in and aroused by videos. This paper seeks to answer the questions: Are users able to index such emotions consistently? Are the users' votes usable for emotional video retrieval?
Design/methodology/approach - The authors worked with a controlled vocabulary for nine basic emotions (love, happiness, fun, surprise, desire, sadness, anger, disgust and fear), a slide control for adjusting the emotions' intensity, and the approach of broad folksonomies. Different users tagged the same videos. The test persons had the task of indexing the emotions of 20 videos (reprocessed clips from YouTube). The authors distinguished between emotions that were depicted in the video and those that were evoked in the user. Data were received from 776 participants, and a total of 279,360 slide-control values were analyzed.
Findings - The consistency of the users' votes is very high; the tag distributions for the particular videos' emotions are stable. The final shape of the distributions is reached through the tagging activities of only a few users (fewer than 100). By applying the approach of power tags, it is possible to separate the pivotal emotions of every document - if indeed there is any feeling at all.
Originality/value - This paper is one of the first steps in the new research area of emotional information retrieval (EmIR). To the authors' knowledge, it is the first research project on the collective indexing of emotions in videos.
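To make the convergence finding concrete, the sketch below simulates cumulative slide-control votes for one video and reports after how many users the aggregated emotion ranking stops changing. The simulated data and the stability criterion (identical rank order to the final ranking) are illustrative assumptions, not the study's actual analysis.

```python
# Hedged sketch: when does the emotion ranking of one video stabilize?
import random

EMOTIONS = ["love", "happiness", "fun", "surprise", "desire",
            "sadness", "anger", "disgust", "fear"]

def ranking(totals, n_users):
    """Rank emotions by mean slider value over the first n_users votes."""
    means = {e: totals[e] / n_users for e in EMOTIONS}
    return tuple(sorted(EMOTIONS, key=lambda e: means[e], reverse=True))

def users_until_stable(user_votes):
    """Smallest number of users whose cumulative ranking already equals
    the ranking computed over all users (and stays that way)."""
    totals = {e: 0.0 for e in EMOTIONS}
    rankings = []
    for n, votes in enumerate(user_votes, start=1):
        for e in EMOTIONS:
            totals[e] += votes[e]
        rankings.append(ranking(totals, n))
    final = rankings[-1]
    for n in range(1, len(rankings) + 1):
        if all(r == final for r in rankings[n - 1:]):
            return n
    return len(user_votes)

# Simulated votes: a "funny surprise" video with noisy 0-100 sliders (invented data).
random.seed(0)
def simulate_vote():
    base = {"fun": 80, "surprise": 60}
    return {e: max(0, min(100, random.gauss(base.get(e, 10), 15))) for e in EMOTIONS}

votes = [simulate_vote() for _ in range(776)]
print("ranking stable after", users_until_stable(votes), "users")
```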