In this work we present topic diversification, a novel method designed to balance and diversify personalized recommendation lists in order to reflect the user's complete spectrum of interests. Although detrimental to average accuracy, we show that our method improves user satisfaction with recommendation lists, in particular for lists generated using the common item-based collaborative filtering algorithm.

Our work builds upon prior research on recommender systems, looking at properties of recommendation lists as entities in their own right rather than focusing solely on the accuracy of individual recommendations. We introduce the intra-list similarity metric to assess the topical diversity of recommendation lists, and the topic diversification approach for decreasing intra-list similarity. We evaluate our method using book recommendation data, including an offline analysis on 361,349 ratings and an online study involving more than 2,100 subjects.
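The abstract describes intra-list similarity as a measure of how topically homogeneous a recommendation list is. A minimal sketch of one plausible formulation, assuming it is computed as the average pairwise similarity over all item pairs in the list (the similarity function `sim`, the item IDs, and the toy genre-overlap measure below are illustrative assumptions, not the paper's exact definition):

```python
from itertools import combinations

def intra_list_similarity(items, sim):
    """Average pairwise similarity over all item pairs in a list.

    `items` is a list of item IDs; `sim(a, b)` is any symmetric
    similarity function (e.g. taxonomy-based or item-item cosine).
    Lower values indicate a more topically diverse list.
    """
    pairs = list(combinations(items, 2))
    if not pairs:
        return 0.0
    return sum(sim(a, b) for a, b in pairs) / len(pairs)

# Toy example with a hypothetical genre-overlap (Jaccard) similarity:
genres = {
    "book_a": {"fantasy"},
    "book_b": {"fantasy"},
    "book_c": {"history"},
}

def genre_sim(a, b):
    return len(genres[a] & genres[b]) / len(genres[a] | genres[b])

# Two fantasy books plus one history book: one pair is identical in
# genre (similarity 1.0), the other two pairs share nothing (0.0),
# so the list scores 1/3.
score = intra_list_similarity(["book_a", "book_b", "book_c"], genre_sim)
print(score)  # → 0.3333...
```

A diversification step would then reorder or swap candidate items so that this score decreases while high-accuracy recommendations stay near the top of the list.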
As users continue offloading more control and responsibility to the computer, coordinating the asynchronous interactions between the user and computer is becoming increasingly important. Without proper coordination, an application attempting to gain the user's attention risks interrupting the user in the midst of performing another task. To justify why an application should avoid interrupting the user whenever possible, we designed an experiment measuring the disruptive effect of an interruption on a user's task performance. The experiment utilized six web-based task categories and two categories of interruption tasks. The results of the experiment demonstrate that (i) a user performs more slowly on an interrupted task than on a non-interrupted task, (ii) the disruptive effect of an interruption differs as a function of task category, and (iii) different interruption tasks cause similar disruptive effects on task performance. These results empirically validate the need to better coordinate user interactions among applications that are competing for the user's attention.