2013
DOI: 10.1002/asi.22839
Behavioral changes in transmuting multisession successive searches over the web

Abstract: Multisession successive information searches are common, but little research has focused on their quantitative analysis. This article enhances our understanding of successive information searches by employing an experimental method to observe whether and how the behavioral characteristics of searchers changed, in statistically significant ways, over sessions. It focuses on a specific type of successive search called transmuting successive searches, in which searchers learn about and gradually refine their information problem…

Cited by 6 publications (5 citation statements)
References 82 publications (169 reference statements)
“…Our findings confirm and extend results from previous studies showing that CSWTs often involve complex topics [31] and multiple sub-goals [2]. Our findings further suggest that features of real-world CSWTs such as task outcome type and cognitive complexity could have impacts on cross-session search behaviors and users' needs [27,30]. For example, users engaged in understand-level cross-session tasks may benefit from tools that help them reacquaint with context, whereas create-level tasks may involve multiple sub-goals with differing needs across sessions [53].…”
Section: Discussion (supporting)
confidence: 89%
“…In their model of Multiple Information Seeking Episodes (MISE), Lin and Belkin [29] outlined reasons that a searcher might renew a search task from a previous session: transmuting, spawning, transiting, rolling back, lost-treatment, unanswered, cultivated, and anticipated. In later work, Lin [28] and Lin and Xie [27] partially validated aspects of the MISE model. However, to the best of our knowledge, our study is the first to investigate how these reasons manifest in real-world, everyday life CSWTs.…”
Section: Discussion (mentioning)
confidence: 99%
“…The main hypothesis of those works is that past queries effectively reflect the real interests of the user and, hence, they can be used to build suitable user profiles and to create utility-preserving (with regard to user interests) fake queries. However, we believe that this assumption does not always hold because of the following points, which may introduce bias and noise into the resulting user profiles:
• Because of external and circumstantial needs (e.g., a student doing her homework), users may submit queries related to certain topics that are quite far from their real interests.
• Many users, especially those who are not very familiar with the internals of web information retrieval or WSE query languages, may submit quite inaccurate queries to the WSE to retrieve more suitable suggestions (i.e., query refinement [Shi & Yang, 2007]), or may be forced to reformulate their queries several times to retrieve more appropriate results (Lin & Xie, 2013). These tryouts or trial-and-error interactions add an undesirable bias to the generated profiles, which may artificially favor the topics associated with these recurrent queries.
• It might happen that several different users share the same computer, IP, or web browser to submit their queries to the WSE.…”
Section: Previous Work (mentioning)
confidence: 99%
“…For example, in evaluating the effectiveness of a technique for eliciting more robust terms from user information need descriptions [12], the results showed that additional information from users significantly improves retrieval performance. In a study of successive searches that considers the evolution of user information problems, it was found that behavioural characteristics of searches (e.g., the number of unique pages visited) can differentiate stages of successive search [15]. In contrast to the IR system design approach, the capturing and analysis of search contexts in the IIR approach have been achieved through direct observation of user search behaviours in controlled user experiment settings.…”
Section: IIR Approach (mentioning)
confidence: 99%