Online question-answering (Q&A) services are becoming increasingly popular among information seekers. While online Q&A services encompass both virtual reference service (VRS) and social Q&A (SQA), SQA services, such as Yahoo! Answers and WikiAnswers, have experienced more success in reaching the masses and leveraging subsequent participation. However, the large volume of content on some of the more popular SQA sites renders participants unable to answer some posted questions adequately, or even at all. To reduce the number of questions that never receive an answer, the current paper explores the reasons why fact-based questions fail on a specific Q&A service. For this exploration and analysis, thousands of failed questions were collected from Yahoo! Answers; only fact-based, information-seeking questions were retained, while opinion- and advice-seeking questions were discarded. A typology was then created, using a grounded theory approach, to code the reasons these questions failed. Based on this typology, suggestions are proposed for how such questions could be restructured or redirected to another Q&A service (possibly a VRS) so that users would have a better chance of receiving an answer.
Online Q&A services are information sources where people identify their information needs, formulate those needs in natural language, and interact with one another to satisfy them. Even though online Q&A has grown considerably in popularity in recent years and has influenced information-seeking behaviors, we still lack knowledge about what motivates people to ask a question in online Q&A environments. Yahoo! Answers and WikiAnswers were selected as the test beds in the study, and a sequential mixed method employing an Internet-based survey, a diary method, and interviews was used to investigate user motivations for asking a question in online Q&A services. Cognitive needs were found to be the most significant motivation driving people to ask a question. Yet other motivational factors (e.g., tension-free needs) were also found to play an important role, depending on the asker's contexts and situations. Understanding motivations for asking a question could provide a general framework for conceptualizing different contexts and situations of information needs in online Q&A. The findings have several implications not only for developing better question-answering processes in online Q&A environments, but also for gaining insights into the broader understanding of online information-seeking behaviors.
Although online Q&A services have increased in popularity, the field lacks a comprehensive typology to classify different kinds of services into model types. This poster categorizes online Q&A services into four model types: community-based, collaborative, expert-based, and social. Drawing such a distinction between online Q&A models provides an overview of how these types differ from one another and suggests implications for mitigating the weaknesses and bolstering the strengths of each model based on the types of questions addressed within it. To demonstrate differences among these models, an appropriate service was selected for each. Then, 500 questions were collected and analyzed for each of these services to classify question types into four categories: information-seeking, advice-seeking, opinion-seeking, and non-information-seeking. The findings suggest that information-seeking questions appear to be more suitable in either a collaborative Q&A environment or an expert-based Q&A environment, while opinion-seeking questions are more common in community-based Q&A. Social Q&A, on the other hand, provides an active forum for seeking personal advice or for non-information-seeking posts related to self-expression or self-promotion.
Social Q&A (SQA) has rapidly grown in popularity, impacting people's information-seeking behaviors. Although previous research has examined how people seek and share information within SQA, fundamental questions about user motivations and expectations for information seeking remain. This paper applies the theoretical framework of uses and gratifications theory to investigate the motivations for SQA use, and adapts relevance criteria from the library and information science (LIS) literature to investigate expectations regarding the evaluation of content within the SQA site Yahoo! Answers. A total of 75 Yahoo! Answers users participated in a survey that asked them about reasons, methods, and expectations relating to asking questions within the SQA site. Findings indicate the importance of motivations and expectations in fulfilling both cognitive and socio-affective needs based upon the type of task performed. These findings provide encouraging evidence that understanding the interrelationship between the motivations behind asking a question and the expectations of doing so can inform a more holistic framework for assessing information.
With the advent of ubiquitous connectivity and a constant flux of user-generated content, people's online information-seeking behaviours are rapidly changing, one of which includes seeking information from peers through online questioning. Ways to understand this new behaviour can be broken down into three aspects, also referred to as the three M's: the modalities (sources and strategies) that people use when asking their questions online, their motivations behind asking these questions and choosing specific services, and the types and quality of the materials (content) generated in such an online Q&A environment. This article provides a new framework, the three M's, based on a synthesis of the relevant literature. It then identifies some of the gaps in our knowledge about online Q&A based on this framework. These gaps are transformed into six research questions, stemming from the three M's, and addressed by (a) consolidating and synthesizing findings previously reported in the literature, (b) conducting new analyses of data used in prior work, and (c) administering a new study to answer questions unaddressed by the pre-existing and new analyses of prior work.