2014
DOI: 10.1609/icwsm.v8i1.14529

Great Question! Question Quality in Community Q&A

Abstract: Asking the right question in the right way is an art (and a science). In a community question-answering setting, a good question is not just one that is found to be useful by other people: a question is good if it is also presented clearly and shows prior research. Using a community question-answering site that allows voting over the questions, we show that there is a notion of question quality that goes beyond mere popularity. We present techniques using latent topic models to automatically predict the quali…

Cited by 38 publications (15 citation statements)
References 20 publications
“…This gives us good ground to understand the issues with unanswered questions. On the contrary, 99.4% of questions studied in (Ravi et al. 2014) and 95% of questions in (Shah and Pomerantz 2010) received at least one answer, often involving multiple responders. • The notion of quality is natural and intuitive in our dataset, where a single responder handpicks a few questions that he/she wishes to answer.…”
Section: Dataset (mentioning)
confidence: 98%
“…Prior research on Community Question Answering (CQA) has addressed issues as diverse as predicting whether a particular answer will be chosen by the inquirer or not (Adamic et al. 2008), predicting answer quality (Shah and Pomerantz 2010; Jeon et al. 2006) and quantity (Dror, Maarek, and Szpektor 2013), understanding question type (Allamanis and Sutton 2013) and quality (Ravi et al. 2014; Yang et al. 2011; Li et al. 2012), analyzing inquirers' satisfaction (Liu, Bian, and Agichtein 2008) and responders' motivation, and finding similar questions (Li and Manandhar 2011) and expert potential responders (Li and King 2010). Although the aforementioned studies are helpful, we are more curious about the factors of a question that make it more likely to generate a response, especially in the MISR setting where there is just one responder but multiple inquirers.…”
Section: Related Work (mentioning)
confidence: 99%
“…In this paper, we also analyze Genius in the context of other well-studied crowdsourced information sites, such as Stack Overflow (Ravi et al. 2014; Posnett et al. 2012; Tian, Zhang, and Li 2013), Quora (Wang et al. 2013; Maity, Sahni, and Mukherjee 2015), Yahoo! Answers (Adamic et al. 2008), Amazon (Bai et al. 2018), and Wikipedia (Beschastnikh, Kriplean, and McDonald 2008; Mesgari et al. 2015).…”
Section: Additional Related Work (mentioning)
confidence: 99%