2012
DOI: 10.1371/journal.pone.0031362

Phylo: A Citizen Science Approach for Improving Multiple Sequence Alignment

Abstract: Background: Comparative genomics, or the study of the relationships of genome structure and function across different species, offers a powerful tool for studying evolution, annotating genomes, and understanding the causes of various genetic disorders. However, aligning multiple sequences of DNA, an essential intermediate step for most types of analyses, is a difficult computational task. In parallel, citizen science, an approach that takes advantage of the fact that the human brain is exquisitely tuned to solving…
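
The abstract calls multiple sequence alignment "a difficult computational task". As a rough illustration of why, the sketch below shows pairwise global alignment by dynamic programming (in the style of Needleman-Wunsch); the match, mismatch, and gap scores are assumed for the example and are not those of any particular aligner. The quadratic table needed for two sequences grows exponentially when the same recurrence is extended to many sequences, which is what makes exact multiple alignment intractable.

```python
# Minimal sketch: pairwise global alignment score by dynamic programming.
# Scoring values are illustrative assumptions, not taken from the paper.
MATCH, MISMATCH, GAP = 1, -1, -2


def nw_score(a: str, b: str) -> int:
    """Return the optimal global alignment score of two DNA strings."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    # Initialise first row/column: aligning a prefix against all gaps.
    for i in range(1, rows):
        dp[i][0] = dp[i - 1][0] + GAP
    for j in range(1, cols):
        dp[0][j] = dp[0][j - 1] + GAP
    # Fill the table: each cell is the best of substitute / gap / gap.
    for i in range(1, rows):
        for j in range(1, cols):
            sub = MATCH if a[i - 1] == b[j - 1] else MISMATCH
            dp[i][j] = max(
                dp[i - 1][j - 1] + sub,  # align a[i-1] with b[j-1]
                dp[i - 1][j] + GAP,      # gap in b
                dp[i][j - 1] + GAP,      # gap in a
            )
    return dp[-1][-1]


if __name__ == "__main__":
    print(nw_score("GATTACA", "GCATGCT"))  # prints the optimal score
```

For k sequences of length n, the analogous table has on the order of n^k cells, so heuristics or, as in Phylo, human puzzle-solving are used instead of exhaustive search.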

Cited by 199 publications (173 citation statements)
References 40 publications
“…Although we identified 21 articles, most of which used crowdsourcing successfully, crowdsourcing clearly is not used pervasively in health research, and it is important to understand the quality of data it provides. Even though the use of crowdsourcing in health research is in its infancy, the papers we identified successfully used crowdsourcing to solve protein structure problems, 18 improve alignment of promoter sequences, 23 track H1N1 influenza outbreaks in near real time, 14 classify colonic polyps, 27,28 and identify RBCs infected with Plasmodium falciparum parasites. [24][25][26] Furthermore, as Mavandadi et al point out, one way around the problem of involving lay people in making a medical diagnosis is to use crowdsourcing to distill the data for a medical professional, who can then make the final decision.…”
Section: Discussion
confidence: 99%
“…12,13,[17][18][19]21 Also described was the online game Phylo, where users moved colored blocks representing different nucleotides of a gene promoter sequence around on screen in order to make the most parsimonious phylogenetic tree. 23 …”
Section: Problem Solving
confidence: 99%
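
To make the block-moving description in the citation above concrete, the sketch below scores a fixed DNA alignment column by column with a simple sum-of-pairs scheme. This is an illustration only, not Phylo's actual scoring function (which also accounts for the phylogenetic tree relating the sequences); the match, mismatch, and gap weights are assumed.

```python
# Illustrative sketch: sum-of-pairs scoring of a fixed multiple alignment,
# in the spirit of the block-moving puzzle described above. Weights are
# assumed examples, not Phylo's actual scoring parameters.
from itertools import combinations

MATCH, MISMATCH, GAP = 1, -1, -2


def sum_of_pairs(alignment: list[str]) -> int:
    """Score an alignment (equal-length rows, '-' for gaps) column by column."""
    assert len({len(row) for row in alignment}) == 1, "rows must be equal length"
    score = 0
    for col in zip(*alignment):                # iterate over alignment columns
        for x, y in combinations(col, 2):      # every pair of rows in the column
            if x == "-" or y == "-":
                score += GAP if x != y else 0  # gap vs base penalised; gap vs gap free
            elif x == y:
                score += MATCH                 # matching nucleotides rewarded
            else:
                score += MISMATCH              # mismatching nucleotides penalised
    return score


if __name__ == "__main__":
    # Moving a coloured block in the game corresponds to shifting where the
    # gap characters fall in one row; the goal is to maximise a score like this.
    print(sum_of_pairs(["GAT-ACA",
                        "GATTACA",
                        "G-TTACA"]))
```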
“…The smartphone age has given rise to the opportunity for "participatory sensing", allowing volunteers to record all types of phenomena using their mobile device [Restuccia et al 2016]. Relying on the ability of the human brain to complete tasks such as the analysis of images is quicker than even the most powerful of supercomputers [Kawrykow et al 2012]. Similar to SETI@home, VCS projects allow people to contribute without being a professional scientist.…”
Section: Introduction
confidence: 99%
“…For many, the tasks appear to be simple busywork, motivated by the idea that there will ultimately be some benefit to science at a higher level. However, Kowrykow et al (2012) refer to their participants as "game players", and perhaps this phrase also characterises those who enjoy transcribing label data. Thus, it makes sense that citizen science participants do not necessarily learn much about science, or natural history, in the process (Druschke and Seltzer 2012), despite early evidence that eBird participants do engage in scientific habits of thought (Trumbull et al 2000).…”
Section: Crowdsourcing
confidence: 99%