We present a case study of factors that affect the usability and User Experience (UX) of a crowdsourcing platform for argumentation. Although this particular system focuses on argumentation, we have abstracted our findings so that they generalize to other use cases, allowing others to apply these lessons learned and recommendations to the development of more effective crowdsourcing and collaborative work platforms. Several themes emerged in participant responses about the usability and UX of the crowdsourcing system: a desire for less structure, a need for additional training, a desire for a streamlined workflow and UI, improved navigation, and enjoyment of interactions with other users.
Task Analysis (TA) is not one standard method but rather a toolkit of methods that produce a large and complex range of outputs. Because of this fluidity in methods and resulting data, no standard method of analysis and visual data representation currently exists. Depending on the research methods used and the desired outcomes, some visualization methods may be more appropriate than others. This body of work demonstrates the need to establish a graphical grammar in this domain and provides initial recommendations for visualizing different task analysis methods. This research lays the groundwork for a full set of visualization best practices that will allow practitioners to gain the most insight from their TA methods of choice.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.