Peer and self-assessment offer an opportunity to scale both assessment and learning to global classrooms. This article reports our experiences with two iterations of the first large online class to use peer and self-assessment. In this class, peer grades correlated highly with staff-assigned grades. The second iteration had 42.9% of students' grades within 5% of the staff grade, and 65.5% within 10%. On average, students assessed their work 7% higher than staff did. Students also rated peers' work from their own country 3.6% higher than work from elsewhere. We performed three experiments to improve grading accuracy. We found that giving students feedback about their grading bias increased subsequent accuracy. We introduce short, customizable feedback snippets that cover common issues with assignments, providing students more qualitative peer feedback. Finally, we introduce a data-driven approach that highlights high-variance items for improvement. We find that rubrics that use a parallel sentence structure, unambiguous wording, and well-specified dimensions have lower variance. After revising rubrics, median grading error decreased from 12.4% to 9.9%.
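The two data-driven ideas in this abstract, measuring a grader's bias against staff grades and flagging high-variance rubric items, can be sketched in a few lines. This is a hypothetical illustration, not the authors' actual pipeline; the function names, the bias definition (mean signed deviation), and the variance threshold are assumptions.

```python
from statistics import mean, pstdev

def grading_bias(peer_grades, staff_grades):
    """Mean signed deviation of one grader's scores from staff scores.

    Positive means the grader scores high relative to staff.
    """
    return mean(p - s for p, s in zip(peer_grades, staff_grades))

def high_variance_items(rubric_scores, threshold=1.0):
    """Flag rubric items whose peer scores disagree by more than `threshold`
    (population standard deviation) -- candidates for rubric revision."""
    return [item for item, scores in rubric_scores.items()
            if pstdev(scores) > threshold]

# A grader who scores 7, 8, 9 against staff scores of 6, 8, 8 runs high.
bias = grading_bias([7, 8, 9], [6, 8, 8])          # +0.67 on average

# Peers agree on "clarity" but scatter widely on "novelty".
items = high_variance_items({"clarity": [3, 3, 4],
                             "novelty": [1, 5, 2]})
```

Feeding `bias` back to the grader ("you tend to grade 0.67 points high") is the kind of intervention the abstract reports as improving subsequent accuracy.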
Micro-task platforms provide massively parallel, on-demand labor. However, it can be difficult to reliably achieve high-quality work because online workers may behave irresponsibly, misunderstand the task, or lack necessary skills. This paper investigates whether timely, task-specific feedback helps crowd workers learn, persevere, and produce better results. We investigate this question through Shepherd, a feedback system for crowdsourced work. In a between-subjects study with three conditions, crowd workers wrote consumer reviews for six products they own. Participants in the None condition received no immediate feedback, consistent with most current crowdsourcing practices. Participants in the Self-assessment condition judged their own work. Participants in the External assessment condition received expert feedback. Self-assessment alone yielded better overall work than the None condition and helped workers improve over time. External assessment also yielded these benefits. Participants who received external assessment also revised their work more. We conclude by discussing interaction and infrastructure approaches for integrating real-time assessment into online work.
Prototyping is the pivotal activity that structures innovation, collaboration, and creativity in design. Prototypes embody design hypotheses and enable designers to test them. Framing design as a thinking-by-doing activity foregrounds iteration as a central concern. This paper presents d.tools, a toolkit that embodies an iterative-design-centered approach to prototyping information appliances. This work offers contributions in three areas. First, d.tools introduces a statechart-based visual design tool that provides a low threshold for early-stage prototyping, extensible through code for higher-fidelity prototypes. Second, our research introduces three important types of hardware extensibility: at the hardware-to-PC interface, the intra-hardware communication level, and the circuit level. Third, d.tools integrates design, test, and analysis of information appliances. We have evaluated d.tools through three studies: a laboratory study with thirteen participants; rebuilding prototypes of existing and emerging devices; and by observing seven student teams who built prototypes with d.tools.
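The statechart model behind d.tools is easy to picture in code: named states, and transitions driven by hardware events such as button presses. The sketch below is a hypothetical mini example of the kind of behavior a designer would draw visually in d.tools; the state names, events, and transition table are invented for illustration.

```python
# Transition table for an imaginary music-player appliance:
# (current state, input event) -> next state.
TRANSITIONS = {
    ("off", "power"): "home",
    ("home", "power"): "off",
    ("home", "play"): "playing",
    ("playing", "play"): "home",
}

def step(state, event):
    """Follow a transition if one is defined; otherwise stay in place,
    which is how a statechart ignores irrelevant inputs."""
    return TRANSITIONS.get((state, event), state)

state = "off"
for event in ["power", "play", "volume", "play"]:
    state = step(state, event)
# "volume" has no transition from "playing", so it is ignored;
# the trace ends back at "home".
```

A visual editor lowers the threshold precisely because this table is drawn as boxes and arrows rather than written as code, while the code escape hatch supports higher-fidelity logic.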
Designers often use examples for inspiration; examples offer contextualized instances of how form and content integrate. Can interactive example galleries bring this practice to everyday users doing design work, and does working with examples help the designs they create? This paper explores whether people can realize significant value from explicit mechanisms for designing by example modification. We present the results of three studies, finding that independent raters prefer designs created with the aid of examples, that users prefer adaptively selected examples to random ones, and that users make use of multiple examples when creating new designs. To enable these studies and demonstrate how software tools can facilitate designing with examples, we introduce interface techniques for browsing and borrowing from a corpus of examples, manifest in the Adaptive Ideas Web design tool. Adaptive Ideas leverages a faceted metadata interface for viewing and navigating example galleries.
Sensors are becoming increasingly important in interaction design. Authoring a sensor-based interaction comprises three steps: choosing and connecting the appropriate hardware, creating application logic, and specifying the relationship between sensor values and application logic. Recent research has successfully addressed the first two issues. However, linking sensor input data to application logic remains an exercise in patience and trial-and-error testing for most designers. This paper introduces techniques for authoring sensor-based interactions by demonstration. A combination of direct manipulation and pattern recognition techniques enables designers to control how demonstrated examples are generalized to interaction rules. This approach emphasizes design exploration by enabling very rapid iterative demonstrate-edit-review cycles. This paper describes the manifestation of these techniques in a design tool, Exemplar, and presents evaluations through a first-use lab study and a theoretical analysis using the Cognitive Dimensions of Notation framework.
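The core idea in Exemplar, generalizing a demonstrated sensor trace into an interaction rule, can be sketched with a simple peak-based threshold. This is a hypothetical simplification of one possible generalization strategy, not Exemplar's actual recognition code; the function names and the `margin` parameter are assumptions.

```python
def learn_threshold(demo, margin=0.1):
    """Generalize a demonstrated sensor trace into a trigger threshold.

    Takes the peak of the demonstration and backs off by `margin`
    (a fraction of the demonstrated range) so inputs similar to,
    but slightly weaker than, the demonstration still trigger.
    """
    lo, hi = min(demo), max(demo)
    return hi - margin * (hi - lo)

def fires(sample, threshold):
    """The interaction rule: trigger when the sensor crosses the threshold."""
    return sample >= threshold

# Demonstrate once (peak 0.9 over a 0.8 range), then test new inputs.
threshold = learn_threshold([0.1, 0.2, 0.9, 0.3])   # 0.82
```

Letting the designer drag `threshold` directly on a plot of the demonstration, rather than editing the number, is the kind of direct-manipulation control over generalization the abstract describes, and it is what makes rapid demonstrate-edit-review cycles possible.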