Teams are ubiquitous, woven into the fabric of engineering and design. Often, it is assumed that teams are better at solving problems than individuals working independently. Recent work in engineering, design, and psychology has indicated that teams may not be the problem-solving panacea that they were once thought to be. Crowdsourcing has seen increased interest in engineering design recently, and platforms often encourage teamwork between participants. This work analyzes the performance of different team styles and sizes in crowdsourced competitions, demonstrating that groups of individuals working independently may outperform interacting teams on average, but that small interacting teams are more likely to win competitions. These results are discussed in the context of motivation for crowdsourcing participants.
The rapid digitalization of the world has affected engineering and design in a variety of ways, including the introduction of new computer-aided ideation tools. Cognitive assistants (CAs), an increasingly common digital technology, use natural-language processing and artificial intelligence to provide computational support. Because cognitive assistants are capable of emulating humans in some tasks, they may be suited to support brainstorming activities when trained coaches or facilitators are not available. This study compared co-located brainstorming groups facilitated by human facilitators and a CA facilitator. Interaction Dynamics Notation was used to code the sessions, and Hidden Markov Models were used to define the groups' states. We found that human facilitation was associated with blocks/interruptions and responses to them, while cognitive assistant facilitation was associated with deviations and silence. Human facilitation was also found to produce a more equal distribution of speaking time.
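The Hidden Markov Model approach mentioned above can be sketched in miniature: coded interaction events (the observations) are used to infer a sequence of hidden group states. The states, event symbols, and probabilities below are invented purely for illustration and are not the models fitted in the study; the decoding step is a standard log-space Viterbi pass.

```python
import math

# Toy HMM sketch of session coding: hidden group states inferred from
# coded interaction events. All names and probabilities are hypothetical.
states = ["engaged", "stalled"]
start = {"engaged": 0.6, "stalled": 0.4}
trans = {
    "engaged": {"engaged": 0.8, "stalled": 0.2},
    "stalled": {"engaged": 0.3, "stalled": 0.7},
}
emit = {
    "engaged": {"question": 0.5, "block": 0.3, "silence": 0.2},
    "stalled": {"question": 0.1, "block": 0.2, "silence": 0.7},
}

def viterbi(observations):
    """Most likely hidden-state sequence, computed in log space."""
    v = [{s: math.log(start[s]) + math.log(emit[s][observations[0]])
          for s in states}]
    back = []
    for obs in observations[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, score = max(
                ((p, v[-1][p] + math.log(trans[p][s])) for p in states),
                key=lambda x: x[1])
            col[s] = score + math.log(emit[s][obs])
            ptr[s] = prev
        v.append(col)
        back.append(ptr)
    # Backtrack from the best final state.
    last = max(states, key=lambda s: v[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

print(viterbi(["question", "block", "silence", "silence"]))
# → ['engaged', 'engaged', 'stalled', 'stalled']
```

In the actual study, the observation alphabet would come from Interaction Dynamics Notation codes and the model parameters would be learned from the session data rather than fixed by hand.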
Background: During the onset of the COVID-19 crisis, universities rapidly pivoted to online formats and were often unable to adhere to the best practices of online learning highlighted in prior literature. It is well documented that a variety of barriers impeded "normal" educational practices. Purpose/Hypothesis: The purpose of this paper is to investigate the perceptions of first-year engineering students enrolled in an introductory engineering design course during the rapid transition to online working environments. We view students' perceptions through the theoretical lens of workplace thriving theory, a framework that allowed us to capture aspects of education required for students to thrive in non-optimal learning settings. Design/Method: This research employed semi-structured interview methods with 13 students enrolled in an introductory engineering design course that relies on project-based team learning. We analyzed interview transcripts using thematic analysis through an abductive approach and made interpretations through workplace thriving theory. Results: Results indicated that students' abilities to thrive are related to four intersecting themes that demonstrate how workplace thriving theory manifests in this unanticipated online setting. These themes demonstrate elements that must be optimized for students to thrive in settings such as this: relationships with others, building and sharing knowledge through interactions, perceptions of experiential learning, and individual behaviors. Conclusion: Our research, viewed through workplace thriving theory, highlights the mechanisms by which students tried to succeed in suboptimal environments. While not all our participants showed evidence of thriving, the factors required for thriving point to opportunities to harness these same factors in in-person instruction environments.
Purpose: Often, it is assumed that teams are better at solving problems than individuals working independently. However, recent work in engineering, design and psychology contradicts this assumption. This study aims to examine the behavior of teams engaged in data science competitions. Crowdsourced competitions have seen increased use for software development and data science, and platforms often encourage teamwork between participants. Design/methodology/approach: We specifically examine the teams participating in data science competitions hosted by Kaggle. We analyze the data provided by Kaggle to compare the effect of team size and interaction frequency on team performance. We also contextualize these results through a semantic analysis. Findings: This work demonstrates that groups of individuals working independently may outperform interacting teams on average, but that small, interacting teams are more likely to win competitions. The semantic analysis revealed differences in forum participation, verb usage and pronoun usage when comparing top- and bottom-performing teams. Research limitations/implications: These results reveal a perplexing tension that must be explored further: true teams may experience better performance with higher cohesion, but nominal teams may perform even better on average with essentially no cohesion. Limitations of this research include not factoring in team member experience level and reliance on extant data. Originality/value: These results are potentially of use to designers of crowdsourced data science competitions as well as managers and contributors to distributed software development projects.
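The central finding above, that independent individuals can do better on average while small interacting teams take more of the top finishes, rests on comparing two different statistics over the same leaderboard. A minimal sketch of that comparison, using invented records rather than the Kaggle data the study analyzed:

```python
from statistics import mean

# Hypothetical leaderboard records: (team_size, interacting, final_rank).
# Illustrative data only -- not drawn from the study's Kaggle dataset.
records = [
    (1, False, 5), (1, False, 12), (1, False, 8),   # independent individuals
    (2, True, 1), (2, True, 30),                     # small interacting teams
    (3, True, 2), (3, True, 40),
]

def mean_rank(rows):
    """Average final rank (lower is better)."""
    return mean(rank for _, _, rank in rows)

def top_finish_rate(rows, top=3):
    """Fraction of entries finishing in the top `top` places."""
    return sum(rank <= top for _, _, rank in rows) / len(rows)

independents = [r for r in records if not r[1]]
small_teams = [r for r in records if r[1] and r[0] <= 3]

# Independents have the better (lower) mean rank here, yet the small
# interacting teams capture all of the top-3 finishes.
print(mean_rank(independents), mean_rank(small_teams))
print(top_finish_rate(independents), top_finish_rate(small_teams))
```

The point of the sketch is that "performs better on average" and "more likely to win" are answers to different questions, which is exactly the tension the abstract highlights.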
Cognitive assistants use vocal interfaces and artificial intelligence to assist humans with complex tasks. While much research has focused on the application of these devices, few studies have addressed how these devices affect the way humans work. To fill this gap, this research studied the effects of a cognitive assistant on mental workload, frustration, and effort. In a two-sample study comparing participants (n = 21) who worked with a Wizard-of-Oz style assistant to those who did not, participants completed the Wisconsin Card-Sorting Task and engaged in a peripheral-detection task. Follow-up interviews were also completed. Results suggest that onboarding techniques, such as tutorials, are important for developing analogical trust before regular use. Additionally, results suggest that keeping the mental model of the CA clear, simple, and intuitive is important to reduce the mental effort that is required to account for the CA and interactions with it while working. Cognitive assistants offer a broad range of advantages but also have distinct challenges for users: primarily the lack of physical affordances that can be linked to functionality.