This study compared the original intrinsic and extrinsic subscales of the Minnesota Satisfaction Questionnaire short form to revised subscales using data from two samples. The revised subscales were formed according to critiques by several researchers. Confirmatory factor analysis of the original and revised subscales supported the discriminant validity of scores on the intrinsic and extrinsic job satisfaction measures. Several hierarchical regression models were tested that included job involvement, overall job satisfaction, and volitional absence variables, in addition to the job satisfaction components. The analyses from both samples indicated that revising the intrinsic and extrinsic subscales made little difference in the results obtained.
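As a rough illustration of the analytic approach described above (not the authors' code), the following Python sketch fits a two-block hierarchical regression in which control variables enter first and the intrinsic and extrinsic subscale scores enter second, so the increment in R² attributable to the subscales can be inspected. All column names are hypothetical placeholders, not the study's actual variable labels.

```python
# Minimal sketch of a two-block hierarchical regression over a pandas DataFrame.
# Column names (job_involvement, overall_sat, intrinsic, extrinsic,
# volitional_absence) are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

def hierarchical_r2(df: pd.DataFrame, outcome: str, block1: list, block2: list):
    """Fit two nested OLS models and return both R^2 values plus the change."""
    y = df[outcome]
    m1 = sm.OLS(y, sm.add_constant(df[block1])).fit()
    m2 = sm.OLS(y, sm.add_constant(df[block1 + block2])).fit()
    return m1.rsquared, m2.rsquared, m2.rsquared - m1.rsquared

# Example call with the placeholder columns:
# r2_controls, r2_full, delta_r2 = hierarchical_r2(
#     data, outcome="volitional_absence",
#     block1=["job_involvement", "overall_sat"],
#     block2=["intrinsic", "extrinsic"])
```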
Multilevel researchers have predominantly applied either direct consensus or referent-shift consensus composition models when aggregating individual-level data to a higher level of analysis. This prevailing focus neglects both theory and empirical evidence suggesting that the variance of group members' responses may complement the absolute mean level of group members' judgments. The goals of this article are to demonstrate the application of dispersion-composition models for capturing variability among group members' collective judgments and to highlight the statistical challenges (and inherent constraints) of using group means and variances as predictors of study criteria. To this end, the authors present and illustrate a six-step sequential framework for applying dispersion-composition models using data from two independent field samples. The authors contend that the application of dispersion-composition models not only will strengthen a study's conclusions by eliminating potential rival data interpretations but may also shed new light on past findings, potentially opening new doors to a more complete understanding of multilevel phenomena.
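To make the core idea of a dispersion-composition model concrete, here is a minimal Python sketch with simulated data (illustrative only, not either field sample): individual judgments are aggregated to the group level as a mean (level) and a within-group variance (dispersion), and both indices enter the group-level regression together.

```python
# Illustrative sketch with simulated data: the group mean carries the level of
# members' judgments, the within-group variance carries their dispersion, and
# both serve as predictors of a group-level criterion.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_groups, group_size = 40, 5

# Long-format member data: one row per group member.
members = pd.DataFrame({
    "group_id": np.repeat(np.arange(n_groups), group_size),
    "judgment": rng.normal(4.0, 1.0, n_groups * group_size),
})

# Aggregate to the group level: mean (level) and variance (dispersion).
agg = members.groupby("group_id")["judgment"].agg(["mean", "var"])

# Simulated group-level criterion that depends on both composition indices.
criterion = 0.8 * agg["mean"] - 0.5 * agg["var"] + rng.normal(0, 0.3, n_groups)

# Both the mean and the variance enter the model as predictors.
model = sm.OLS(criterion, sm.add_constant(agg)).fit()
print(model.params)
```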
The authors explored the idea that teams whose members, on average, demonstrate greater mastery of relevant teamwork knowledge will exhibit greater task proficiency and more effective observed teamwork. In particular, the authors posited that team members' mastery of designated teamwork knowledge predicts better team task proficiency and, even after controlling for task proficiency, higher observer ratings of effective teamwork. The authors investigated these hypotheses by developing a structural model and testing it with field data from 92 teams (1,158 team members) in a United States Air Force officer development program focusing on a transportable set of teamwork competencies. The authors obtained proficiency scores on three different types of team tasks as well as observer ratings of effective teamwork. The empirical model supported the authors' hypotheses.
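The hypothesized paths can be approximated with two regression equations. The Python sketch below is a simplified stand-in for the authors' structural model, and the column names are hypothetical placeholders rather than the study's measures.

```python
# Simplified two-equation approximation of the hypothesized paths (not the
# authors' structural model).  Column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def fit_paths(teams: pd.DataFrame):
    # Path 1: mean teamwork knowledge predicting team task proficiency.
    knowledge_to_proficiency = smf.ols(
        "task_proficiency ~ teamwork_knowledge", data=teams).fit()
    # Path 2: teamwork knowledge predicting observer-rated teamwork,
    # controlling for task proficiency.
    knowledge_to_teamwork = smf.ols(
        "observed_teamwork ~ teamwork_knowledge + task_proficiency",
        data=teams).fit()
    return knowledge_to_proficiency, knowledge_to_teamwork
```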
We explored whether voluntary survey completion by team members (in aggregate) is predictable from team members' collective evaluations of team-emergent states. In doing so, we reanalyzed less-than-complete survey data on 110 teams from a published field study, using so-called traditional and modern missing-data techniques to probe the sensitivity of these team-level relationships to data missingness. The multivariate findings revealed that a greater within-team participation rate was indeed related to a higher team-level (mean) score on team mental efficacy (across all four missing-data techniques) and to less dispersion among team members' judgments about internal cohesion (when the two modern methods were used). In addition, the results showed that the common approach of retaining only those teams with high participation rates produces inflated standardized effect size (i.e., R²) estimates and decreased statistical power. We offer research design considerations and a comprehensive methodology for accounting for missing team member data.
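The kind of sensitivity check described above can be sketched in Python as follows. This is an assumption-laden illustration, not the procedure used in the reanalysis: the variable names are placeholders, and scikit-learn's IterativeImputer stands in for a generic model-based imputation step.

```python
# Fit the same team-level regression under (a) listwise deletion and
# (b) model-based imputation, then compare the resulting R^2 values.
# Assumes a numeric, team-level DataFrame; column names are placeholders.
import pandas as pd
import statsmodels.api as sm
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def r2_under_two_strategies(teams: pd.DataFrame, outcome: str, predictors: list):
    # (a) Traditional approach: drop any team with a missing value.
    complete = teams.dropna()
    r2_listwise = sm.OLS(
        complete[outcome], sm.add_constant(complete[predictors])).fit().rsquared

    # (b) Modern approach: impute missing entries, then refit on all teams.
    imputed = pd.DataFrame(
        IterativeImputer(random_state=0).fit_transform(teams),
        columns=teams.columns)
    r2_imputed = sm.OLS(
        imputed[outcome], sm.add_constant(imputed[predictors])).fit().rsquared

    return r2_listwise, r2_imputed

# Example call with placeholder columns:
# r2_listwise, r2_imputed = r2_under_two_strategies(
#     team_data, outcome="participation_rate",
#     predictors=["team_efficacy_mean", "cohesion_dispersion"])
```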