WARD, MARY KATHRINE. Using Virtual Presence and Survey Instructions to Minimize Careless Responding on Internet-Based Surveys. (Under the direction of Dr. Samuel B. Pond.) Internet-based survey data inform knowledge creation in research and justify work activities in many organizations. While there are decided advantages to Internet-based surveys, this mode of administration comes with its own set of challenges. Survey respondents may intentionally or unintentionally respond to the survey in a manner that does not accurately reflect their true beliefs or feelings. The purpose of this study was to examine two survey-design approaches for addressing the problem of careless responding (CR) and increasing attentiveness among respondents. This study investigated instructional manipulation and virtual human presence as potential buffers against CR. The sample consisted of undergraduate students who voluntarily completed an Internet-based survey. This study used a 3x3 between-subjects experimental design in which virtual presence (absent, animated shape, and virtual human) and type of instruction (anonymous, warning, and feedback) were the independent variables. Indicators of CR were the dependent variables. Findings showed that warning instructions significantly reduced some forms of CR, but the promise of feedback had little effect on CR. The interaction of instructions and virtual presence had a significant effect on CR. Future research will need to tease apart the nuances of the relationships among instructions, virtual presence, and CR. The discussion includes implications for Internet-based survey administration and future directions for addressing the problem of CR.
Organizational citizenship behavior (OCB) is assessed by measuring how frequently employees display extra‐role and discretionary behaviors. One hundred forty‐four managerial employees responded to an OCB scale and indicated the number of behaviors on the scale they believed to be formally evaluated. No behavior on the scale was believed by all employees to be unevaluated. The data suggest that a typical OCB scale does not measure citizenship behaviors for everybody, and that OCB measurement needs refinement. The best prediction of other organizational variables was obtained when both the OCB score and an index of “unevaluated” behaviors were used as predictors. Supervisor fairness interacted with OCB when predicting organizational commitment, and this interaction was contingent on the extent to which OCBs were believed to be unevaluated.
The construct validity of traditional assessment center dimensions was compared with that of a set of alternative constructs based on the functional structure of managerial work. Subjects were 75 middle‐level managers in state government who participated in two developmental assessment centers as part of a centralized management development program. One assessment center measured performance in terms of traditional attribute dimensions and the other in terms of functions performed in managerial work. Results showed that evidence for construct validity was weak for both sets of constructs.
Understanding the processes through which trainee characteristics influence learning is important for identifying mechanisms that drive training effectiveness. We examine the direct and indirect paths through which core self-evaluations (CSE) impact learning. We also include general cognitive ability (GCA) to explore whether CSE's paths to effectiveness differ from those of a well-documented predictor of learning. We proposed a model in which CSE contributes to training effectiveness through its influence on motivational intervening mechanisms, and we tested this model empirically with military personnel (N = 638) who participated in job-required training. The data supported a partially mediated model. Irrespective of whether GCA was included as a control variable, motivation and effort allocation (MEA) process variables (i.e., training motivation, midtraining self-efficacy, and midtraining goal setting) mediated (or partially mediated) the relationship between CSE and training outcomes that included affective (e.g., intentions to transfer), cognitive (e.g., declarative knowledge), and skill-based (e.g., proficiency) learning. Conversely, GCA had neither direct nor indirect effects on affective learning but did demonstrate direct effects on cognitive and skill-based learning. Results support the utility of including CSE in training research and practice, suggest that MEA serves as an explanatory mechanism for CSE's relation to learning outcomes, and demonstrate that CSE and GCA differentially influence training effectiveness and do so through different explanatory mechanisms.