This paper presents an overview of a useful approach for theory testing in the social sciences that combines the principles of psychometric meta-analysis and structural equations modeling. In this approach to theory testing, the estimated true score correlations between the constructs of interest are established through the application of meta-analysis (Hunter & Schmidt, 1990), and structural equations modeling is then applied to the matrix of estimated true score correlations. The potential advantages and limitations of this approach are presented. The approach enables researchers to test complex theories involving several constructs that cannot all be measured in a single study. Decision points are identified, the options available to a researcher are enumerated, and the potential problems as well as the prospects of each are discussed.

Over the years, the importance of theory testing has been increasingly emphasized (e.g., Campbell, 1990; Schmidt, 1992; Schmitt & Landy, 1993). This is consistent with the prediction of Schmidt and Kaplan (1971) that as a nascent field matures, scientists, unencumbered by the need to constantly prove the value of their profession to the general society and its standing in the pantheon of sciences, devote more attention to explaining the processes underlying observed relationships and engage more frequently in explicitly articulating the theories that guide their practice. To explicate the underlying processes and theories which …

Author note: Both authors contributed equally; order of authorship is arbitrary. An earlier version of this paper was presented in J. S. Phillips (Chair), Some problems and innovative solutions in structural equations modeling used for management theory building, a symposium conducted at the 54th annual meeting of the Academy of Management, Dallas, TX. We thank Frank Schmidt for his collaboration on an earlier manuscript. We also acknowledge Jack Hunter for his pioneering work on combining meta-analysis and path analysis. We gratefully acknowledge the contributions of three anonymous reviewers; this manuscript has greatly benefited from their extensive comments. Correspondence and requests for reprints should be addressed to Chockalingam Viswesvaran.
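The two-step logic described in this abstract can be sketched briefly. The Python fragment below is a minimal illustration (not code from the paper): a sample-size-weighted mean correlation is disattenuated for unreliability in both measures to estimate a true score correlation, and standardized path coefficients are then obtained directly from the assembled true score correlation matrix. All numeric inputs are hypothetical.

```python
import numpy as np

# Step 1 (psychometric meta-analysis): sample-size-weighted mean observed
# correlation, disattenuated for unreliability in both measures
# (Hunter & Schmidt, 1990): rho = r_bar / sqrt(rel_x * rel_y).
def mean_true_score_r(rs, ns, rel_x, rel_y):
    rs, ns = np.asarray(rs, float), np.asarray(ns, float)
    r_bar = np.sum(ns * rs) / np.sum(ns)
    return r_bar / np.sqrt(rel_x * rel_y)

# Hypothetical input for one cell of the true score correlation matrix.
rho_x1_y = mean_true_score_r(rs=[0.25, 0.31, 0.22], ns=[150, 90, 210],
                             rel_x=0.80, rel_y=0.52)

# Step 2 (path analysis on the assembled matrix): for predictors X1, X2 and
# outcome Y, standardized path coefficients are beta = R_xx^-1 * r_xy.
R_xx = np.array([[1.00, 0.40],          # estimated true score correlations
                 [0.40, 1.00]])         # among the predictors (hypothetical)
r_xy = np.array([rho_x1_y, 0.35])       # true score correlations with Y (hypothetical)
beta = np.linalg.solve(R_xx, r_xy)
print(np.round(beta, 3))
```

In practice each cell of the matrix would come from its own meta-analysis, and the model fitted in step 2 would typically be estimated with structural equations modeling software rather than the simple recursive path model shown here.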
The authors conducted a comprehensive meta-analysis based on 665 validity coefficients across 576,460 data points to investigate whether integrity test validities are generalizable and to estimate differences in validity due to potential moderating influences. Results indicate that integrity test validities are substantial for predicting job performance and counterproductive behaviors on the job, such as theft, disciplinary problems, and absenteeism. The estimated mean operational predictive validity of integrity tests for predicting supervisory ratings of job performance is .41. Results from predictive validity studies conducted on applicants and using external criterion measures (i.e., excluding self-reports) indicate that integrity tests predict the broad criterion of organizationally disruptive behaviors better than they predict employee theft alone. Despite the influence of moderators, integrity test validities are positive across situations and settings.

Over the last 10 years, interest in and use of integrity testing have increased substantially. The publication of a series of literature reviews attests to the interest in this area and its dynamic nature.
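As a rough illustration of how a mean operational validity such as the .41 reported here is typically estimated in the Hunter-Schmidt framework, the sketch below weights observed validities by sample size and corrects for criterion unreliability only, since the predictor is used in its observed form for selection decisions. The coefficients, sample sizes, and reliability value are illustrative assumptions, not data from this meta-analysis.

```python
import numpy as np

# Sample-size-weighted mean observed validity, corrected for criterion
# unreliability only; predictor unreliability is not corrected because the
# test is used in its observed form when making selection decisions.
r_obs = np.array([0.18, 0.25, 0.21, 0.30])   # observed validities (hypothetical)
n     = np.array([300, 150, 500, 250])       # study sample sizes (hypothetical)
r_yy  = 0.52                                 # assumed criterion (ratings) reliability

r_bar = np.sum(n * r_obs) / np.sum(n)        # bare-bones weighted mean
rho_op = r_bar / np.sqrt(r_yy)               # mean operational validity estimate
print(round(float(rho_op), 2))
```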
This study used meta-analytic methods to compare the interrater and intrarater reliabilities of ratings of 10 dimensions of job performance used in the literature; ratings of overall job performance were also examined. There was mixed support for the notion that some dimensions are rated more reliably than others. Supervisory ratings appear to have higher interrater reliability than peer ratings. Consistent with H. R. Rothstein (1990), mean interrater reliability of supervisory ratings of overall job performance was found to be .52. In all cases, interrater reliability is lower than intrarater reliability, indicating that the inappropriate use of intrarater reliability estimates to correct for biases from measurement error leads to biased research results. These findings have important implications for both research and practice.

Several measures of job performance have been used over the years as criterion measures.
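The practical consequence of the interrater-intrarater gap can be seen with the standard disattenuation formula, rho = r_obs / sqrt(r_yy). The snippet below uses the .52 interrater figure reported above together with an assumed, higher intrarater value and a hypothetical observed validity to show how correcting with the wrong coefficient understates the corrected validity.

```python
from math import sqrt

# Disattenuation for criterion unreliability: rho = r_obs / sqrt(r_yy).
# Correcting with the (higher) intrarater coefficient instead of the
# interrater coefficient undercorrects for measurement error.
r_obs = 0.25        # hypothetical observed validity
interrater = 0.52   # mean interrater reliability of supervisory ratings (from the text)
intrarater = 0.86   # hypothetical intrarater (e.g., coefficient alpha) estimate

print(round(r_obs / sqrt(interrater), 3))   # appropriate correction  -> ~0.347
print(round(r_obs / sqrt(intrarater), 3))   # undercorrected estimate -> ~0.27
```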
Response bias continues to be the most frequently cited criticism of personality testing for personnel selection. The authors meta-analyzed the social desirability literature, examining whether social desirability functions as a predictor for a variety of criteria, as a suppressor, or as a mediator. Social desirability scales were found not to predict school success, task performance, counterproductive behaviors, or job performance. Correlations with the Big Five personality dimensions, cognitive ability, and years of education are presented, along with empirical evidence that (a) social desirability is not as pervasive a problem as industrial-organizational psychologists have anticipated, (b) social desirability is in fact related to real individual differences in emotional stability and conscientiousness, and (c) social desirability does not function as a predictor, as a practically useful suppressor, or as a mediator variable for the criterion of job performance. Removing the effects of social desirability from the Big Five dimensions of personality leaves the criterion-related validity of personality constructs for predicting job performance intact.

A major concern of many industrial-organizational psychologists in using personality inventories in applied personnel selection settings has been the potential for response distortion (Hogan & Nicholson, 1988; Nunnally, …).
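The claim that partialling social desirability out of personality measures leaves criterion-related validity intact can be checked with the standard semipartial correlation formula. The sketch below is purely illustrative; all three correlations are assumed values, not estimates from the paper.

```python
from math import sqrt

# Semipartial correlation: validity of a personality scale P for criterion C
# after removing variance shared with a social desirability scale S from P.
# All three correlations are hypothetical values.
r_pc, r_ps, r_sc = 0.30, 0.40, 0.05

sr_pc = (r_pc - r_ps * r_sc) / sqrt(1 - r_ps ** 2)
print(round(r_pc, 2), round(sr_pc, 3))   # validity before vs. after partialling S out of P
```

When social desirability correlates only weakly with the criterion, as in this example, the partialled and unpartialled validities are nearly identical, which is the pattern the abstract describes.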