Be Careful What You Ask For: Effects of Response Instructions on the Construct Validity and Reliability of Situational Judgment Tests
The aim of this study was to examine how six different types of situational judgment test (SJT) instructions, used frequently in practice, influence the psychometric characteristics of SJTs. The six SJT versions used exactly the same items and differed only in their instructions; these versions were administered in two phases. Phase I was a between-subjects design (N = 486) in which participants completed one version of the SJT. Phase II was a within-subjects design (N = 231), held several weeks later, in which participants completed all six versions of the SJT. Further, 146 of these individuals completed both phases, allowing for an assessment of test-retest reliability. A variety of objective and subjective criteria were collected, including self and peer ratings. Results indicated that instructions had a large effect on SJT responses, reliability, and validity. In general, instructions asking what one 'would do' showed more favorable characteristics than those asking what one 'should do'. Correlations between these two types were relatively low despite the use of identical items, and criterion-related validities differed substantially in favor of the 'would do' instructions. Overall, this study finds that researchers and practitioners must give careful consideration to the types of SJT instructions used; failing to do so could influence criterion-related validity and cloud inferences of construct validity.
| Year of publication | 2004 |
|---|---|
| Authors | Ployhart, Robert E.; Ehrhart, Mark G. |
| Publisher | [S.l.] : SSRN |
Similar items by person

- Ployhart, Robert E. (2003)
- Leader–member exchange and organizational climate effects on clinician turnover intentions / Aarons, Gregory A. (2020)
- Can test preparation help to reduce the black-white test performance gap? / Chung-Herrera, Beth G. (2009)