Abstract

Recent developments in computer-automated scoring (CAS) systems offer a promising alternative to human experts for scoring responses to open-ended questions on structured employment interviews. Despite advances in the field, additional research is needed to evaluate the quality of scores generated by modern CAS systems and the factors affecting their performance. This study investigates the impact of item type, scoring rubric, and training data set on the quality of scores generated by three different CAS systems on a sixty-item structured employment interview developed by a prominent testing organization. The degree of agreement between the scores given by human experts and those given by the CAS systems on each of the sixty items served as the measure of the quality of CAS-generated scores. It was first hypothesized that the quality of CAS-generated scores would be higher on less complex item types. Although significant differences in the quality of CAS scores were found as a function of item type, the pattern of differences did not support this hypothesis. Second, it was hypothesized that the quality of CAS scores would be higher on items for which human experts used less complex scoring rubrics to evaluate applicants' responses. The results provided no evidence of the hypothesized relationship. Finally, it was hypothesized that the quality of CAS scores would be higher on items whose training data set included a comparable number of responses in each of the two possible score categories. The results supported this hypothesis for two of the three systems studied. The findings provide insight into the issues and factors that should be considered in future research on CAS systems.
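
The abstract does not specify which agreement statistic was used to compare human and CAS scores. As a minimal illustration only, not the dissertation's method, the Python sketch below computes two common human-machine agreement measures, percent agreement and Cohen's kappa, for hypothetical binary (two-category) item scores of the kind the abstract describes; the score vectors are invented for the example.

from collections import Counter

def percent_agreement(human, cas):
    # Proportion of items on which the two raters assign the same score.
    assert len(human) == len(cas)
    return sum(h == c for h, c in zip(human, cas)) / len(human)

def cohens_kappa(human, cas):
    # Cohen's kappa: observed agreement corrected for chance agreement,
    # where chance is estimated from each rater's marginal score distribution.
    n = len(human)
    po = percent_agreement(human, cas)
    h_counts, c_counts = Counter(human), Counter(cas)
    categories = set(human) | set(cas)
    pe = sum((h_counts[k] / n) * (c_counts[k] / n) for k in categories)
    return (po - pe) / (1 - pe)

# Hypothetical 0/1 scores on ten interview items (illustrative data only).
human_scores = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
cas_scores   = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
print(f"percent agreement = {percent_agreement(human_scores, cas_scores):.2f}")
print(f"Cohen's kappa     = {cohens_kappa(human_scores, cas_scores):.2f}")

On these example vectors, observed agreement is 0.80 while kappa is about 0.58, which shows why chance-corrected statistics are often preferred when the two score categories are unbalanced, a concern directly related to the study's third hypothesis about balanced training data.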

Details

Title
Computer-automated scoring systems and structured employment interviews: An examination of the impact of item type, scoring rubric complexity, and training data on the quality of scores
Author
Juszkiewicz, Piotr J.
Year
2004
Publisher
ProQuest Dissertations Publishing
ISBN
978-0-496-92680-0
Source type
Dissertation or Thesis
Language of publication
English
ProQuest document ID
305161585
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.