Bibliographic Record - Detail View
Author | Wise, Lauress |
---|---|
Institution | Defense Manpower Data Center, Monterey, CA. |
Title | Scoring Rubrics for Performance Tests: Lessons Learned from Job Performance Assessment in the Military. |
Source | (1993), (16 pages) |
Language | English |
Document type | print; online; monograph |
Keywords | Educational Assessment; Educational Research; Evaluation Methods; Generalizability Theory; Industrial Psychology; Job Performance; Military Personnel; Occupational Tests; Organizational Development; Performance Based Assessment; Performance Tests; Personnel Evaluation; Personnel Selection; Scoring Rubrics; Standards; Test Construction; Training Education; Assessment; Scoring Systems; Work Performance; Occupational Aptitude Testing; Performance Measurement; Performance Appraisal; Performance Review; Personnel Decisions; Scoring Formulas; Scoring Sheets; Training |
Abstract | Industrial and organizational psychologists for the Department of Defense have been working for the past 10 years to develop high-fidelity measures of job performance for use in validating job selection procedures and standards. Information on developing and scoring performance exercises in the Job Performance Measurement (JPM) Project is presented, and lessons that might be useful in education are extracted. In many ways, the task of the industrial psychologist is easier than that of the educator because of broader agreement about how the task should be performed and close alignment between training and expected performance. Tasks identified by each Armed Service were analyzed, and scoring rules were developed. The following lessons seem especially pertinent to educational assessment: (1) careful specification of the domains assessed is essential for evaluating the adequacy of any sample selected; (2) scoring elements that assess adherence to processes that are taught will have better diagnostic value (and possibly greater validity) than will those that just reflect the quality of output; (3) scoring procedures must be anchored to observable criteria; and (4) generalizability theory provides a useful framework for evaluating alternative scoring rubrics. One table lists the JPM occupational specialties, and two figures illustrate the discussion. An attachment summarizes the lessons to be learned. (SLD) |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |