Bibliographic record - detail view
| Field | Value |
|---|---|
| Authors | Zwick, Rebecca; et al. |
| Institution | Educational Testing Service, Princeton, NJ |
| Title | Assessing Differential Item Functioning in Performance Tests |
| Source | 1993; 45 pages |
| PDF full text | |
| Language | English |
| Document type | print; online; monograph |
| Keywords | Educational Assessment; Item Bias; Multiple Choice Tests; Performance Based Assessment; Test Items; Test Reliability; Test Validity |
| Abstract | Although the belief has been expressed that performance assessments are intrinsically fairer than multiple-choice measures, some forms of performance assessment may in fact be more likely than conventional tests to tap construct-irrelevant factors. As performance assessment grows in popularity, it will be increasingly important to monitor the validity and fairness of alternative item types. The assessment of differential item functioning (DIF), as one component of this evaluation, can be helpful in investigating the effect on subpopulations of the introduction of performance tasks. Developing a DIF analysis strategy for performance measures requires decisions as to how the matching variable should be defined and how the analysis procedure should accommodate polytomous responses. In this study, two inferential procedures, extensions of the Mantel-Haenszel procedure, and two types of descriptive summaries that may be useful in assessing DIF in performance measures were explored and applied to simulated data. All the investigated statistical methods appear to be worthy of further study. Nine tables present analysis results. (Contains 32 references.) (Author/SLD) |
| Indexed by | ERIC (Education Resources Information Center), Washington, DC |
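The abstract refers to extensions of the Mantel-Haenszel procedure for polytomous performance items. As background, the following is a minimal sketch of the standard dichotomous Mantel-Haenszel DIF statistic that those extensions build on: examinees are stratified by a matching score, and each stratum contributes a 2x2 table of group (reference/focal) by item response (correct/incorrect). The counts below are illustrative, not taken from the report.

```python
# Sketch of the dichotomous Mantel-Haenszel DIF statistic (not the
# report's polytomous extensions). All counts are invented for illustration.

def mantel_haenszel_dif(strata):
    """strata: list of (A, B, C, D) tuples, one per matching-score level:
    A = reference-group correct, B = reference-group incorrect,
    C = focal-group correct,     D = focal-group incorrect.
    Returns (common odds ratio alpha_MH, continuity-corrected MH chi-square).
    """
    num = den = a_sum = e_sum = v_sum = 0.0
    for A, B, C, D in strata:
        T = A + B + C + D
        if T < 2:
            continue  # a stratum this sparse contributes nothing usable
        num += A * D / T
        den += B * C / T
        n_ref, n_foc = A + B, C + D          # group sizes in this stratum
        m_right, m_wrong = A + C, B + D      # item-response margins
        a_sum += A
        e_sum += n_ref * m_right / T         # E[A] under the no-DIF hypothesis
        v_sum += (n_ref * n_foc * m_right * m_wrong
                  / (T * T * (T - 1)))       # Var[A] (hypergeometric)
    alpha_mh = num / den
    chi2 = (abs(a_sum - e_sum) - 0.5) ** 2 / v_sum  # continuity correction
    return alpha_mh, chi2

# Toy example: three score strata with an advantage for the reference group.
strata = [(30, 20, 20, 30), (40, 10, 30, 20), (45, 5, 35, 15)]
alpha, chi2 = mantel_haenszel_dif(strata)
print(round(alpha, 2), round(chi2, 2))  # → 2.71 13.43
```

An alpha_MH above 1 indicates the item favors the reference group after matching; the chi-square is referred to a 1-df distribution. The report's contribution lies in adapting this idea to polytomous (multi-category) responses, which this dichotomous sketch does not cover.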