Bibliographic Record - Detail View
Author | Cantrell, Catherine E. |
---|---|
Title | Item Response Theory: Understanding the One-Parameter Rasch Model. |
Source | (1997), (42 pages) |
Full text | PDF |
Language | English |
Document type | print; online; monograph |
Keywords | Quantitative Data; Ability; Difficulty Level; Estimation (Mathematics); Item Response Theory; Mathematical Models; Prediction; Sampling; Tables (Data) |
Abstract | This paper discusses the limitations of Classical Test Theory, the purpose of Item Response Theory/Latent Trait Measurement models, and the step-by-step calculations in the Rasch measurement model. The paper explains how Item Response Theory (IRT) transforms person abilities and item difficulties onto the same metric, permitting test-independent and sample-independent comparisons. IRT rests on two postulates. First, an examinee's performance on a test item can be predicted by a set of factors called traits. Second, the relationship between examinee test performance and the traits underlying it can be defined by an item characteristic curve. There are three models in IRT. The three-parameter model comprises item discrimination, item difficulty, and guessing parameters. In the one-parameter Rasch model, the guessing and item discrimination parameters are considered negligible. This model is used to analyze differences in test scores that are not initially linear. In this study, a regression analysis was performed to find the correlation between Rasch calibrations and classical measurement calibrations and to plot a scatterplot of the two measures with their regression line. These results challenge the idea that Rasch latent trait measurement is superior to classical measurement because its estimates are item-free and sample-free. High correlations between the two measures indicate either that Rasch calibrations are not truly item- or sample-free or that the classical model calibrations are equally item- or sample-free. (Contains 12 tables, 6 figures, and 5 references.) (SLD) |
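The item characteristic curves the abstract refers to can be illustrated with a short sketch. This is not code from the paper; it is a minimal implementation of the standard logistic forms of the two models the abstract names, where `theta` is person ability, `b` is item difficulty, `a` is item discrimination, and `c` is the guessing parameter.

```python
import math

def rasch_icc(theta, b):
    """One-parameter (Rasch) model: probability that a person with
    ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def three_pl_icc(theta, b, a, c):
    """Three-parameter logistic model: adds item discrimination (a)
    and a lower asymptote for guessing (c)."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# When ability equals difficulty, the Rasch model predicts a 50% chance.
print(rasch_icc(0.0, 0.0))  # 0.5

# Fixing a = 1 and c = 0 reduces the 3PL model to the Rasch model,
# which is what "guessing and discrimination are negligible" means.
print(three_pl_icc(1.2, 0.4, 1.0, 0.0) == rasch_icc(1.2, 0.4))  # True
```

The reduction in the last line shows the relationship between the models: the Rasch model is the special case of the three-parameter model in which every item discriminates equally and guessing is ignored.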
Indexed by | ERIC (Education Resources Information Center), Washington, DC |