Bibliographic record - detail view
Authors | Alahmadi, Sarah; Jones, Andrew T.; Barry, Carol L.; Ibáñez, Beatriz |
---|---|
Title | Comparing Drift Detection Methods for Accurate Rasch Equating in Different Sample Sizes |
Source | In: Applied Measurement in Education, 36 (2023) 2, pp. 157-170 (14 pages) |
Additional information | ORCID (Alahmadi, Sarah) |
Language | English |
Document type | print; online; journal article |
ISSN | 0895-7347 |
DOI | 10.1080/08957347.2023.2201704 |
Keywords | Equated Scores; Item Response Theory; Sample Size; Test Items; Statistical Analysis; Computation; Classification; Accuracy; High Stakes Tests; Licensing Examinations (Professions) |
Abstract | Rasch common-item equating is often used in high-stakes testing to maintain equivalent passing standards across test administrations. If unaddressed, item parameter drift poses a major threat to the accuracy of Rasch common-item equating. We compared the performance of well-established and newly developed drift detection methods in small and large sample sizes, varying the proportion of test items used as anchor (common) items and the proportion of drifted anchors. In the simulated-data study, the most accurate equating was obtained in large-sample conditions with a small-moderate number of drifted anchors using the mINFIT/mOUTFIT methods. However, when any drift was present in small-sample conditions and when a large number of drifted anchors were present in large-sample conditions, all methods performed ineffectively. In the operational-data study, percent-correct standards and failure rates varied across the methods in the large-sample exam but not in the small-sample exam. Different recommendations for high- and low-volume testing programs are provided. (As Provided). |
Notes | Routledge. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Update | 2024/01/01 |
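The anchor-drift screening evaluated in the abstract can be illustrated with a minimal sketch of mean-mean Rasch linking with iterative anchor purification. This is an assumption-laden illustration only: the function names, the 0.5-logit displacement threshold, and the toy item difficulties are invented here, and the approach shown is a simple displacement rule, not the mINFIT/mOUTFIT methods the article studies.

```python
# Illustrative sketch (not the article's procedure): mean-mean
# linking of two Rasch-calibrated forms plus a simple
# displacement-based anchor-drift screen.

def mean_mean_constant(old_b, new_b, anchors):
    """Mean-mean linking: the shift that places new-form anchor
    difficulties on the old form's logit scale."""
    diffs = [old_b[i] - new_b[i] for i in anchors]
    return sum(diffs) / len(diffs)

def screen_drift(old_b, new_b, anchors, threshold=0.5):
    """Iteratively drop the anchor with the largest displacement
    |old - (new + link)| until all remaining anchors fall within
    the threshold (0.5 logits is an assumed rule of thumb)."""
    kept = list(anchors)
    while len(kept) > 1:
        c = mean_mean_constant(old_b, new_b, kept)
        disp = {i: abs(old_b[i] - (new_b[i] + c)) for i in kept}
        worst = max(disp, key=disp.get)
        if disp[worst] <= threshold:
            break
        kept.remove(worst)
    return kept, mean_mean_constant(old_b, new_b, kept)

# Toy data: 5 anchors; the new form's scale is shifted by -0.2
# logits overall, and item 4 has drifted by an extra 1.0 logit.
old = {1: -1.0, 2: -0.5, 3: 0.0, 4: 0.5, 5: 1.0}
new = {1: -0.8, 2: -0.3, 3: 0.2, 4: 1.7, 5: 1.2}
kept, link = screen_drift(old, new, anchors=[1, 2, 3, 4, 5])
# kept == [1, 2, 3, 5]: the drifted anchor is removed, and the
# linking constant recovers the true -0.2 shift.
```

The design point the abstract makes is visible even in this toy: with a drifted anchor included, the linking constant is biased (-0.4 instead of -0.2), and screening restores it, but with many drifted anchors or small samples such screens become unreliable.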