Bibliographic record - detail view
Author | Zhang, Zhonghua |
---|---|
Titel | Estimating Standard Errors of IRT True Score Equating Coefficients Using Imputed Item Parameters |
Source | In: Journal of Experimental Education, 90 (2022) 3, pp. 760-782 (23 pages) |
Full text | PDF |
Language | English |
Document type | print; online; journal article |
ISSN | 0022-0973 |
DOI | 10.1080/00220973.2020.1751579 |
Keywords | Error of Measurement; Item Response Theory; True Scores; Equated Scores; Data Analysis; Error Patterns; Item Analysis; Measurement Techniques |
Abstract | Reporting standard errors of equating has been advocated as a standard practice when conducting test equating. The two most widely applied procedures for standard errors of equating, the bootstrap method and the delta method, are either computationally intensive or require the derivation of complicated formulas. In the current study, a hypothetical example was used to illustrate how the multiple imputation method can serve as an alternative procedure for obtaining the standard errors of the item response theory (IRT) true score equating coefficients in the context of the common-item nonequivalent groups equating design under the three-parameter logistic IRT model. This method makes use of multiple sets of imputed item parameter values. Using simulated and real data, the performance of the multiple imputation method was examined and compared with that of the bootstrap and delta methods. The results indicated that the multiple imputation method performed as effectively as the bootstrap method and the delta method when the characteristic curve methods were used. The multiple imputation method produced very similar results to the delta method when the moment methods were used. (As Provided). |
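The multiple-imputation idea summarized in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes hypothetical item difficulty estimates with known standard errors on two forms, draws M imputed parameter sets from independent normal sampling distributions, computes mean/sigma (moment-method) equating coefficients for each set, and takes the across-imputation standard deviation as the standard error of each coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical difficulty (b) estimates for 5 common items on Form X
# and Form Y, with assumed standard errors of estimation.
b_x = np.array([-1.0, -0.2, 0.3, 0.8, 1.5]); se_b_x = np.full(5, 0.10)
b_y = np.array([-0.7,  0.1, 0.6, 1.1, 1.9]); se_b_y = np.full(5, 0.10)

def mean_sigma_coefficients(b_from, b_to):
    """Mean/sigma (moment) method: slope A and intercept B mapping
    the 'from' difficulty scale onto the 'to' scale."""
    A = np.std(b_to, ddof=1) / np.std(b_from, ddof=1)
    B = np.mean(b_to) - A * np.mean(b_from)
    return A, B

M = 1000  # number of imputed item-parameter sets
coefs = np.empty((M, 2))
for m in range(M):
    # Impute item parameters by drawing from their (assumed
    # independent) normal sampling distributions.
    bx_m = rng.normal(b_x, se_b_x)
    by_m = rng.normal(b_y, se_b_y)
    coefs[m] = mean_sigma_coefficients(bx_m, by_m)

# Across-imputation variability is taken as the standard error.
A_se, B_se = coefs.std(axis=0, ddof=1)
print(f"SE(A) = {A_se:.4f}, SE(B) = {B_se:.4f}")
```

In contrast to the bootstrap, which resamples examinees and re-estimates the IRT model in every replication, each imputation here only redraws item parameters and recomputes the equating coefficients, which is far cheaper.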
Notes | Routledge. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Update | 2022/4/11 |