Bibliographic record - detail view
Author(s) | Omarov, Nazarbek Bakytbekovich; Mohammed, Aisha; Alghurabi, Ammar Muhi Khleel; Alallo, Hajir Mahmood Ibrahim; Ali, Yusra Mohammed; Hassan, Aalaa Yaseen; Demeuova, Lyazat; Viktorovna, Shvedova Irina; Nazym, Bekenova; Al Khateeb, Nashaat Sultan Afif
Title | Distractor Analysis in Multiple-Choice Items Using the Rasch Model
Source | In: International Journal of Language Testing, 13 (2023), pp. 69-78 (10 pages)
Full text | PDF
Additional information | ORCID (Omarov, Nazarbek Bakytbekovich); ORCID (Nazym, Bekenova)
Language | English
Document type | print; online; journal article
Keywords | Test Items; Multiple Choice Tests; Item Response Theory; English (Second Language); Second Language Learning; Second Language Instruction; Second Language Acquisition; Language Tests; Undergraduate Students; High Stakes Tests; Higher Education; Grammar; Test Construction; Test Content; Advanced Courses; Language Teachers; Test Reliability; Difficulty Level; Item Analysis; Foreign Countries; Iraq
Abstract | The multiple-choice (MC) item format is commonly used in educational assessments because it is economical and effective across a variety of content domains. However, numerous studies examining the quality of MC items in high-stakes and higher-education assessments have found many flawed items, especially in terms of their distractors. Such faulty items yield misleading insights into student performance and distort the decisions based on them. Distractor analysis is therefore routinely conducted for multiple-choice assessments to ensure that high-quality items serve as the basis of inference. Item response theory (IRT) and Rasch models, however, have received little attention as tools for analyzing distractors. For that reason, the purpose of the present study was to apply the Rasch model to a grammar test in order to analyze the test items' distractors. To this end, the study investigated the quality of 10 instructor-written MC grammar items used in an undergraduate final exam, drawing on the item responses of 310 English as a foreign language (EFL) students who had taken an advanced grammar course. The results showed an acceptable fit to the Rasch model and high reliability, and malfunctioning distractors were identified. (As Provided).
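The abstract describes flagging malfunctioning distractors from examinees' option choices. The study itself fits a full Rasch model with dedicated IRT software; as a minimal, hedged illustration only, the sketch below uses invented toy data (not the study's 310-student dataset) and the simpler classical heuristics behind the idea: a distractor is suspect if almost nobody picks it, or if its choosers score at least as well on the whole test as those picking the key. A crude Rasch-style difficulty (the logit of the proportion answering incorrectly) is included for orientation.

```python
from math import log

def distractor_analysis(responses, key, options="ABCD", min_share=0.05):
    """Classical distractor screening for MC items.

    responses: list of per-examinee option choices (one letter per item).
    key:       list of correct options, one per item.
    Flags a distractor if its choice share is below min_share, or if its
    choosers' mean total score is >= that of examinees choosing the key.
    """
    n = len(responses)
    # Total test score per examinee (number of items answered correctly).
    totals = [sum(r[i] == k for i, k in enumerate(key)) for r in responses]
    report = []
    for i, correct in enumerate(key):
        stats = {}
        for opt in options:
            scores = [t for r, t in zip(responses, totals) if r[i] == opt]
            share = len(scores) / n
            mean = sum(scores) / len(scores) if scores else None
            stats[opt] = (share, mean)
        key_share, key_mean = stats[correct]
        flagged = [o for o, (share, mean) in stats.items()
                   if o != correct and (share < min_share or
                       (mean is not None and mean >= key_mean))]
        # Crude Rasch-style difficulty: logit of the proportion incorrect.
        difficulty = (log((1 - key_share) / key_share)
                      if 0 < key_share < 1 else None)
        report.append({"stats": stats, "flagged": flagged,
                       "difficulty": difficulty})
    return report

# Toy data: six examinees answering three four-option items.
key = ["B", "C", "A"]
responses = [
    ["B", "C", "A"],
    ["B", "C", "A"],
    ["B", "C", "B"],
    ["B", "A", "A"],
    ["D", "C", "B"],
    ["A", "A", "D"],
]
for i, item in enumerate(distractor_analysis(responses, key)):
    print(f"item {i}: flagged distractors {item['flagged']}, "
          f"difficulty {item['difficulty']:+.3f} logits")
```

Here never-chosen options (e.g. "C" on the first item) are flagged as nonfunctional, mirroring the paper's point that such distractors add no information; a Rasch analysis additionally checks that each distractor's choosers sit lower on the latent ability scale.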
Notes | Tabaran Institute of Higher Education. Shariati 60, Shariati Blvd, Ghasem Abad, Mashhad, Khorasan Razavi, Iran. Tel: +98 (51) 35227215; e-mail: ijlt@tabaran.ac.ir; Web site: http://www.ijlt.ir/
Indexed by | ERIC (Education Resources Information Center), Washington, DC
Updated | 2024/01/01