Bibliographic Record - Detail View

Authors: Matta, Michael; Mercer, Sterett H.; Keller-Margulis, Milena A.
Title: Implications of Bias in Automated Writing Quality Scores for Fair and Equitable Assessment Decisions
Source: In: School Psychology, 38 (2023) 3, pp. 173-181 (9 pages)
Full text: available as PDF
Additional information: ORCID (Matta, Michael)
Language: English
Document type: print; online; journal article
ISSN: 2578-4218
DOI: 10.1037/spq0000517
Keywords: Bias; Automation; Writing Evaluation; Scoring; Writing Tests; Elementary School Students; Middle School Students; Grade 4; Grade 7; Predictive Validity; Essays
Abstract: Recent advances in automated writing evaluation have enabled educators to use automated writing quality scores to improve assessment feasibility. However, there has been limited investigation of bias for automated writing quality scores with students from diverse racial or ethnic backgrounds. The use of biased scores could contribute to implementing unfair practices with negative consequences on student learning. The goal of this study was to investigate score bias of writeAlizer, a free and open-source automated writing evaluation program. For 421 students in Grades 4 and 7 who completed a state writing exam that included composition and multiple-choice revising and editing questions, writeAlizer was used to generate automated writing quality scores for the composition section. Then, we used multiple regression models to investigate whether writeAlizer scores demonstrated differential predictions of the composition and overall scores on the state-mandated writing exam for students from different racial or ethnic groups. No evidence of bias for automated scores was observed. However, after controlling for automated scores in Grade 4, we found statistically significant group differences in regression models predicting overall state test scores 3 years later but not the essay composition scores. We hypothesize that the multiple-choice revising and editing sections, rather than the scoring approach used for the essay portion, introduced construct-irrelevant variance and might lead to differential performance among groups. Implications for assessment development and score use are discussed. [For the corresponding grantee submission, see ED628830.] (As Provided)
Notes: American Psychological Association. Journals Department, 750 First Street NE, Washington, DC 20002. Tel: 800-374-2721; Tel: 202-336-5510; Fax: 202-336-5502; e-mail: order@apa.org; Web site: http://www.apa.org
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Updated: 2024/01/01
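The abstract describes testing for score bias via differential prediction: regressing state exam scores on the automated writing quality scores and checking whether adding racial/ethnic group terms (intercepts and slopes) improves prediction. Below is a minimal sketch of that style of analysis in Python. The data are synthetic and the column names, group labels, and model specification are assumptions for illustration; they are not the study's actual models, covariates, or writeAlizer output.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 421  # sample size reported in the abstract

# Synthetic stand-ins: standardized automated score, hypothetical group
# labels, and a simulated state exam score correlated with the automated score.
auto = rng.normal(0.0, 1.0, n)
group = rng.choice(["A", "B", "C"], size=n)
state = 0.7 * auto + rng.normal(0.0, 0.7, n)

df = pd.DataFrame({"state": state, "auto": auto, "group": group})

# Reduced model: state exam score predicted from the automated score alone.
reduced = smf.ols("state ~ auto", data=df).fit()

# Full model: add group-specific intercepts and group x score interactions.
full = smf.ols("state ~ auto * C(group)", data=df).fit()

# Nested-model F test: a non-significant result means the group terms add no
# predictive power, i.e., no evidence of differential prediction (bias).
print(anova_lm(reduced, full))
```

A non-significant F statistic here corresponds to the study's "no evidence of bias" finding for the automated scores; a significant one would indicate group differences in intercepts or slopes, as the study reports for the overall state test scores three years later.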