Bibliographic Record - Detail View
Field | Value |
---|---|
Authors | Shin, Jinnie; Gierl, Mark J. |
Title | Evaluating Coherence in Writing: Comparing the Capacity of Automated Essay Scoring Technologies |
Source | In: Journal of Applied Testing Technology, 23 (2022), pp. 4-20 (17 pages) |
PDF full text | |
Language | English |
Document type | print; online; journal article |
Keywords | Computer Assisted Testing; Scoring; Essays; Automation; Artificial Intelligence; Scores; Models; Prompting; Prediction |
Abstract | Automated Essay Scoring (AES) technologies provide innovative solutions for scoring written essays in a much shorter time span and at a fraction of the current cost. Traditionally, AES has emphasized the importance of capturing the "coherence" of writing because abundant evidence links coherence to overall writing quality; yet limited studies have investigated the capacity of modern and traditional automated essay scoring technologies to capture sequential information (i.e., cohesion). In this study, we investigate the performance of traditional and modern AES systems in attribute-specific scoring. Traditional AES focuses on holistic scoring, with limited application to attribute-specific scoring. Hence, the current study focuses on understanding whether a deep-neural AES system using a convolutional neural network approach can provide better performance in attribute-specific essay scoring than a traditional feature-based AES system in capturing coherence scores in essays. Our findings indicate that the deep-neural AES model showed improved accuracy in predicting coherence-related score categories. Implications for the scoring capacity of the two models are also discussed. (As Provided.) |
Notes | Association of Test Publishers. 601 Pennsylvania Avenue NW, South Building Suite 900, Washington DC 20004. Tel: 866-240-7909; Fax: 717-755-8962; e-mail: wgh.atp@att.net; Web site: https://jattjournal.net/index.php/atp/index |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Updated | 2024/01/01 |
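The abstract contrasts a feature-based AES system with a deep-neural one built on convolutions over the essay text. As a rough illustration of the convolutional idea only (not the authors' model; the window size, embeddings, and weights below are all hypothetical), a CNN-style scorer slides a filter over word-embedding vectors, max-pools the resulting features over time, and maps the pooled feature to a score:

```python
# Toy sketch of the CNN-based AES idea: convolve a filter over word
# embeddings, max-pool over time, then apply a linear scoring layer.
# All names, sizes, and weights here are illustrative assumptions.

def conv1d(seq, kernel):
    """Valid 1-D convolution over a sequence of word-embedding vectors."""
    k = len(kernel)
    out = []
    for i in range(len(seq) - k + 1):
        window = seq[i:i + k]
        # dot product of the flattened window with the flattened kernel
        out.append(sum(w * x for vec, kvec in zip(window, kernel)
                       for x, w in zip(vec, kvec)))
    return out

def score_essay(embeddings, kernel, weight, bias):
    """Max-pool the convolved features and apply a linear scoring layer."""
    features = conv1d(embeddings, kernel)
    pooled = max(features)          # max-over-time pooling
    return weight * pooled + bias   # scalar attribute (e.g. coherence) score

# Toy example: five "words" with 2-dimensional embeddings, window size 2.
essay = [[0.1, 0.3], [0.2, 0.1], [0.4, 0.4], [0.0, 0.2], [0.3, 0.1]]
kernel = [[1.0, 0.5], [0.5, 1.0]]
print(round(score_essay(essay, kernel, weight=2.0, bias=0.1), 3))
```

In a real system the essay would be embedded with learned vectors, many filters would be trained jointly with the scoring layer, and the output would feed a score-category prediction rather than a single linear unit.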