Bibliographic Record - Detail View
Author(s) | Hoang, Giang Thi Linh; Kunnan, Antony John |
---|---|
Titel | Automated Essay Evaluation for English Language Learners: A Case Study of "MY Access" |
Source | In: Language Assessment Quarterly, 13 (2016) 4, pp. 359-376 (18 pages) |
Full-text PDF |
Language | English |
Document type | print; online; journal article |
ISSN | 1543-4303 |
DOI | 10.1080/15434303.2016.1230121 |
Keywords | Case Studies; Essays; Writing Evaluation; English (Second Language); Second Language Learning; Second Language Instruction; Correlation; Error Analysis (Language); Feedback (Response); Editing; Computer Assisted Testing; Statistical Analysis; Content Analysis; Evaluators; Accuracy; Plagiarism; Cues; Scoring Rubrics; Test Validity; Vietnamese People; Asians; College Students; Foreign Students; Foreign Countries; Regression (Statistics); California |
Abstract | Computer technology made its way into writing instruction and assessment with spelling and grammar checkers decades ago, but more recently it has done so with automated essay evaluation (AEE) and diagnostic feedback. Although many programs and tools have been developed in the last decade, not enough research has been conducted to support or evaluate the claims of their developers. This study examined the effectiveness of automated writing instructional programs in scoring essays consistently and providing appropriate feedback to student writers. It examined the scoring and instructional program "MY Access! Home Edition," which includes an error feedback tool called "My Editor." The study combined a quantitative study of agreement and correlational analyses with an analysis of content and topic. Participants were 114 English language learners who wrote 147 essays in response to three writing prompts; the essays were graded by trained EFL raters and by "MY Access." From this sample, 15 randomly selected essays were also used for an error analysis comparing "My Editor" with human annotations to examine the accuracy of "My Editor." The main findings were that "MY Access" scores correlated only moderately with human ratings. Furthermore, because "MY Access" scoring is limited to the recognition of content words, not how these words are organized at the discourse level, it did not detect slightly off-topic essays or plagiarism. Finally, the error feedback from "My Editor," with 73% precision and 30% recall, did not meet the expectations of an accurate tool. In conclusion, the home edition of "MY Access" was not found to be useful as an independent instructional tool. These findings give us pause regarding the effectiveness of "MY Access." (As Provided.) |
Notes | Routledge. Available from: Taylor & Francis, Ltd., 325 Chestnut Street, Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Update | 2020/01/01 |