Bibliographic record - detail view

 
Authors: Verger, Mélina; Lallé, Sébastien; Bouchet, François; Luengo, Vanda
Title: Is Your Model "MADD"? A Novel Metric to Evaluate Algorithmic Fairness for Predictive Student Models
[Conference paper] Paper presented at the International Conference on Educational Data Mining (EDM) (16th, Bengaluru, India, Jul 11-14, 2023).
Source: (2023), (12 pages)
Availability: PDF full text, freely available
Language: English
Document type: printed; online; monograph
Keywords: Prediction; Models; Student Behavior; Academic Achievement; Online Courses; Algorithms; Bias; Foreign Countries; STEM Education; Social Studies; College Students; United Kingdom
Abstract: Predictive student models are increasingly used in learning environments due to their ability to enhance educational outcomes and support stakeholders in making informed decisions. However, predictive models can be biased and produce unfair outcomes, leading to potential discrimination against some students and possible harmful long-term implications. This has prompted research on fairness metrics meant to capture and quantify such biases. Nonetheless, so far, existing fairness metrics used in education are predictive performance-oriented, focusing on assessing biased outcomes across groups of students, without considering the behaviors of the models or the severity of the biases in the outcomes. Therefore, we propose a novel metric, the Model Absolute Density Distance (MADD), to analyze models' discriminatory behaviors independently from their predictive performance. We also provide a complementary visualization-based analysis to enable fine-grained human assessment of how the models discriminate between groups of students. We evaluate our approach on the common task of predicting student success in online courses, using several common predictive classification models on an open educational dataset. We also compare our metric to ABROCA, the only predictive performance-oriented fairness metric developed in education. Results on this dataset show that: (1) fair predictive performance does not guarantee fair model behaviors and thus fair outcomes; (2) there is no direct relationship between data bias and predictive performance bias or discriminatory behavior bias; and (3) trained on the same data, models exhibit different discriminatory behaviors, including with respect to different sensitive features. We therefore recommend using the MADD on models that already show satisfying predictive performance, to gain a finer-grained understanding of how they behave and toward whom, and to refine model selection and usage. Altogether, this work contributes to advancing the research on fair student models in education. Source code and data are in open access at https://github.com/melinaverger/MADD. [For the complete proceedings, see ED630829.] (As Provided).
Notes: International Educational Data Mining Society. E-mail: admin@educationaldatamining.org; Web site: https://educationaldatamining.org/conferences/
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Last update: 2024/1/01
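
The abstract above names the MADD metric but gives no formula. The following Python sketch is one plausible reading, not the authors' definition: it assumes MADD compares the normalized histogram densities of a model's predicted success probabilities for two groups of students and sums their absolute differences, so the value ranges from 0 (identical distributions) to 2 (fully disjoint ones). The function name madd_sketch, the variables proba_g0 and proba_g1, the bin count, and the placeholder data are all illustrative assumptions; the authors' actual definition and implementation are in the repository linked in the abstract (https://github.com/melinaverger/MADD).

import numpy as np

def madd_sketch(proba_g0, proba_g1, n_bins=100):
    """Hedged sketch of a MADD-style comparison (assumed reading, see note above).

    Compares how a model's predicted probabilities are distributed for two
    groups of students, independently of how accurate those predictions are.
    """
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    # Normalized bin frequencies over [0, 1]; each density vector sums to 1.
    d0, _ = np.histogram(proba_g0, bins=bins)
    d1, _ = np.histogram(proba_g1, bins=bins)
    d0 = d0 / d0.sum()
    d1 = d1 / d1.sum()
    # Sum of absolute density differences: 0 = identical, 2 = fully disjoint.
    return float(np.abs(d0 - d1).sum())

# Hypothetical usage: predicted success probabilities for two groups split on a
# sensitive feature (e.g., gender), produced by any fitted classifier.
rng = np.random.default_rng(0)
p_group0 = rng.beta(5, 2, size=500)  # placeholder predictions, group 0
p_group1 = rng.beta(4, 3, size=500)  # placeholder predictions, group 1
print(round(madd_sketch(p_group0, p_group1), 3))

A distribution-level comparison like this is performance-agnostic by design, which matches the abstract's point that fair predictive performance across groups does not guarantee that a model treats the groups alike; whether this particular histogram formulation matches the published MADD should be checked against the linked source code.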
