Bibliographic Record - Detail View

Author: Song, Yang
Title: Data Sharing in Peer-Assessment Systems for Education
Source: Ph.D. Dissertation, North Carolina State University, 2017 (101 pages)
Availability: PDF full text
Language: English
Document type: print; online; monograph
ISBN: 978-0-3556-3545-4
Keywords: Thesis; Dissertation; Data; Shared Resources and Services; Peer Evaluation; Information Storage; Hypermedia; Databases; Serial Ordering; Sociometric Techniques
Abstract: Fifty years of research has found great potential for peer assessment as a pedagogical approach. With peer assessment, not only do students receive more copious assessments; they also learn to become assessors. In recent decades, more educational peer assessments have been facilitated by online systems. These online systems are designed to suit different class settings and student groups, so their designs vary widely: rating-based or ranking-based, reviews assigned randomly or to fixed groups, anonymous or onymous review, etc. Although each of these systems has many users, there is a dearth of comparisons between the different designs. This is mainly because the data generated by peer assessment systems is stored and analyzed separately; there is no standard for data sharing in this research community. In this work, we focus on data sharing between educational peer assessment systems. We designed a Peer-Review Markup Language (PRML) as a generic data schema for modeling and sharing the data generated by different educational peer assessment systems. Based on PRML, a data warehouse can be built; different systems can ETL (Extract, Transform, and Load) their data, contribute it to the common data warehouse, and share it with other researchers. Making use of data shared by different peer assessment systems can help researchers answer more general research questions, e.g., are reviewers more reliable in ranking-based or rating-based peer assessment? To answer this question, we designed algorithms that evaluate assessors' reliability by comparing their ratings/rankings against the global ranks of the artifacts they have reviewed. These algorithms are suitable for data from both rating-based and ranking-based peer assessment systems. The experiments were conducted on more than 15,000 peer assessments from multiple peer assessment systems. We found that assessors in ranking-based peer assessments are more reliable than assessors in rating-based peer assessments. Further analysis also demonstrated that assessors in ranking-based assessments tend to assess the more differentiable artifacts correctly, but there is no such pattern for rating-based assessors. Another research question that can be answered with this shared data is: how does collusion harm the peer review process? Ideally, if only a small number of students try to "game" the peer assessment process, the overall validity will not be affected much. However, one researcher found from experience that more students became colluders over the course of a semester: they gave each other high scores or, even worse, gave high scores to every artifact they reviewed. In the worst case, a large number of colluders may make the honest reviewers look like outliers, which harms the validity of peer assessment. We defined two different patterns of possible collusion and applied graph mining algorithms to detect the colluders in the data shared with us. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 800-521-0600. Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.] (As Provided)
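The abstract names PRML without reproducing its schema. As a rough illustration of the underlying idea, a generic, system-neutral record shape that each system's ETL job maps its native data onto before loading it into the shared warehouse, consider this Python sketch (all field names here are assumptions for illustration, not the dissertation's actual schema):

# Hypothetical sketch of a PRML-style generic peer-assessment record.
# Field names are assumptions; the real PRML schema may differ.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PeerAssessmentRecord:
    system: str                  # originating peer-assessment system
    assignment_id: str
    reviewer_id: str             # anonymized reviewer
    author_id: str               # anonymized author of the artifact
    artifact_id: str
    mode: str                    # "rating" or "ranking"
    value: float                 # rating given, or rank position assigned
    scale_max: Optional[float] = None  # rating-scale ceiling; None for rankings

An ETL job for a particular system would then amount to a function from that system's native tables to a list of such records.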
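The reliability algorithms are likewise only summarized. A minimal sketch of the general approach, scoring an assessor by how well their ratings or rankings agree with the global ranks of the artifacts they reviewed, might look like the following (the agreement measure and all names are assumptions, not the dissertation's actual algorithm):

# Hypothetical sketch: assessor reliability as rank agreement with global ranks.
from itertools import combinations

def rank_agreement(xs, ys):
    """Kendall-style agreement in [-1, 1] between two equal-length score
    lists; tied pairs are ignored (Goodman-Kruskal gamma)."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    pairs = concordant + discordant
    return (concordant - discordant) / pairs if pairs else 0.0

def assessor_reliability(assessor_scores, global_ranks):
    """assessor_scores: {artifact_id: score, higher = better}; for
    ranking-based data, negate the ranks first so higher = better.
    global_ranks: {artifact_id: consensus rank, 1 = best}."""
    common = sorted(set(assessor_scores) & set(global_ranks))
    given = [assessor_scores[a] for a in common]
    consensus = [-global_ranks[a] for a in common]  # negate: rank 1 is best
    return rank_agreement(given, consensus)

Because only the ordering matters, the same measure applies to rating-based and ranking-based data, which mirrors the abstract's claim that one set of algorithms covers both kinds of systems.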
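For the collusion analysis, one plausible reading of the "gave each other high scores" pattern is a reciprocal-edge check on the review graph. The sketch below is an illustration under that assumption, not the dissertation's actual graph mining algorithm; the threshold and names are invented:

# Hypothetical sketch: flag reciprocal high-scoring pairs as collusion candidates.
from collections import defaultdict

HIGH = 0.9  # assumed threshold on scores normalized to [0, 1]

def collusion_candidates(reviews):
    """reviews: iterable of (reviewer_id, author_id, normalized_score).
    Returns pairs of users who each gave the other a score above HIGH."""
    high_edges = defaultdict(set)   # reviewer -> {authors given a high score}
    for reviewer, author, score in reviews:
        if score >= HIGH:
            high_edges[reviewer].add(author)
    pairs = set()
    for a, targets in high_edges.items():
        for b in targets:
            if a in high_edges.get(b, set()) and a < b:  # a < b dedupes (a,b)/(b,a)
                pairs.add((a, b))
    return pairs

The second pattern mentioned in the abstract, reviewers who give high scores to everything they review, could be caught with a per-reviewer score-distribution check rather than graph structure.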
Notes: ProQuest LLC, 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Update: 2020/1/01