Bibliographic record - detail view
Authors | Ilagan, Michael John; Falk, Carl F. |
---|---|
Title | Supervised Classes, Unsupervised Mixing Proportions: Detection of Bots in a Likert-Type Questionnaire |
Source | In: Educational and Psychological Measurement, 83 (2023) 2, pp. 217-239 (23 pages) |
Full text | PDF |
Additional information | ORCID (Falk, Carl F.) |
Language | English |
Document type | print; online; journal article |
ISSN | 0013-1644 |
DOI | 10.1177/00131644221104220 |
Keywords | Likert Scales; Questionnaires; Artificial Intelligence; Identification; Computer Mediated Communication; Accuracy; Models; Item Response Theory; Classification |
Abstract | Administering Likert-type questionnaires to online samples risks contamination of the data by malicious computer-generated random responses, also known as bots. Although nonresponsivity indices (NRIs) such as person-total correlations or Mahalanobis distance have shown great promise for detecting bots, universal cutoff values are elusive. An initial calibration sample, constructed via stratified sampling of bots and humans (real or simulated under a measurement model), has been used to empirically choose cutoffs with a high nominal specificity. However, a high-specificity cutoff is less accurate when the target sample has a high contamination rate. In the present article, we propose the supervised classes, unsupervised mixing proportions (SCUMP) algorithm, which chooses a cutoff to maximize accuracy. SCUMP uses a Gaussian mixture model to estimate, unsupervised, the contamination rate in the sample of interest. A simulation study found that, in the absence of model misspecification on the bots, our cutoffs maintained accuracy across varying contamination rates. (As Provided) |
Notes | SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: https://sagepub.com |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Updated | 2024/01/01 |
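The abstract describes the core idea of SCUMP: fit a two-component Gaussian mixture to a nonresponsivity index (NRI), estimate the bot (contamination) proportion without labels, and then choose the cutoff that maximizes expected classification accuracy under the fitted mixture. The following is a minimal pure-Python sketch of that idea, not the authors' implementation: the NRI here is an illustrative person-total-correlation-like score, the distributional parameters are invented, and the assumption that bots form the lower-scoring component is ours.

```python
import math
import random

def norm_pdf(x, mu, s):
    """Density of N(mu, s^2) at x."""
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def norm_cdf(x, mu, s):
    """P(X <= x) for X ~ N(mu, s^2)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (s * math.sqrt(2.0))))

def fit_two_component_gmm(scores, iters=200):
    """EM for a two-component 1-D Gaussian mixture (unsupervised).

    Returns (pi_low, (mu_low, sd_low), (mu_high, sd_high)); component 1
    is initialized on the lower half of the sorted scores.
    """
    x = sorted(scores)
    n = len(x)
    half = n // 2
    mu1 = sum(x[:half]) / half
    mu2 = sum(x[half:]) / (n - half)
    s1 = s2 = max((max(x) - min(x)) / 4.0, 1e-3)
    pi1 = 0.5
    for _ in range(iters):
        # E-step: responsibility of the low component for each score
        r = []
        for xi in x:
            p1 = pi1 * norm_pdf(xi, mu1, s1)
            p2 = (1.0 - pi1) * norm_pdf(xi, mu2, s2)
            r.append(p1 / (p1 + p2))
        # M-step: update mixing proportion, means, and SDs
        n1 = sum(r)
        pi1 = n1 / n
        mu1 = sum(ri * xi for ri, xi in zip(r, x)) / n1
        mu2 = sum((1 - ri) * xi for ri, xi in zip(r, x)) / (n - n1)
        s1 = max(math.sqrt(sum(ri * (xi - mu1) ** 2 for ri, xi in zip(r, x)) / n1), 1e-6)
        s2 = max(math.sqrt(sum((1 - ri) * (xi - mu2) ** 2 for ri, xi in zip(r, x)) / (n - n1)), 1e-6)
    return pi1, (mu1, s1), (mu2, s2)

def accuracy_maximizing_cutoff(pi_bot, bot, human, grid=1000):
    """Grid-search the cutoff c maximizing expected accuracy:
    pi * P(bot < c) + (1 - pi) * P(human >= c),
    flagging a respondent as a bot when the NRI falls below c.
    """
    lo = min(bot[0], human[0]) - 3 * max(bot[1], human[1])
    hi = max(bot[0], human[0]) + 3 * max(bot[1], human[1])
    best_c, best_acc = lo, -1.0
    for i in range(grid + 1):
        c = lo + (hi - lo) * i / grid
        acc = pi_bot * norm_cdf(c, *bot) + (1 - pi_bot) * (1 - norm_cdf(c, *human))
        if acc > best_acc:
            best_c, best_acc = c, acc
    return best_c, best_acc

# Illustrative simulation (parameters are assumptions, not from the paper):
# humans' NRI ~ N(0.5, 0.1), bots' NRI ~ N(0.0, 0.1), 30% contamination.
random.seed(1)
scores = [random.gauss(0.0, 0.1) for _ in range(180)] + \
         [random.gauss(0.5, 0.1) for _ in range(420)]
pi_bot, bot, human = fit_two_component_gmm(scores)
cutoff, acc = accuracy_maximizing_cutoff(pi_bot, bot, human)
```

Note the contrast with a fixed high-specificity cutoff: here both the estimated contamination rate `pi_bot` and the fitted component densities feed into the cutoff choice, so the threshold adapts to how heavily the target sample is contaminated.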