Bibliographic Record - Detail View

 
Authors: Kautz, Tim; Schochet, Peter Z.; Tilley, Charles
Institution: National Center for Education Evaluation and Regional Assistance (ED); Decision Information Resources, Inc.
Title: Comparing Impact Findings from Design-Based and Model-Based Methods: An Empirical Investigation. NCEE 2017-4026
Source: (2017), (192 pages)
Availability: PDF as full text (1); PDF as full text, free file (2)
Language: English
Document type: print; online; monograph
Keywords: Quantitative Data; Design; Randomized Controlled Trials; Quasiexperimental Design; Research Methodology; Educational Research; Intervention; Measures (Individuals); Equations (Mathematics); Computation; Program Evaluation; Program Effectiveness; Evaluation Methods; Experiments; Models; Regression (Statistics); Hierarchical Linear Modeling; Least Squares Statistics
Abstract: A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs to connect statistical methods to the building blocks of causal inference. They differ from model-based methods that have commonly been used in education research, including hierarchical linear model (HLM) methods and robust cluster standard error (RCSE) methods for clustered designs. In comparison to model-based methods, the design-based methods tend to make fewer assumptions about the nature of the data and also more explicitly account for known information about the experimental and sampling designs. While these theoretical differences suggest the corresponding estimates might differ, it is unclear how much of a practical difference it makes to use design-based methods versus more conventional model-based methods. This study addresses this question by re-analyzing nine past RCTs in the education area using both design- and model-based methods. The study uses real data, rather than simulated data, to better explore the differences that would arise in practice. In order to investigate the full scope of differences between the methods, the study uses data generated from different types of randomization designs commonly used in social policy research: (1) non-clustered designs in which individuals are randomized; (2) clustered designs in which groups are randomized; (3) non-blocked designs in which randomization is conducted for a single population; and (4) blocked (stratified) designs in which randomization is conducted separately within partitions of the sample.
The study conducts the design-based analyses using "RCT-YES," a free software package funded by the Institute of Education Sciences (IES) that applies design-based methods to a wide range of RCT designs (www.rct-yes.com). This report focuses on two analyses that compare model- and design-based methods, both of which suggest there is little substantive difference in the results between the two methods. For both analyses, the study uses a reference model-based method that is similar to the one used in the original evaluation. In the first analysis, the study compares the reference model-based method to a design-based method with underlying assumptions that most closely align with those of the reference model-based method. In the second analysis, the report presents a sensitivity check that compares the reference model-based method to an alternative design-based method. In particular, the alternative method is based on the default settings in the "RCT-YES" software, which correspond to an alternative set of plausible assumptions. The findings from both analyses suggest that model- and design-based methods yield very similar results in terms of the magnitude of impact estimates, statistical significance of the impact estimates, and implications for policy. To contextualize the differences in impact estimates between design- and model-based methods, the report also presents a third analysis, which compares estimates from two commonly used model-based methods: (1) HLM methods; and (2) linear models with ordinary least squares (OLS) assumptions and RCSE to account for clustering. Importantly, this analysis suggests that the differences between the design- and model-based methods (with similar assumptions) are no greater than the differences that would arise between commonly used, model-based methods.
The study suggests that researchers should select estimators with assumptions that best suit the goals of their study regardless of whether they use a design- or model-based approach. Moreover, researchers should consider the trade-offs between different assumptions, and how these assumptions affect the interpretation of findings. Appended are: (1) Hierarchical linear model methods; and (2) Detailed description of studies and results. [For related reports see: "What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025" (ED575014) and "Multi-Armed RCTs: A Design-Based Framework. NCEE 2017-4027" (ED575022).] (ERIC).
Notes: National Center for Education Evaluation and Regional Assistance. Available from: ED Pubs. P.O. Box 1398, Jessup, MD 20794-1398. Tel: 877-433-7827; Web site: http://ies.ed.gov/ncee/
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Updated: 2020/01/01
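For the simplest design the abstract mentions (non-clustered, non-blocked), the design-based approach reduces to a difference in treatment and control means with the conservative Neyman variance. A minimal sketch of that calculation, using hypothetical illustrative data (not data from the report):

```python
import statistics as st

def neyman_impact(y_treat, y_ctrl):
    """Design-based (Neyman) impact estimate for a non-clustered,
    non-blocked RCT: difference in group means, with the conservative
    variance estimate s_t^2 / n_t + s_c^2 / n_c."""
    impact = st.mean(y_treat) - st.mean(y_ctrl)
    var = (st.variance(y_treat) / len(y_treat)
           + st.variance(y_ctrl) / len(y_ctrl))
    return impact, var ** 0.5  # impact estimate and its standard error

# Hypothetical outcomes for illustration only:
est, se = neyman_impact([12.0, 15.0, 11.0, 14.0], [10.0, 9.0, 12.0, 11.0])
print(est, se)  # 2.5 and sqrt(1.25)
```

The model-based analogue in the report would instead fit a regression (e.g., HLM or OLS with cluster-robust standard errors) of the outcome on a treatment indicator; the study's point is that the two routes produce very similar answers in practice.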