Bibliographic Record - Detail View

Author: Mearman, Kimberly A.
Title: A Statistical Estimate of the Validity and Reliability of a Rubric Developed by Connecticut's State Education Resource Center to Evaluate the Quality of Individualized Education Programs for Students with Disabilities
Source: Ph.D. Dissertation, Andrews University, 2013 (174 pages); available online as PDF full text
Language: English
Document type: print; online; monograph
ISBN: 978-1-3036-4372-9
Keywords: Academic Thesis; Dissertation; Construct Validity; Reliability; Scoring Rubrics; Individualized Education Programs; Disabilities; Program Evaluation; Special Education; Scores; Interrater Reliability; Evaluation Methods; State Departments of Education; Data Analysis; Connecticut
Abstract: Because of the critical function of the IEP in the planning and implementation of effective instruction for students with disabilities, educators need a reference to determine the standards of a quality IEP and a process by which to compare an IEP to those standards. A rubric can support educators in examining the quality of IEPs. This study used data previously collected by Connecticut's State Education Resource Center (SERC) as part of a training of scorers using SERC's IEP Rubric. This scoring process was part of a program-evaluation process for a grant coordinated by SERC. SERC collected IEPs from schools participating in a technical assistance grant to conduct a pre-post program evaluation of changes in practices regarding IEP development. All the IEPs collected by SERC were samples of IEPs written by educators in the participating schools, and all identifiable information was removed from the IEPs.

In the program evaluation conducted by SERC, SERC consultants served as the scorers. The scorers were all educators, representing general education, special education, and related-service providers. SERC's program-evaluation process was conducted for each round of participating schools; IEPs were scored in the first year and the final year of a school's participation in the technical assistance. To ensure the fidelity of the scoring process, SERC trained the scorers each time before they scored the IEPs from participating schools, even if they had been through the training previously. As a measure of calibration, scorers were asked to complete several tasks that assessed their understanding of both the content and the use of the rubric: two rounds of independent scoring of two different IEPs, a process for jurying scores with a partner, an assessment of the key concepts within the indicators, and scoring with another IEP rubric developed by Hunt, Goetz, and Anderson. Data collected from these tasks were used to assess the estimated reliability of the scorers.

These same data were used in this study to estimate the inter-rater reliability and validity of SERC's IEP rubric. The scores from SERC's IEP rubric were compared to estimate the inter-rater reliability among scorers. The scores collected from the rubric developed by Hunt et al. (1986) were used to estimate construct validity by examining how the two rubrics compare across similar content. This study also used the assessment of the key concepts within the indicators to estimate content and expert-judge validity using a Table of Specifications (ToS). This method provided both quantitative and qualitative feedback on the content of SERC's IEP rubric, based on the experience and knowledge of the scorers in their specific areas of expertise.

The results indicated that SERC's IEP Rubric met the standard for acceptable estimated inter-rater reliability, using intraclass correlations (ICC) to analyze the scores across all scorers. There were no significant differences in estimated inter-rater reliability between scorers with experience writing IEPs and scorers without such experience, nor between independent scoring and juried scoring. The results for concurrent validity indicated a moderate positive relationship between the SERC IEP rubric and the rubric developed by Hunt, Goetz, and Anderson: an IEP that scored high on the SERC IEP rubric would also be expected to score high on the Hunt et al. (1986) rubric. The results for estimated content and expert-judge validity indicated that the indicators of SERC's IEP rubric had Lawshe's Content Validity Ratio (CVR) scores ranging from 0.57 to 1, above Lawshe's minimum acceptable CVR of 0.42 for 20 expert judges. The major qualitative themes from the open responses in the ToS method concerned minor adjustments to the rubric's content and to the training for scorers.

The purpose of this study was to estimate the validity and reliability of a rubric developed by Connecticut's State Education Resource Center (SERC), designed to assess the quality of individualized education programs (IEPs) for students with disabilities. The study benefited SERC's use of its IEP rubric as an instrument in a program evaluation measuring the impact of its training and technical assistance to schools. The results provided SERC with data demonstrating an acceptable level of estimated inter-rater reliability and of concurrent, content, and expert-judge validity, and the analysis of the data, including the qualitative data, supported SERC in its efforts to train scorers. Replicating aspects of this study with a broader sample of educators currently working in school systems and of content experts would strengthen the generalization of these results to use in school districts. There are implications for the use of this rubric to support educators, families, and policies in the development of high-quality IEPs. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone (800-521-0600) or on the Web: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.] (As Provided)
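For readers unfamiliar with the statistic, the inter-rater analysis named in the abstract (ICC across all scorers) can be illustrated on toy data. The sketch below is not the study's analysis or data; it uses made-up ratings and assumes the third-party pingouin library for the ICC computation.

import pandas as pd
import pingouin as pg  # third-party: pip install pingouin

# Illustrative ratings only: four IEPs scored by three raters,
# not data from the study.
ratings = pd.DataFrame({
    "iep":    [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "scorer": ["a", "b", "c"] * 4,
    "score":  [3, 3, 4, 2, 2, 2, 4, 5, 4, 1, 2, 1],
})

# Intraclass correlations; pingouin reports the standard ICC
# variants (ICC1, ICC2, ICC3 and their averaged-rater forms).
icc = pg.intraclass_corr(data=ratings, targets="iep",
                         raters="scorer", ratings="score")
print(icc[["Type", "Description", "ICC"]])

Which ICC variant applies depends on the rating design (for example, whether the same scorers rate every IEP); the abstract does not specify the variant used.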
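Lawshe's Content Validity Ratio, cited in the abstract with its 0.42 threshold for 20 expert judges, has a simple closed form: CVR = (n_e - N/2) / (N/2), where n_e is the number of judges rating an indicator essential and N is the panel size. A minimal sketch follows; the judge counts are hypothetical, not taken from the study.

def content_validity_ratio(n_essential: int, n_judges: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2); ranges from -1 to +1."""
    half = n_judges / 2
    return (n_essential - half) / half

# Hypothetical example: 15 of 20 judges rate an indicator "essential".
# Lawshe's table puts the minimum acceptable CVR for a 20-judge
# panel at 0.42, the threshold the abstract cites.
cvr = content_validity_ratio(15, 20)
print(cvr)          # 0.5
print(cvr >= 0.42)  # True -> indicator retained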
Notes: ProQuest LLC, 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Recorded by: ERIC (Education Resources Information Center), Washington, DC
Update: 2020/01/01