
Using the Instructional Quality Assessment toolkit to investigate the quality of reading comprehension assignments and student work (CSE Report 669)

Matsumura, Lindsay Clare and Slater, Sharon Cadman and Junker, Brian and Peterson, Maureen and Crosson, Amy and Resnick, Lauren B (2006) Using the Instructional Quality Assessment toolkit to investigate the quality of reading comprehension assignments and student work (CSE Report 669). Technical Report. National Center for Research on Evaluation, Standards, and Student Testing (CRESST), University of California, Graduate School of Education & Information Studies.

Published Version (PDF, 1MB). Available under license: see the attached license file.

Abstract

This study presents preliminary findings from research developing an instructional quality assessment (IQA) toolkit that could be used to monitor the influence of reform initiatives on students' learning environments and to guide professional development efforts within a school or district. This report focuses specifically on the portion of the IQA used to evaluate the quality of teachers' reading comprehension assignments and student work. Results are limited by the very small sample of participating teachers (N = 13, 52 assignments). They indicate a poor to moderate level of inter-rater agreement and a good degree of consistency for the dimensions measuring academic rigor, but not for the clarity of teachers' expectations. The rigor of the assignments collected from teachers was also associated with the rigor of observed instruction. Collecting four assignments (two challenging and two recent) from teachers did not yield a stable estimate of quality. Additional analyses looking separately at the two different assignment types indicate, however, that focusing on one assignment type would yield a stable estimate of quality. This suggests that the way in which assignments are collected from teachers should be revised. Implications for professional development are also discussed. The 2003 Draft Observation and Assignment Rubrics for Reading Comprehension is appended. (Contains 6 tables, 4 figures, and 4 footnotes.)


Details

Item Type: Monograph (Technical Report)
Status: Published
Creators/Authors:
    Matsumura, Lindsay Clare (lclare@pitt.edu; Pitt username: LCLARE)
    Slater, Sharon Cadman
    Junker, Brian
    Peterson, Maureen
    Crosson, Amy
    Resnick, Lauren B. (resnick@pitt.edu; Pitt username: RESNICK)
Monograph Type: Technical Report
Date: 2006
Date Type: Publication
Access Restriction: No restriction; released for worldwide access immediately.
Publisher: National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Place of Publication: University of California, Graduate School of Education & Information Studies
Institution: National Center for Research on Evaluation, Standards, and Student Testing
Schools and Programs: School of Education > Learning Sciences and Policy
Refereed: Yes
Official URL: http://eric.ed.gov/?q=+Using+the+Instructional+Qua...
Funders: Institute of Education Sciences
Other ID: ED492897, CSE Report 669
Additional Information: Copyright holders: The Regents of the University of California
Date Deposited: 09 Oct 2015 17:30
Last Modified: 31 Jul 2020 19:10
URI: http://d-scholarship.pitt.edu/id/eprint/26207
