Rahimi, Z., Litman, D. J., Correnti, R., Matsumura, L. C., Wang, E., & Kisa, Z. (2014). Automatic scoring of an analytical response-to-text assessment. In: UNSPECIFIED.
Abstract
In analytical writing in response to text, students read a complex text and adopt an analytic stance in their writing about it. To evaluate this type of writing at scale, an automated approach for Response to Text Assessment (RTA) is needed. With the long-term goal of producing informative feedback for students and teachers, we design a new set of interpretable features that operationalize the Evidence rubric of RTA. When evaluated on a corpus of essays written by students in grades 4-6, our results show that our features outperform baselines based on well-performing features from other types of essay assessments. © 2014 Springer International Publishing Switzerland.