2019
DOI: 10.1002/rrq.281

Automated Scoring of Students’ Use of Text Evidence in Writing

Abstract: Despite the importance of analytic text‐based writing, relatively little is known about how to teach this important skill. A persistent barrier to conducting research that would provide insight on best practices for teaching this form of writing is a lack of outcome measures that assess students’ analytic text‐based writing development and that are feasible to implement at scale. Automated essay‐scoring (AES) technologies offer one potential approach to increasing the feasibility of research in this area, p…

Cited by 16 publications (5 citation statements)
References 74 publications
“…The connection between educational philosophy and practices in the use of technology is of vital importance (Andrei, 2017). Although AES systems are able to score essays similarly to human raters (Correnti et al., 2020; Wilson, 2018), they are generally accepted as scoring tools that can complement, but not replace, expert human raters, in both high-stakes and low-stakes assessments.…”
Section: Discussion
confidence: 99%
“…Although the scores generated by AES systems are routinely close to those of human raters in writing assessments (Correnti et al., 2020; Rudner et al., 2006; Warschauer & Grimes, 2008), AES systems are often touted as instruments that supplement, rather than replace, expert human raters. For instance, the GMAT Analytic Writing Assessment uses an AES engine in parallel with one human rater to generate two scores for each essay.…”
Section: Introduction
confidence: 99%
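As a rough illustration of the parallel machine/human scoring arrangement described in the statement above, the following Python sketch combines an AES score with one human rater's score and routes large disagreements to an additional human rater. The score scale, the adjudication threshold, and the function combine_scores are hypothetical assumptions, not drawn from the GMAT's actual procedure.

    # Illustrative only: a toy scheme for pairing an automated score with one
    # human rater's score, as in dual machine/human scoring arrangements.
    def combine_scores(aes_score: float, human_score: float,
                       adjudication_threshold: float = 1.0) -> dict:
        """Average the two scores; flag the essay for adjudication by an
        additional human rater when the scores differ by more than the threshold."""
        disagreement = abs(aes_score - human_score)
        needs_adjudication = disagreement > adjudication_threshold
        final_score = None if needs_adjudication else (aes_score + human_score) / 2
        return {
            "final_score": final_score,
            "needs_adjudication": needs_adjudication,
            "disagreement": disagreement,
        }

    if __name__ == "__main__":
        print(combine_scores(4.5, 5.0))  # close agreement -> averaged score
        print(combine_scores(3.0, 5.5))  # large gap -> route to another rater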
“…Therefore, there are new opportunities to process and analyze very large text collections. Thus, tests measuring the level of texts produced by Basque-language learners could be administered and scored using language-processing tools (Correnti et al., 2019; Maamuujav et al., 2021).…”
Section: Helduen Euskalduntzearen Aldizkaria
unclassified
“…An important research area aimed at measuring the individual competence of students by using CL methods is that of writing assessment. In recent years, the application of CL methods has been particularly evident in the field of automated essay scoring, in which researchers develop software tools that automatically analyze features in texts and, on that basis, assign those texts a score (Correnti et al., 2020; Shermis, 2014). The theoretical underpinnings of many of these studies are often located in probabilistic inference or, more broadly, in the field of psycholinguistics (Bod, 2009; Jurafsky, 2002).…”
Section: The Student
confidence: 99%
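To make the feature-analysis approach described in the statement above concrete, here is a minimal Python sketch of feature-based essay scoring: a few hand-crafted surface features (length, type-token ratio, counts of quotation marks and evidence markers) are fed to a linear regression trained on human-assigned scores. The specific features, the toy training data, and the use of scikit-learn are illustrative assumptions, not the method of any cited study.

    # A minimal, illustrative sketch of feature-based automated essay scoring:
    # simple surface features plus a regression model fit to human scores.
    import re
    from sklearn.linear_model import LinearRegression

    EVIDENCE_MARKERS = ("according to", "the author states", "for example")

    def extract_features(essay: str) -> list:
        """Compute a handful of surface features for one essay."""
        tokens = re.findall(r"\w+", essay.lower())
        n_tokens = len(tokens)
        type_token_ratio = len(set(tokens)) / n_tokens if n_tokens else 0.0
        n_quotes = essay.count('"') // 2          # rough count of quoted spans
        n_markers = sum(essay.lower().count(m) for m in EVIDENCE_MARKERS)
        return [float(n_tokens), type_token_ratio, float(n_quotes), float(n_markers)]

    # Toy training data: (essay, human score) pairs. A real system would use
    # hundreds of human-scored essays and far richer linguistic features.
    training = [
        ("According to the author, the evidence in paragraph two supports the claim.", 4.0),
        ("The author states that the trend is clear, for example in the final table.", 3.5),
        ("I liked the story a lot.", 1.0),
    ]
    X = [extract_features(text) for text, _ in training]
    y = [score for _, score in training]

    model = LinearRegression().fit(X, y)

    new_essay = 'The author states that "evidence matters" and, for example, cites two studies.'
    print(round(model.predict([extract_features(new_essay)])[0], 2))

A production AES engine would rely on much richer linguistic features or learned text representations, but the pipeline shape (feature extraction, then a model trained against human scores) is the same idea the statement describes.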