2016
DOI: 10.5539/ijel.v6n5p54
The Effect of Using Automated Essay Evaluation on ESL Undergraduate Students’ Writing Skill

Abstract: Advances in Natural Language Processing (NLP) have yielded significant progress in the language assessment field. The Automated Essay Evaluation (AEE) mechanism relies on basic research in computational linguistics focused on transforming human language into algorithmic forms. The Criterion® system is an instance of AEE software providing both formative feedback and an automated holistic score. This paper aims to investigate the impact of this newly-developed AEE software in a current ESL setting by measuring…

Cited by 23 publications (15 citation statements)
References 23 publications
“…This, in turn, could allow educators to consider a broader range of assessment methods than only using multiple-choice tests to assess students' knowledge and abilities. In general, it appears that AI-powered essay ratings are comparable to human ratings, notwithstanding some areas of concern (Aluthman, 2016).…”
Section: Teacher-facing AI Applications
confidence: 85%
“…The benefits of using algorithms that find patterns in text responses, however, have been found to encourage more revisions by students (Ma & Slater, 2015) and to support moving away from merely measuring student knowledge and abilities with multiple-choice tests (Nehm, Ha, & Mayfield, 2012). Issues persist, however, in the quality of feedback provided by AES (Dikli, 2010), with Barker (2011) finding that the more detailed the feedback provided, the more likely students were to question their grades, and a question was raised over the benefits of this feedback for beginning language students (Aluthman, 2016).…”
Section: Assessment and Evaluation
confidence: 99%
“…A study conducted in three EFL classrooms found that respondents in all three classes perceived the use of the AEE system unfavorably (Chen and Cheng, 2008). Others were concerned that AEE could be easily fooled into assigning high scores to essays which were long, syntactically complex and replete with sophisticated vocabulary (Wilson and Czik, 2016). Therefore, further research was required to test the efficiency of these systems and to determine which areas in the writing construct could be effectively improved via AEE systems (Aluthman, 2016). In addition, integrating technological advances into intelligent training models to help learners build conceptualized language ability was also a grand challenge for the future development of AEE tools (Winke and Isbell, 2017).…”
Section: Introduction 1.1 Literature Review
confidence: 99%