2015
DOI: 10.1016/j.procs.2015.12.120

Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator

Cited by 9 publications (15 citation statements). References 20 publications.
“…In conclusion, our results demonstrate the efficacy of EDM, even with a relatively small dataset. We believe that the simple workflow proposed in this paper could easily be replicated by educators via referencing the information and tools provided by the repository of ProgEdu 1 . We claim that the information collected from assignment submissions is valuable for multiple educational purposes, for instance, to improve the assignment design for the next versions of the course, and to be featured for learning analytics related to students' behavior.…”
Section: Discussion (mentioning; confidence: 99%)
“…APASs are widely accepted by both students and institutions via their fruitful benefits. First, they reduce the educators' workload in code evaluation as well as provide accurate grading and immediate feedback to learners [1].…”
Section: Introduction (mentioning; confidence: 99%)
“…Indeed, a collaboration with Artificial Intelligence (AI) discipline to develop a tool to intelligently design assignment requirements is needed—of course with the instructor's expertise. Creating sufficient set of test cases that can verify all special cases: The student's code will pass through a set of test cases to ensure that the code's functionalities completely cover the requirements. These tests can be created automatically or manually. Thus, it is important to design the test cases in parallel with the assignment requirements to ensure the coverage.…”
Section: Discussion and Recommendations (mentioning; confidence: 99%)
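
The statement above describes the core mechanism of test-based automated assessment: the student's submission is executed against a set of test cases designed in parallel with the assignment requirements so that special cases are covered. As a minimal sketch only, not code from the cited papers or from FaSt-generator, the Java harness below illustrates the idea; the gcd method, the Case record, and the test values are all hypothetical.

// Minimal illustration (not from the cited papers): running a student's
// submission against teacher-defined test cases, including special cases.
import java.util.List;

public class TestCaseHarness {

    // Stand-in for a student's submitted solution (hypothetical).
    static int gcd(int a, int b) {
        return b == 0 ? Math.abs(a) : gcd(b, a % b);
    }

    // One test case: an input pair, the expected output, and the
    // requirement it verifies, written alongside the assignment spec.
    record Case(int a, int b, int expected, String requirement) {}

    public static void main(String[] args) {
        List<Case> cases = List.of(
            new Case(12, 18, 6, "typical input"),
            new Case(7,  13, 1, "coprime inputs"),
            new Case(0,  5,  5, "special case: zero operand"),
            new Case(-8, 12, 4, "special case: negative operand")
        );

        int passed = 0;
        for (Case c : cases) {
            int actual = gcd(c.a(), c.b());
            boolean ok = actual == c.expected();
            if (ok) passed++;
            System.out.printf("[%s] %s: expected %d, got %d%n",
                    ok ? "PASS" : "FAIL", c.requirement(), c.expected(), actual);
        }
        System.out.printf("Score: %d/%d test cases passed%n", passed, cases.size());
    }
}

Keeping each test case paired with the requirement it checks is one way to make the coverage argument from the quoted passage concrete: a requirement with no matching case is immediately visible.
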
“…Tracking errors manually for every program is difficult and time‐consuming [13,17]. Moreover, manual assessment of student coding is rather heavy on instructors and may eventually conclude the reporting of feedback as well as lead to discrepancies [4,41] from more than one instructor in the assessment. Therefore, due to the rapid increase in the number of students in programming courses and to lower the burden of educators in manual assessment, the AA in programming courses has attracted the attention of many researchers as well as academicians.…”
Section: Introduction (mentioning; confidence: 99%)
“…FaSt-generator [21] is based on an automated programming system for the Java language. This system allows teachers to provide a set of test data and students to develop their programming skills through feedback and by learning from their own unexpected mistakes.…”
Section: Sac [14] (mentioning; confidence: 99%)
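
FaSt-generator itself is not shown in this report, so the following is only a rough sketch under assumptions: a teacher supplies test data as input/expected pairs, the student's Java method is run on each input, and every mismatch is reported back as feedback the student can learn from. The isPalindrome method, the data format, and the messages are hypothetical, not the tool's actual interface.

// Rough sketch (not the actual FaSt-generator implementation):
// teacher-provided test data drives the check of a student's method,
// and each failing case becomes a feedback message for the student.
import java.util.List;

public class FeedbackRunner {

    // Stand-in for a student's submission (hypothetical); note it does
    // not handle spaces, so one teacher case below fails on purpose.
    static boolean isPalindrome(String s) {
        String t = s.toLowerCase();
        return new StringBuilder(t).reverse().toString().equals(t);
    }

    public static void main(String[] args) {
        // Teacher-authored test data: "input=expectedResult" pairs.
        List<String> testData = List.of(
                "level=true",
                "Java=false",
                "Noon=true",
                "=true",            // edge case: empty string
                "Race car=true"     // edge case the student's code misses
        );

        for (String entry : testData) {
            String[] parts = entry.split("=", 2);
            String input = parts[0];
            boolean expected = Boolean.parseBoolean(parts[1]);
            boolean actual = isPalindrome(input);

            if (actual == expected) {
                System.out.println("PASS  input=\"" + input + "\"");
            } else {
                // Feedback the student sees instead of a bare fail mark.
                System.out.println("FAIL  input=\"" + input + "\": expected "
                        + expected + " but your program returned " + actual);
            }
        }
    }
}

In this sketch the space-containing case fails deliberately; that failure message is the point at which, as the quoted statement puts it, the student learns from an unexpected mistake.
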