Evaluating report-based assignments, especially in larger classes, adds a considerable marking load. Even with detailed rubrics, subjectivity may lead to grading variations and inaccuracies. Evaluating the work of others can also be an informative and educational experience for students, improving their skills through exposure to a broader performance range. Involving students in peer evaluation can potentially address both of these issues: it reduces the marking load, provides alternate (and more numerous) assessments, and exposes students to a broader spectrum of report-writing skills, thus enhancing their own knowledge. This paper discusses the results of an experiment in peer assessment and whether it can be exploited to reduce marking effort, improve accuracy in report assignment evaluation, and improve student skill. The data were gathered from assignments in two different engineering classes: a second-year course on safety and environmental stewardship, and a senior course on engineering economics. For the second-year course, an individual essay assignment was marked by the instructor and two peers; the three evaluations were analyzed to assess accuracy and assign a grade. For the senior course, a group report on a case study was self- and peer-evaluated. These evaluations were used to derive a grade for the report directly if the self and peer results were within a prescribed tolerance; other cases were resolved by instructor intervention. The results were analyzed considering the number of outliers, the range of scores, and the number of cases that had to be resolved by the instructor. Parameters considered in assessing the experiment included the correlation between assessments, the learning opportunities for students, and the instructor marking effort required. Preliminary analysis suggests positive gains in reducing effort; improved accuracy and enhanced student learning are also expected.
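The tolerance-based resolution described above can be sketched as a simple rule: accept an averaged grade when the self and peer scores agree closely enough, and otherwise flag the case for the instructor. The abstract does not state the prescribed tolerance or the combining rule, so the 10-point band and equal-weight average below are illustrative assumptions only.

```python
def resolve_grade(self_score, peer_scores, tolerance=10.0):
    """Combine self and peer scores if they agree within a tolerance.

    Returns (grade, needs_instructor). The tolerance value and the
    equal-weight averaging are assumptions for illustration; the paper
    does not specify the prescribed values.
    """
    scores = [self_score] + list(peer_scores)
    if max(scores) - min(scores) <= tolerance:
        return sum(scores) / len(scores), False
    # Spread too large: resolve by instructor intervention.
    return None, True
```

For example, `resolve_grade(80, [85, 78])` yields an averaged grade, while `resolve_grade(60, [85])` is flagged for the instructor.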
This paper discusses the results of two experiments in self-assessment and their value in evaluating students' consciousness of their own competence, as well as the opportunity to improve self-awareness and competence in students. The data were gathered from two different engineering courses. The first experiment was conducted in a second-year course on basic electronics and electrical power. As part of the final examination, students were asked to assess their confidence in their answer to each question. The student self-assessment was compared to the actual result in an effort to determine the student's perception of their competence, and each assessment was coded with respect to consciousness and competence. The second experiment was performed on a midterm examination in engineering ethics and professionalism, a senior course discussing the impact and interaction of the engineering profession on society. Students were given an annotated exemplar and a marking rubric and asked to grade their own midterm submissions. The student assessments were compared to the instructor assessment, and again the results were coded with respect to consciousness and competence. The results showed a contrast between the second-year and senior courses. In the second-year course, 50.3% of students were coded as consciously competent or consciously incompetent; in the senior course, 80% of students were coded as consciously competent. The comparison of the two results suggests that senior students, given suitable instruction, are more aware of their competence than junior students, indicating that current methods do develop an improved awareness of competence, although other factors may be relevant. It is suggested that student awareness be formally monitored, and the results used to modify pedagogy to improve and accelerate the development of this consciousness in graduates.
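The coding scheme described above pairs a student's self-assessed confidence with the actual correctness of the answer. A minimal sketch of one plausible mapping into the four consciousness/competence quadrants follows; treating confidence as a simple boolean, and the exact quadrant labels, are simplifying assumptions, since the paper's coding scheme may be finer-grained.

```python
def code_awareness(confident, correct):
    """Classify one response into a competence/consciousness quadrant.

    A confident, correct answer indicates the student knows what they
    know; a confident, wrong answer indicates they are unaware of a
    gap. This boolean mapping is an illustrative assumption.
    """
    if confident and correct:
        return "consciously competent"
    if confident and not correct:
        return "unconsciously incompetent"
    if not confident and correct:
        return "unconsciously competent"
    return "consciously incompetent"
```

Aggregating these labels over all exam questions would produce per-student proportions like the 50.3% and 80% figures reported above.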
Providing summative feedback to students in a timely fashion, and managing the associated marking in larger classes, has been a perpetual challenge in an educational environment, and is even more so in a resource-challenged environment. This paper discusses the results of an experiment in evaluation in an engineering course implementing a modified evaluation and grading approach. The objectives were to i) provide timely feedback to students, ii) improve engagement and reduce overall course loading for students, and iii) reduce marking effort for instructors, all without negatively affecting student grade performance. The results show that improvements over traditional methods can be made in two of the three areas.

The course in question, (redacted), covers basic electrical concepts and devices for non-electrical engineering students. The course had been offered in four previous years using a traditional evaluative approach: weekly assignments (submitted, marked and returned), laboratory exercises (comprehensive reporting or exercises submitted, marked and returned), midterm(s) (graded and returned), and a final examination. The modified approach was implemented over the past two years and included the same learning strategies, but with a potentially lower resource commitment for students and instructors. Modifications to the strategy were implemented in the first and second years. The experiment introduced procedural and administrative modifications in assignments, laboratories and examinations, and added short weekly quizzes to improve engagement in an active learning environment.

Approximately ten assignments were offered to help students test and improve their understanding and knowledge. In the first year, assignments and solutions were posted simultaneously; no submission was required and no grade contribution was offered. The rationale for this strategy was that students would receive virtually instant feedback by having solutions immediately available, along with the freedom to judge the quantity and level of completion required to meet their individual learning needs. In the second year, assignments were administered through an online assignment system for mark credit. This was intended to reinstate the incentive of mark credit to improve student engagement while still providing instantaneous feedback on correctness.

The course has always included a critical "hands-on" laboratory component which was traditionally time-intensive for both students and instructors. While laboratory submissions were still required for mark credit, the reporting requirement was reduced to a minimal, specified sampling of results to provide evidence that the practical work was addressed. Expected outcomes were again provided to give students relevant and timely feedback. In the second year, a 3-bin grading system was adopted to improve the granularity of the marks while still requiring considerably less marking effort.

Examinations were also modified to improve the timeliness of feedback and reduce marking effort. In the first year, three "midterm" examinations were distributed through the term to monitor student learning and verify student participation in the self-directed parts of the course. Each of these exams consisted of 12 questions and was graded simply on a correct response (no "partial marks") to reduce marking effort. In the second year, two midterm examinations were deemed sufficient, but were graded using a 3-bin approach, thus allowing for "partial marks". Exams were returned to students in the next lecture period in both cases. The final examination in the first year was designed using a 3-bin scheme to allow for partial marks while still reducing marking effort. In the second year, this was increased to a 4-bin scheme to improve granularity. The increase in granularity had very little effect on marking effort for both the midterm and final.

One additional modification was made in the second year with the addition of brief weekly quizzes, for mark credit, to encourage students to complete assigned pre-reading exercises and keep up with course work. The quizzes consisted of two brief questions: one on the assigned reading for the coming week and one on the previous week's material. These quizzes were administered and graded using a classroom response system and automatically integrated with the learning management system.

Analysis consisted of comparison of grades with previous years, anecdotal evidence and observations on student effort, course evaluation data, and survey results. Preliminary results indicate that student load and instructor marking effort were significantly reduced, while grade results remained approximately the same. A direct objective comparison with previous years is not significant due to variations in course content and cohort.
As members of a learned profession, engineers are often required to assess and critique the work of others. Preparation for this professional responsibility should be developed during their academic training, alongside other required skills. This author proposes that there are generic skills and training methodologies that can be applied to both technical and "soft skill" situations to prepare students for this task. This paper discusses the results of a peer assessment exercise applied to a "soft-skills" situation.

The main objectives of this experiment were to i) develop peer assessment skills in students, ii) maintain or improve the accuracy of assessments for subjective material, iii) improve students' skills in the subject area, and iv) potentially reduce marking effort for instructors.

The experiment described in this paper involved peer assessment of a short report (3–5 pages) required as a term assignment in a senior course on ethics and professionalism. The reports were prepared and submitted by groups of two students. Each student was then randomly assigned two other reports to assess in a double-blind fashion, with the constraint that no student reviewer received their own report. For reference and analysis, each report was also assessed by both the instructor and a teaching assistant, resulting in approximately six separate assessments per report. The results were used to determine a grade for the assignment. The original assignment rubric was used for all assessments. In addition, formative feedback was provided by the reviewers and returned to the authors.

The quality of the numerical results was analyzed by comparing the marks determined by the student assessors to the reference (instructor, TA) assessments. An average difference of 8.5% was observed and was considered generally acceptable given the subjective nature of the material. Student "generosity bias" was also considered, but was found to be virtually non-existent, with a difference between student and reference averages of less than 0.2%. "Outliers" were anticipated, and student assessments showed approximately twice the standard deviation of the reference marks. A weighted average was used to determine the assignment mark, and any marks outside a 20.0% band were de-weighted. Approximately 25% of cases were weight-adjusted, resulting in a maximum mark adjustment of 4.1% and an average adjustment of only 1.6%.

Feedback was solicited from students prior to the peer review period and at the end of term. Informal feedback solicited prior to the review period, regarding instructions and logistics, was used to refine the setup for the peer review phase. Questions on the value of both the exercise and the feedback provided were included in an end-of-term survey of students about the course, with 83% finding the exercise "a bit" or "quite" educational and 74% finding the peer feedback "a bit" or "quite" helpful.

Involving students in this peer evaluation exercise had generally positive outcomes and provided experience from which to improve future implementations of peer assessment to achieve the objectives of this experiment. Recommendations regarding future application include: the importance of instructions and setup, student training and rehearsal, and mark determination considerations.
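The de-weighting scheme described above can be sketched as follows. The 20.0% band matches the abstract, but the choice of the simple mean as the reference point and the reduced weight applied to outliers are illustrative assumptions, since the paper does not specify them.

```python
def weighted_mark(marks, band=20.0, outlier_weight=0.25):
    """Weighted average of assessment marks, de-weighting outliers.

    Marks falling outside `band` points of the simple mean receive a
    reduced weight. The reference point (mean) and the outlier weight
    are assumptions; only the 20.0% band comes from the abstract.
    """
    mean = sum(marks) / len(marks)
    weights = [1.0 if abs(m - mean) <= band else outlier_weight
               for m in marks]
    return sum(w * m for w, m in zip(weights, marks)) / sum(weights)
```

With all marks in agreement (e.g. `[80, 82, 84]`) the result is the plain average; a single low outlier among otherwise high marks is pulled toward the majority rather than discarded outright, consistent with the small average adjustment (1.6%) reported above.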