2007
DOI: 10.11120/ened.2007.02010059

Peer review of team marks using a web-based tool: an evaluation

Cited by 13 publications (13 citation statements)
References 4 publications
“…Examples of qualitative and quantitative research can be found in previously published papers reported elsewhere such as Pond et al (2007), Robinson (2006) and Willmot and Crawford (2004, 2005, 2007). Such research was extremely important in creating an initial level of confidence in the method and building the foundations for the pedagogic and technical development being carried out by the current WebPA project team.…”
Section: Pedagogic Development
confidence: 99%
“…Some say that in true teamwork, individuals should stand or fall by judgement of the team output but realists quickly recognise that, however sound the arguments may be for this, many students express strong opposition or disquiet and insist on individually assigned marks that can be demonstrably associated with the efforts and abilities of each student. The fairness of allocating equal marks was questioned by Willmot and Crawford (2007, p. 59), who cited the common fear that ‘a lazy student might benefit from the efforts of teammates or particularly diligent students may have their efforts diluted by weaker team members’. Pond, Coates and Palermo (2007, p. 12) found that ‘bunched group marks often show a low standard deviation and the use of peer review [peer‐moderated marking] can help to spread this when marks are reviewed at an individual level’.…”
Section: Introduction To Peer‐moderated Marking
confidence: 99%
“…On the other hand, assessment is often considered subjective because of factors such as the human nature of both the assessor and the assessed, and many efforts aim to make it as objective as possible. Thus, to reduce subjectivity and to assess the members of a group working or learning collaboratively equitably, several works [2], [3] have used peer assessment. However, in practice, when assessing a working group, awarding the same mark to every member of the group has been examined, and the authors conclude that this is not the right approach for a fair and more objective assessment.…”
Section: Introduction
confidence: 99%
“…In a later paper, , point out the importance of using the same peer assessment system each year, so that "by year three, students know that freeloaders fail modules" (p8). Willmot and Crawford (2007) discuss a web-based self/peer assessment tool. They show that its results correlate reasonably well with 'fly-on-the-wall' mentor observations as to whether students are under- or over-performing, but less well as to the level of under- or over-performance.…”
Section: Introduction
confidence: 99%