2015 IEEE International Conference on Data Mining Workshop (ICDMW)
DOI: 10.1109/icdmw.2015.256
Including Content-Based Methods in Peer-Assessment of Open-Response Questions

Abstract: Massive Open Online Courses (MOOCs) are attracting the attention of a huge number of students all around the world. These courses include different types of assignments in order to evaluate the students' knowledge. However, these assignments are designed to allow straightforward automatic evaluation, which makes it impossible to assess skills that would require answering open-response questions. Peer-assessment (students are asked to assess other students' assignments) is an effective met…

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
2
1
1
1

Citation Types

0
5
1

Year Published

2017
2017
2018
2018

Publication Types

Select...
3

Relationship

3
0

Authors

Journals

Cited by 3 publications (6 citation statements)
References 22 publications (27 reference statements)
“…This research is an extension of research described in two conference papers [17,16], but now with a considerably broader experimental setting. Moreover, the performance of our method has been considerably enhanced over that described in previous research that did not use answer content [7].…”
Section: Related Work (mentioning)
confidence: 84%
“…We use a factorization method to train a utility function that estimates consensus in rankings of answers. This approach, inspired by a preference learning framework [15,9], was used in previous research by us [6,7,16,17]. Answers can be represented by vectors of features, which have been acknowledged to be crucial for the success of peer assessment [18,19].…”
Section: Introduction (mentioning)
confidence: 99%
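The statement above describes learning a utility function over answer feature vectors so that answers closer to the consensus are ranked higher. What follows is a minimal, hypothetical sketch of that idea using a plain linear pairwise preference learner rather than the authors' actual factorization method; the feature dimensionality, pair construction, and learning rate are invented for illustration.

    import numpy as np

    def train_utility(pairs, n_features, epochs=100, lr=0.1):
        """Learn weights w so that w @ x_pref > w @ x_other for each pair."""
        w = np.zeros(n_features)
        for _ in range(epochs):
            for x_pref, x_other in pairs:
                diff = x_pref - x_other
                # gradient ascent on the log-likelihood of sigmoid(w @ diff)
                p = 1.0 / (1.0 + np.exp(-w @ diff))
                w += lr * (1.0 - p) * diff
        return w

    # toy usage: 3-dimensional answer representations, preferred answers shifted upward
    rng = np.random.default_rng(0)
    pairs = [(rng.normal(size=3) + 1.0, rng.normal(size=3)) for _ in range(50)]
    w = train_utility(pairs, n_features=3)
    print("learned weights:", w)  # answers can then be ranked by w @ x

Scoring each answer with the learned weights and sorting by that score yields the consensus ranking the utility function is meant to estimate.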
“…In Figure 7 we see that at the extreme value (β = 10) there are increases in the number of recommended news items (the aggregate diversity of equation (8)) of 26.7 and 34.7 percentage points, according to the definition of novelty that we are trying to optimize. [Figure caption residue: weighted average (using the datasets of Table 1) of novelty measured with (10) and (12), and weighted average of aggregate diversity (8), for different values of β (18) on the horizontal axis; the vertical axis shows the increase in percentage points relative to β = 0.]…”
Section: Results (mentioning)
confidence: 99%
“…Finally, let us remark that if we have any extra information about readers or news, we just have to concatenate the vectorial representation described above with a vectorial representation of extra knowledge. This idea has been successfully used in [19,18]. For instance, we could have some valuable information about readers, like sex, age, previous interactions with the digital newspaper, etc... On the other hand, the news could have been described by using their contents.…”
Section: Representation Of Readers and News (mentioning)
confidence: 99%
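The quoted passage extends the base vectorial representation of a reader (or a news item) by concatenating it with a vector of extra knowledge. Below is a minimal sketch of that step, assuming NumPy arrays as the representation; the field names and values are invented for the example and the cited papers may encode this information differently.

    import numpy as np

    reader_base = np.array([0.3, 0.0, 0.7])    # base representation (e.g. interaction features)
    extra_info = np.array([1.0, 34.0, 12.0])   # illustrative extras: sex flag, age, #interactions
    reader_full = np.concatenate([reader_base, extra_info])
    print(reader_full.shape)                   # (6,): combined vector fed to the learner

The same concatenation applies on the news side when content-based descriptors of the articles are available.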
“…The alternative is for the students that wrote the answers to also play a role in the assessment. Peer assessment has been explored as an efficient procedure to deal with this problem; see for instance (Kulkarni et al., 2015; Piech et al., 2013; Joachims, 2014, 2015; Sadler and Good, 2006; Shah et al., 2013; Labutov and Studer, 2016; Díez et al., 2013; Luaces et al., 2015a,b, 2017; Formanek et al., 2017). It has been acknowledged as an activity that enhances student learning in Sun et al. (2015).…”
Section: Introduction (mentioning)
confidence: 99%