2010
DOI: 10.1007/978-3-642-16552-8_31
Quality in Learning Objects: Evaluating Compliance with Metadata Standards

Cited by 4 publications (6 citation statements) · References 11 publications
“…Moreover, the evaluation can be either quantitative or qualitative. In a later work, the authors detail the form as determining the compliance sub-characteristic (part of the functionality characteristic) through six indicators (Vidal et al, 2010). The indicators are metadata standardization, completeness, correctness, clarity, congruence and pedagogical coherence.…”
Section: Learning Object Repositories
confidence: 99%
“…The proposal made by Tabares et al (2013) defined the quality of the metadata by using completeness, consistency and coherence metrics. The first two metrics are similar to those proposed by Vidal et al (2010), but coherence is defined from the correlation between metadata, which should be drawn from a well-labelled significant set of objects. Coherence is defined as an n-ary relationship between the learning object metadata with a pedagogical sense.…”
Section: Learning Object Repositories
confidence: 99%
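The completeness-style metrics discussed in the statements above are, at bottom, ratios over the fields of a metadata record. The following is a minimal sketch of such a ratio, assuming flat records keyed by illustrative IEEE LOM-like field names; the field list and the equal weighting are assumptions made for the example, not the exact definitions used by Vidal et al. (2010) or Tabares et al. (2013).

```python
# Minimal sketch of a completeness metric over learning object metadata.
# Records are assumed to be flat dicts keyed by (hypothetical) LOM-like
# field names; the field set and equal weighting are illustrative only.

LOM_FIELDS = [
    "general.title",
    "general.description",
    "general.keyword",
    "educational.learningResourceType",
    "educational.typicalAgeRange",
    "technical.format",
    "rights.copyrightAndOtherRestrictions",
]

def completeness(record: dict) -> float:
    """Fraction of the reference fields that are present and non-empty."""
    filled = sum(1 for f in LOM_FIELDS if record.get(f) not in (None, "", []))
    return filled / len(LOM_FIELDS)

record = {
    "general.title": "Introduction to Recursion",
    "general.description": "A short interactive lesson on recursion.",
    "general.keyword": ["recursion", "programming"],
    "technical.format": "text/html",
}
print(f"completeness = {completeness(record):.2f}")  # 4 of 7 fields -> 0.57
```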
“…It is the institution that owns the repository (ROA) that must define the profile of the experts and the timing of the evaluation. Table IV presents the proposed instrument with the variables evaluated, the questions and their respective coding, which helps to understand the calculation of the metrics, which is based on proposals by different authors (Bruce and Hillmann, 2004; Massa, 2012; Morales et al., 2007; Nesbit et al., 2003; Vidal, Segura, Campos and Sánchez-Alonso, 2010). Layered model for evaluating the quality of learning objects in repositories. To what level the structure and content support learning of the topic. …”
unclassified
“…Besides the quality metrics defined by some of the above evaluation models, more metrics have been developed to evaluate various aspects of Learning Objects such as metadata quality [26], [288], popularity [219], [221], effectiveness [289], [290] or cost of reuse [192].…”
Section: Learning Object Evaluation
confidence: 99%
“…Vidal, Segura, Campos and Sánchez-Alonso [288] proposed a set of quality metrics for Learning Object metadata in order to evaluate metadata standardization, completeness, correctness, understandability, coherence and congruence. Some of these metrics can be calculated automatically by software systems while others require expert evaluations.…”
Section: Learning Object Evaluation
confidence: 99%
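As the last statement notes, some of these indicators can be calculated automatically by software. A minimal sketch of one such automatic check, here for the standardization idea of field values being drawn from controlled vocabularies, is shown below; the vocabulary subsets and field names are illustrative assumptions, not the indicator as defined in the cited work [288].

```python
# Minimal sketch of an automatically computable "standardization" check:
# the share of vocabulary-controlled fields present in a record whose
# values come from the allowed term list. The vocabularies below are
# abbreviated, illustrative subsets, not actual IEEE LOM value spaces.

CONTROLLED_VOCABULARIES = {
    "educational.learningResourceType": {
        "exercise", "simulation", "questionnaire", "figure", "lecture",
    },
    "educational.interactivityType": {"active", "expositive", "mixed"},
    "technical.format": {"text/html", "application/pdf", "video/mp4"},
}

def standardization(record: dict) -> float:
    """Fraction of controlled fields in the record whose values are valid terms."""
    checked, valid = 0, 0
    for field, vocabulary in CONTROLLED_VOCABULARIES.items():
        if field in record:
            checked += 1
            if record[field] in vocabulary:
                valid += 1
    return valid / checked if checked else 0.0

record = {
    "educational.learningResourceType": "lecture",
    "technical.format": "text/htm",  # typo: fails the vocabulary check
}
print(f"standardization = {standardization(record):.2f}")  # 1 of 2 -> 0.50
```

Indicators such as clarity or pedagogical coherence, by contrast, are the kind the citing authors describe as requiring expert evaluation rather than automatic computation.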