2017
DOI: 10.1108/el-09-2015-0179

All that glitters isn’t gold

Abstract: Purpose – Digital collection assessment has focused mainly on evaluating systems, metadata and usability. While use evaluation is discussed in the literature, there are no standard criteria or methods for performing use assessment effectively. This paper asserts that use statistics have complexities that prohibit meaningful interpretation and assessment. The authors aim to identify the problems inherent in the assessment of digital collection use statistics and propose solutions to address them. …

Cited by 11 publications (4 citation statements)
References 20 publications (22 reference statements)
“…Measuring the impact of a digital collection is challenging. Perrin et al. (2017) describe the complexities of interpreting the data that Google Analytics generates for a DSpace environment. Other authors focus more on the strengths and weaknesses of the library collection; an example is the work by Hyödynmaa, Ahlholm-Kannisto and Nurminen (2010) on collection mapping at Tampere University Library.…”
Section: Literature Review (mentioning)
confidence: 99%
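The complexity Perrin et al. point to is easier to see against DSpace's URL layout, where a single item is reachable both through its handle page and through bitstream download links, and raw analytics rows mix item use with navigation traffic. A minimal sketch, assuming conventional DSpace-style paths and hypothetical exported analytics rows:

```python
# Minimal sketch: classifying exported analytics page paths for a DSpace
# repository into item views vs. bitstream downloads vs. other traffic.
# The path patterns follow DSpace's conventional URL layout; the sample
# rows are hypothetical, not drawn from the cited study.
import re

ITEM_VIEW = re.compile(r"^/handle/\d+/\d+/?$")
BITSTREAM = re.compile(r"^/bitstream/handle/\d+/\d+/")

def classify(path: str) -> str:
    """Label a request path as an item view, a download, or other traffic."""
    if ITEM_VIEW.match(path):
        return "item_view"
    if BITSTREAM.match(path):
        return "download"
    return "other"

rows = [
    "/handle/2346/100",                       # item splash page
    "/bitstream/handle/2346/100/thesis.pdf",  # file download
    "/browse?type=author",                    # navigation, not item use
]
for path in rows:
    print(path, "->", classify(path))
```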
“…Assessing use of digital objects tends to be quantitative and relatively straightforward; examples of digital object use assessment metrics include views, viewing time, clicks, downloads, sessions and other web analytics; printing; altmetrics (e.g. downloads, saves in citation managers, recommendations and shares); video coding and annotation; and clones of code or data (Baughman et al., 2018; Carpenter et al., 2016; Faniel and Yakel, 2011; Fouseki and Vacharopoulou, 2013; Frank et al., 2016; GitHub and Various Contributors, 2018; Ladd, 2015; He and Han, 2017; Perrin et al., 2017; White et al., 2018; O’Gara et al., 2018; Kelly et al., 2018).…”
Section: Literature Review (mentioning)
confidence: 99%
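To make the quantitative flavor of these metrics concrete, here is an illustrative tally of views, downloads and sessions over a hypothetical event log. The field names and the 30-minute session window are assumptions for the sketch, not a standard drawn from the cited works:

```python
# Illustrative only: tallying common quantitative use metrics (views,
# downloads, sessions) from a hypothetical event log. A new session is
# assumed when a visitor returns after more than 30 minutes of inactivity.
from collections import defaultdict
from datetime import datetime, timedelta

events = [  # (timestamp, visitor_id, action)
    (datetime(2017, 3, 1, 9, 0), "v1", "view"),
    (datetime(2017, 3, 1, 9, 5), "v1", "download"),
    (datetime(2017, 3, 1, 11, 0), "v1", "view"),  # > 30 min gap: new session
    (datetime(2017, 3, 1, 9, 2), "v2", "view"),
]

counts = defaultdict(int)
last_seen: dict[str, datetime] = {}
sessions = 0
for ts, visitor, action in sorted(events):
    counts[action] += 1
    prev = last_seen.get(visitor)
    if prev is None or ts - prev > timedelta(minutes=30):
        sessions += 1  # first hit, or visitor returned after the window
    last_seen[visitor] = ts

print(dict(counts), "sessions:", sessions)
# -> {'view': 3, 'download': 1} sessions: 3
```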
“…Perrin et al. focused on five specific repository usage problems, including the "difficulty of distinguishing different kinds of internet traffic," a "lack of direct correlation of a digital item to its multiple URLs," "the analytics tools' inherent bias in statistics that are counted only in the positive way," "the different interaction between digital collections with search engine indexing," and "evaluator's bias toward simple growing statistics over time for surmising a positive use assessment." To overcome these issues, Perrin et al. advocate for institutions to evaluate usage data through the lens of the "sessions or user perspective." As noted by multiple studies, standardizing practices is a current gap in digital repository usage assessment. However, some groups are beginning to address this problem.…”
Section: Literature Review (mentioning)
confidence: 99%
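A rough sketch of what evaluating usage "through the lens of the sessions or user perspective" could look like in practice, folding in two of the five problems above: robot traffic is screened by user agent, and the multiple URLs that point at one digital item collapse to a single identifier. The URL-to-item map, bot markers and 30-minute window are illustrative assumptions, not the authors' method:

```python
# Sketch of session/user-perspective counting under stated assumptions:
# bot hits are excluded, all URLs for one item map to one identifier, and
# repeat hits by the same visitor within 30 minutes count as a single use.
from datetime import datetime, timedelta

BOT_MARKERS = ("bot", "crawler", "spider")  # crude user-agent screen

URL_TO_ITEM = {  # hypothetical: many URLs, one digital item
    "/handle/2346/100": "item-100",
    "/bitstream/handle/2346/100/thesis.pdf": "item-100",
}

hits = [  # (timestamp, visitor_id, user_agent, path)
    (datetime(2017, 3, 1, 9, 0), "v1", "Mozilla/5.0", "/handle/2346/100"),
    (datetime(2017, 3, 1, 9, 1), "v1", "Mozilla/5.0",
     "/bitstream/handle/2346/100/thesis.pdf"),
    (datetime(2017, 3, 1, 9, 2), "c1", "Googlebot/2.1", "/handle/2346/100"),
]

uses = {}  # (visitor, item) -> time of last counted use
count = 0
for ts, visitor, agent, path in sorted(hits):
    if any(m in agent.lower() for m in BOT_MARKERS):
        continue  # robot traffic is excluded rather than counted as use
    item = URL_TO_ITEM.get(path)
    if item is None:
        continue  # navigation or unmapped traffic
    last = uses.get((visitor, item))
    if last is None or ts - last > timedelta(minutes=30):
        count += 1  # one use per visitor-item pair per session window
    uses[(visitor, item)] = ts

print("session-level uses:", count)  # 1: v1's two hits collapse, bot dropped
```

The point of the sketch is the contrast: a naive hit count over the same log would report three uses, while the session/user view reports one.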