Purpose – The purpose of this paper is to present conceptual definitions for digital object use and reuse. Typically, assessment of digital repository content struggles to go beyond traditional usage metrics such as clicks, views or downloads. This is problematic for galleries, libraries, archives, museums and repositories (GLAMR) practitioners because such assessment does not tell a nuanced story of how users engage with digital content and objects. Design/methodology/approach – This paper reviews prior research and literature aimed at defining use and reuse of digital content in GLAMR contexts and builds on the authors' previous research to devise a new model for defining use and reuse, called the use-reuse matrix. Findings – This paper presents the use-reuse matrix, which visually represents eight categories and numerous examples of use and reuse. Additionally, the paper explores the concept of "permeability" and its bearing on the matrix. It concludes with the next steps for future research and application in the development of the Digital Content Reuse Assessment Framework Toolkit (D-CRAFT). Practical implications – The authors developed this model and definitions to inform D-CRAFT, an Institute of Museum and Library Services National Leadership Grant project. This toolkit is being developed to help practitioners assess reuse at their own institutions. Originality/value – To the best of the authors' knowledge, this paper is one of the first to propose distinct definitions that describe and differentiate between digital object use and reuse in the context of assessing digital collections and data.
While special collections and archives managers have at times recognized the importance of using data to drive decision making, translating this objective into reality and integrating data analysis into day-to-day operations has proven to be a significant challenge. There have also been obstacles to formulating quantitative metrics for special collections and archives and rendering them interoperable across institutional boundaries. This article attempts to focus a conversation around two issues: 1) the importance of quantitative analysis of operational data for improving research services in special collections and archives; and 2) the need for the profession to achieve consensus on definitions for . . .
Purpose – The purpose of this paper is to provide much-needed data to staff working with archival digitization on the costs and benefits of visual checks during quality-control workflows, and to encourage those in the field of digitization to take a data-driven approach to planning and workflow development as they transition into large-scale digitization. Design/methodology/approach – This is a case study of a cost-benefit analysis at the Triangle Research Libraries Network. Data were tracked on time spent performing visual checks compared to scanning production, and on error type/discovery rates, for the consortial grant "Content, context, and capacity: a collaborative large-scale digitization project on the long civil rights movement in North Carolina". Findings – Findings show that 85 percent of time was spent scanning and 15 percent was spent on quality control with visual checks of every scan. Only one error was discovered for every 223 scans reviewed (0.4 percent of scans). Of the six types of error identified, only half cause critical user experience issues. Of all errors detected, only 32 percent fell into the critical category. One critical error was found for every 700 scans (0.1 percent of scans). If all the time spent performing visual checks were instead spent on scanning, production would have increased by 18 percent. Folders with 100 or more scans comprised only 11.5 percent of all folders, and 37 percent of folders in this group contained errors (for comparison, only 8 percent of folders with 50 or more scans contained errors). Additionally, 52 percent of all critical errors occurred in these folders. The errors in larger folders represented 30 percent of total errors, and performing visual checks on the large folders required 32 percent of all visual check time. Practical implications – The data gathered during this research can be repurposed by others wishing to consider or conduct cost-benefit analysis of visual check workflows for large-scale digitization.
Originality/value – To the authors' knowledge, this is the only available dataset on rate of error detection and error type compared to time spent on quality control visual checks in digitization.
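The headline figures in the abstract above follow from simple arithmetic on the reported time split and error counts. A minimal sketch that re-derives them (the 85/15 time split, one error per 223 scans, and one critical error per 700 scans are taken from the abstract; the derived values are rounded as reported):

```python
# Re-derive the cost-benefit figures reported in the abstract.
qc_share = 0.15      # share of staff time spent on visual quality control
scan_share = 0.85    # share of staff time spent scanning

# Redirecting all QC time to scanning grows throughput by qc/scan.
production_gain = qc_share / scan_share   # ~0.176, reported as "18 percent"

error_rate = 1 / 223      # ~0.0045, reported as "0.4 percent of scans"
critical_rate = 1 / 700   # ~0.0014, reported as "0.1 percent of scans"

print(f"production gain if QC time went to scanning: {production_gain:.1%}")
print(f"overall error rate: {error_rate:.2%}")
print(f"critical error rate: {critical_rate:.2%}")
```

This confirms the internal consistency of the abstract's numbers: the 18 percent production gain is exactly 15/85, and the two error rates round to the percentages reported.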
Purpose – The purpose of this paper is to encourage administrators of device-lending programs to leverage existing quantitative data for management purposes by integrating analysis of quantitative data into the day-to-day workflow. Design/methodology/approach – This is a case study of NCSU Libraries' efforts to analyze and visualize transactional data to aid in the ongoing management of a device-lending program. Findings – Analysis and visualization of quantitative transactional data related to technology lending revealed patterns in lending over the course of the semester, day, and week that had previously gone unrecognized. With more concrete data about trends in wait times, capacity lending, and circulation volume, staff are now able to make more informed purchasing decisions, modify systems and workflows to better meet user needs, and begin to explore new ideas for services and staffing models. Practical implications – The concepts and processes described here can be replicated by other libraries that wish to leverage transactional data analysis and data visualization to aid in management of a device-lending program. Originality/value – Although much literature exists on the implementation and qualitative evaluation of device-lending programs, this paper is the first to provide librarians with ideas for leveraging analysis of transactional data to improve management of a device-lending program.
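The kind of transactional analysis described above, surfacing lending patterns by day and week, can be sketched in a few lines. This is a hypothetical illustration only: the timestamps, field layout, and tooling are invented for the example and are not drawn from the paper's actual dataset.

```python
# Hypothetical sketch: count device checkouts by weekday and by hour to
# surface lending patterns, as the case study describes doing with
# transactional data. All records below are invented for illustration.
from collections import Counter
from datetime import datetime

# Invented checkout timestamps, as might be exported from a circulation system.
checkouts = [
    "2012-09-04 10:15",  # Tuesday
    "2012-09-05 10:05",  # Wednesday
    "2012-09-05 14:40",  # Wednesday
    "2012-09-05 10:55",  # Wednesday
    "2012-09-06 16:20",  # Thursday
    "2012-09-07 10:30",  # Friday
]

by_weekday = Counter()
by_hour = Counter()
for ts in checkouts:
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    by_weekday[dt.strftime("%A")] += 1
    by_hour[dt.hour] += 1

# Peaks in these counts point to high-demand periods, which can inform
# purchasing, staffing, and loan-period decisions.
print(by_weekday.most_common(1))  # busiest weekday
print(by_hour.most_common(1))     # busiest hour
```

In practice the same aggregation would run over a full semester of circulation records and feed a chart rather than a print statement, but the underlying operation, grouping transactions by time period and counting, is this simple.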