A Review of: Ryan, Susan M. “Reference Transactions Analysis: The Cost-effectiveness of Staffing a Traditional Academic Reference Desk.” Journal of Academic Librarianship 34.5 (2008): 389-99.

Objective – To determine whether it is cost effective to staff an academic reference desk with librarians, through an examination of the types of reference questions asked and the qualifications required to answer them.

Design – Content analysis of reference transaction logs and activity-based costing for reference services, based on quantitative data derived from the logs.

Setting – Stetson University, a private institution in the United States with an FTE of approximately 2500.

Subjects – 6959 phone, email, and in-person reference transactions logged at the reference desk by four full-time and two part-time librarians.

Methods – This study repurposes data originally collected to determine how often librarians turned to online versus print sources when answering questions at the reference desk. Librarians working at the Stetson University library reference desk recorded all reference queries received in person, by phone, or by email for a total of eight months between 2002 and 2006. Data collection took place in two-month intervals in fall 2002, spring 2003, spring 2006, and fall 2006. Each question and the sources used to address it were logged by the librarian. Directional questions unrelated to the library’s collections and technical questions about printer or copier mechanical problems were counted, but their specifics were not recorded, as these queries “did not directly relate to an information need” (391) and so would not yield data relevant to the original research question on sources used. A total of 6959 questions were logged by librarians during the four collection periods.
Questions were recorded in detail for only 4431 transactions; the remaining 2528 queries related to printer/copier problems or non-library-specific directions and were classed as “direction and machine: non-informational” (394). The 4431 recorded questions were then divided into four categories derived by the researcher: look-up (a search for a known item), directional (library-specific orientation to the space and collections), technology (assistance with using library technology and electronic resources), and reference. The reference category was further subdivided into eight subcategories: catalogue search, citation help, database help, “guide to correct databases,” “personal knowledge or referral,” “quick internet search,” research, and Serials Solutions (392). “Guide to correct databases” referred to advice on the appropriate database for a question, and Serials Solutions covered questions that could be answered using the Serials Solutions product, such as the availability of a particular journal or article in the collection (392). Each question was assigned to the single most appropriate category by the researcher. Question categories were then mapped to “suggested staffing levels” (396). This determination was made by the researcher, and no details were given as to how the decision was made for each category. The three levels of staffing discussed were librarian, “trained student or staff,” and “well-trained staff/occasional librarian referral” (396). The cost of staffing the reference desk during the eight months captured in this study was calculated by multiplying the hours worked by each librarian by his or her average rate of pay across the four data collection periods. Indirect staff costs such as benefits were not included in this calculation. The average cost per reference transaction was determined by dividing the total salary costs by the total number of reference queries during the periods of study.
Costs for those categories of questions best addressed by a librarian could then be determined. The actual number of librarians who participated in the study is unclear: the methodology refers to four full-time and two part-time librarians (391), but later in the article there is reference to five full-time and three part-time librarians (396). This may reflect staffing changes during the study period, with the first set of numbers referring to positions rather than individuals, but this cannot be verified from the evidence presented in the article.

Main Results – Most questions asked at the reference desk during the study period could have been addressed by trained students and staff members rather than librarians. Only 11% (784) of the questions logged were deemed sufficiently complex by the researcher to require the attention of a librarian. The remaining 6175 transactions (89% of all those logged) could most likely be handled by a different staffing complement. According to Ryan, approximately 74% of the reference transactions, including directional, technology, “quick internet,” and known-item searching questions, could have been answered by “trained student and staff” (396). Questions on catalogue searching, databases, citations, Serials Solutions, and personal knowledge/referrals, representing approximately 15% of all questions, could have been handled by experienced and knowledgeable staff with limited librarian intervention. The complexity of a question was judged in part by the number of sources required to answer it, with most (75%) answerable with just one source. The total cost of staffing the reference desk with librarians for the eight months studied was approximately US$49,328.00. A total of 6959 questions were logged during this period, for an average cost of US$7.09 per reference transaction. This cost is approximate, as the exact time spent on each question was not recorded.
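The reported average follows directly from the two totals above; a minimal sketch of the arithmetic, using only figures stated in the article:

```python
# Figures reported in the study: eight-month librarian salary cost and
# total logged reference transactions.
total_salary_cost = 49328.00   # US$
total_transactions = 6959

# Average cost per reference transaction = total salary cost / total questions
cost_per_transaction = total_salary_cost / total_transactions
print(round(cost_per_transaction, 2))  # → 7.09
```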
The cost of answering “non-informational” directional and technical questions was the most significant (396). This category represented 36.3% of all questions received at the reference desk, with a total staffing cost of US$17,919.41 (US$7.09 x 2528). “Information-orientated” directional and technology questions followed at 15.4% (US$7,620) and 12.4% (US$6,110.18) respectively (396). According to Ryan, questions in all three categories could be addressed by students and staff. The cost of addressing research questions, the only category requiring librarians, was US$5,557.29. Research transactions were greatly outnumbered by directional and technology-related questions: an average of 3.6 research questions were asked at the reference desk during the 12 hours it was open each day, compared to 20.8 directional/technical questions.

Conclusion – The nature of the questions logged at the Stetson University library reference desk suggests that it is inefficient to staff the desk with librarians, given the salary costs of such a staffing model and the fact that librarians’ skills may not be required to answer most of the questions posed. Since the number of questions requiring a librarian is so low, Ryan suggests that alternative staffing and service models be considered, so that librarians’ energies could be more effectively employed elsewhere in the organization, in areas such as information literacy instruction and the development of enhanced web services. Any reorganization of reference services should be done in concert with user surveys, consultation with staff, and extensive training to prepare staff for new roles. Suggested areas for further research identified by the researcher include the quality of reference transactions in an increasingly online environment.
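As an arithmetic check, the per-category costs reported in this summary can be reproduced from the figures stated in the article, provided the unrounded per-transaction rate is used rather than the rounded US$7.09:

```python
# Figures reported in the study
total_salary_cost = 49328.00    # US$, eight months of librarian salaries
total_transactions = 6959       # all logged questions
non_informational = 2528        # "direction and machine: non-informational"
research = 784                  # questions judged to require a librarian

# Category cost = unrounded per-transaction rate x question count
rate = total_salary_cost / total_transactions   # approx. US$7.09
print(round(rate * non_informational, 2))  # → 17919.41
print(round(rate * research, 2))           # → 5557.29
```

Note that the article's US$17,919.41 only matches this unrounded rate; multiplying the rounded US$7.09 by 2528 would instead give US$17,923.52.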
Many post-secondary institutions in Canada over the past decade have made the transition from college to university status. The researchers on this team were hired in the midst of such a transition at one western Canadian institution. As new faculty we were navigating the normal tides of adjusting to a new faculty position, but our induction occurred in a shifting institutional context. Our research question, “What is the new faculty experience in a transitional institution?” guided a five-year focused ethnography, beginning as a self-study of the research team and expanding into 60 interviews with 31 participants over several years. The results demonstrate that a more complex theory is required to reflect the experience of new faculty than has appeared previously in the literature. We propose a framework of competing discourses.
Objective – To outline the collaborative development of an online course addressing academic integrity by a university’s library system and writing centre.

Design – Case study.

Setting – A public research university in the Midwestern United States.

Subjects – 1650 students who completed the online module.

Methods – Oakland University (OU) Libraries and the Writing Centre began to collaborate on the development of a new online course on academic integrity in 2011. It was felt that an existing online library tutorial on plagiarism no longer met the needs of students and faculty. The development of the course was informed by the Association of College and Research Libraries’ Information Literacy Competency Standards for Higher Education (2000) as well as a research study investigating students’ use of sources in their scholarly writing across several institutions. Moodle, the institution’s learning management system (LMS), was used to develop the learning object.

Main Results – OU Libraries and the Writing Centre launched the six-part online course entitled “Using and Citing Sources” in January 2012. They developed modules around learning outcomes in five broad categories: defining academic integrity and plagiarism; the use of sources in academic writing; paraphrasing; quoting; and citation. The final module provided students with an opportunity to practise lessons learned in the first five modules. The use of the LMS to design and host the course limited the tutorial to registered students, but provided developers with access to additional course functionality without labour-intensive coding. It also allowed Writing Centre staff to access students’ performance data on the modules prior to their appointments. Improvements over the previous online tutorial included expanded content on academic ethics and referencing, more active learning elements, video content, and the opportunity for students to choose discipline-specific examples.
In the first four months of its availability, 1650 students completed the course, with 3330 attempts overall.

Conclusion – The diverse perspectives and expertise that individuals from OU Libraries and the Writing Centre brought to their collaboration greatly informed the development of the course. The time and effort saved by using the university’s existing LMS to develop interactive content, and the focus on providing students with opportunities for active learning within the course, contributed to the project’s success.
Objective – To determine the extent to which age and cognitive style influence an individual’s ability to successfully locate health information online.

Design – Quantitative study employing scales and regression analysis.

Setting – A school of public health and a school of medicine at two universities in the United States of America.

Subjects – 346 men and women 35 years or older.

Methods – Participants for the Online Health Study (OHS) were recruited from the community at both study sites using stratified sampling and screened with a web-based tool to ensure they had the necessary level of digital literacy to complete the study tasks. Once enrolled, participants completed the Rapid Estimate of Adult Literacy in Medicine (REALM) to measure their health literacy and the Witkin Group Embedded Figures Test (GEFT) to determine their cognitive style (labelled by researchers as context sensitive or context independent). Participants were asked to search online for answers to six specific questions on heart-healthy diets, flu vaccinations, alternative medicine and memory, genetic testing, assistive medical technology, and skin cancer, with 15 minutes of search time allowed for each question. Participants reported their answers after each search, which were later assigned scores for