Purpose – The purpose of this paper is to assess the ability of college freshmen to use common academic software and manage files. Design/methodology/approach – In total, 39 college freshmen from three college campuses were recruited for the study. An online test environment and screen recording software were used to measure student proficiency in using PDFs, Microsoft Word, Microsoft PowerPoint, Microsoft Excel, Gmail, and Windows. Data were collected in September 2013. Findings – Student use of academic technology is common, but their software skills are neither comprehensive nor deep. Students were most proficient at using PDFs and Microsoft Word. Microsoft Excel tasks were the most difficult for the students, and many struggled to use Gmail to compose a message and send an attachment. Students were able to open a PowerPoint document and view a slideshow, but they were less comfortable navigating the software’s printing environment. Originality/value – Having concrete data about student technology skills, rather than anecdotal data from reference desk interactions, can help librarians design improved instruction and tutorials that target areas of student technology weakness.
Purpose – The purpose of this paper is to report on how researchers at Penn State University used video screen capture technology to learn more about student usage of the library's databases. Design/methodology/approach – The process of identifying and implementing a novel way to capture and analyze human-computer interaction is outlined and discussed. Findings – Because of the drawbacks associated with both formal, direct physical observation of research participants and videotaping participants, video screen capture technology was chosen as a better way to track human-computer interaction. Practical implications – Video screen capture technology is an inexpensive, user-friendly way to enhance electronic resource usability studies in any library. Research files can be easily exported into coding software for data analysis. Originality/value – The paper examines a new, non-invasive way to capture student research behavior. It shows how any library could use this same technology to conduct research on how its resources are being used by its user population.
When asked to print the full text of an article, many undergraduate college students print the abstract instead of the full text. This study seeks to determine the underlying cause(s) of this confusion. In this quantitative study, participants (n=40) performed five usability tasks to assess the ease of use and usefulness of five commercial library databases and were surveyed on their understanding of library terminology. The study revealed that more than half of the students correctly defined the term "abstract" and over 75 percent understood "full text." However, only 25 percent of the students were able to successfully complete all five database tasks.

Over the past 10 to 15 years, while working at the reference desk, the authors have come across hundreds of printouts of journal article abstracts lying unclaimed next to the library's printers. These orphan abstracts beg the question, "Do students know the difference between an abstract and the full text of an article before they hit the print button?" For every abstract that was left unclaimed, the authors knew that many more were being given to professors as examples of journal articles and being used as sources for research papers. In a previous research study, the authors found that, of 39 students who had been exposed to the concepts of "abstract" and "full text" through library instruction, only 62 percent were able to find and print the full text of five articles related to their research assignment. The remaining 38 percent of students printed at least one abstract in place of the full text.[1] The authors were interested in studying the underlying cause or causes of this disconnect. Did our students not know the difference between an abstract and the full text of an article, even when they had received instruction on these concepts? Or were the database results pages designed so that the full text of the article was too difficult to find? Or was there a combination of factors leading to the students' obvious confusion?
This study focuses on a quantitative assessment of undergraduate student use of SFX citation linking software. The study seeks to reveal whether students are using SFX and, if so, how they are using it. Study participants were Penn State Altoona students, all of whom had received basic library instruction. Their research sessions were captured with screen capture software and then coded with Studiocode video coding technology. While not the primary focus of the study, the investigators also considered the effect of factors such as database Web design, images, and terminology on student usage of SFX.
Purpose – The purpose of this paper is to examine the technology confidence, skills, and post-skills-test emotions of traditional (younger than 24 years old) and non-traditional (24 and older) first-year college students at three undergraduate campuses in the Northeastern USA. Design/methodology/approach – In total, 39 college freshmen from three college campuses were recruited for the study. An online test environment and screen recording software were used to measure student proficiency in using PDFs, Microsoft Word, Microsoft PowerPoint, Microsoft Excel, Gmail, and Windows. Data were collected in September 2013. Findings – The majority of the students struggled with at least one facet of academic software. Traditional students were more confident than non-traditional students in their technology skills, but they did not score any higher on the skills test. Students who placed at the high end and low end of the test results curve most accurately assessed their technology skills, and their post-test feelings were the most appropriate in light of their test results. A large percentage of the traditional-aged students were overconfident about their skills and self-identified as “happy” or “wonderful” even after performing poorly on the test. Originality/value – Having concrete data about student technology skills, rather than anecdotal data from reference desk interactions, can help librarians design improved instruction and tutorials that target areas of student technology weakness. In addition, no prior studies have examined students’ immediate emotional responses to test performance in this type of testing environment.